Eye Tracking Technique for Controlling Computer Game Objects

Tara Qadir Kaka Muhammad1, Hawar Othman Sharif1, Mazen Ismaeel Ghareb2

1Department of Computer, College of Science, University of Sulaimani, Sulaimani, Iraq, 2Department of Computer Science, College of Science and Technology, University of Human Development, Kurdistan Region, Iraq

Corresponding author’s e-mail: Tara Qadir Kaka Muhammad, Department of Computer, College of Science, University of Sulaimani, Sulaimani, Iraq. E-mail: tara.qadir@univsul.edu.iq
Received: 01-09-2021 Accepted: 30-03-2022 Published: 20-04-2022
DOI: 10.21928/uhdjst.v6n1y2022.pp43-51


ABSTRACT

This study explores the use of an accessible eye tracker alongside keyboard and mouse input devices for video games. An interactive game was developed in Unity with multiple ball objects; by hitting them, each player collects points. Different techniques were used to hit the balls: mouse, keyboard, and a combination of the two. Eye tracker input helped increase the players' point-collecting performance. The research explains how eye tracking techniques can be used widely in video games and how interactive they are. Finally, we examine the use of visual observation relative to keyboard and mouse input control and show the difference. Our results indicate that the use of an eye tracker increases the immersion of a video game and can considerably improve video game technology.

Index Terms: Computer Games, Eye Tracking, Eye Gaze Interaction, Facial Expressions as Game Input, Evaluating Peripheral Interaction

1. INTRODUCTION

We describe two experiments that compare our technique of selecting objects with the eyes against traditional mouse selection. We have already looked at how people behave when interacting with their eyes in demonstrations. The next step is to show that our method can withstand tougher use and that people are willing to select objects with their gaze for an extended period.

We compared the overall performance of gaze interaction with that of a widely used device: the mouse. Eye interaction requires extra hardware and software, so it is fair to ask whether it is worth the cost. If it works properly, it can also bring secondary benefits that are hard to quantify, acting as an additional, passive, and light input channel. For example, we have found that when visual interaction works well, the device responds almost as if it were anticipating the user's commands, as if it were reading the user's mind. No further pointing input is needed, and the hands stay free for other tasks. Eye interaction can also speed up the interaction and “cover its costs” in a simple experimental comparison with the mouse, despite the immaturity of current eye tracking technology.

That the eye interaction approach is faster is an advantage; however, speed is not the primary motivation for using eye tracking in most environments.

Our experiments measured the time required to perform simple, representative direct manipulation tasks. One asked participants to select a highlighted circle from a grid of circles. The second asked participants to select, from a grid of letters, the letter named over a loudspeaker. Our results show a clear, measurable speed advantage over the mouse in the same experimental setting, consistent across both experiments. The key points of the test allow us to understand how our method of visual interaction works and why it is effective. As expected, the method is somewhat faster than the mouse; our findings suggest that the eye can also move faster than the hand. What our method demonstrates is how the interaction technique and its algorithm preserve this eye-speed advantage when selecting a real object. We study the physiology of the eye and use this knowledge to extract useful information about the user's intentions from noisy and jittery eye movement data; the algorithm is therefore based primarily on an understanding of eye movement. It was not clear in advance that our eye interaction approach would keep pace with the eye, since eye monitoring hardware introduces additional latency [1]. The overall performance of any interaction technology results from its software and hardware. Previous experiments treat the keyboard, mouse, and joystick as traditional game inputs, and many new techniques remain to be considered [2], [3], [4].
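The intent-extraction step described above is, in essence, a fixation filter over noisy gaze samples. As an illustration only (not the authors' actual algorithm), the following C# sketch shows a dispersion-threshold fixation detector of the kind commonly used for this purpose; the type names, window length, and thresholds are our own assumptions.

```csharp
// Illustrative dispersion-threshold (I-DT style) fixation detector.
// GazeSample, FixationDetector, and all thresholds are assumed names/values,
// not taken from the paper.
using System;
using System.Collections.Generic;

public struct GazeSample
{
    public float X, Y;     // gaze position in screen pixels
    public float TimeSec;  // sample timestamp in seconds
}

public class FixationDetector
{
    private readonly List<GazeSample> window = new List<GazeSample>();
    private const float DispersionPx = 40f;      // max spread treated as one fixation
    private const float MinDurationSec = 0.15f;  // typical minimum fixation duration

    // Returns the fixation centroid once the eye has dwelt long enough,
    // or null while the eye is still moving (saccade or noise).
    public (float X, float Y)? AddSample(GazeSample s)
    {
        window.Add(s);
        // Keep the window bounded to roughly twice the minimum duration.
        while (window[^1].TimeSec - window[0].TimeSec > MinDurationSec * 2f)
            window.RemoveAt(0);

        float minX = float.MaxValue, maxX = float.MinValue;
        float minY = float.MaxValue, maxY = float.MinValue;
        foreach (var g in window)
        {
            minX = Math.Min(minX, g.X); maxX = Math.Max(maxX, g.X);
            minY = Math.Min(minY, g.Y); maxY = Math.Max(maxY, g.Y);
        }

        bool compact = (maxX - minX) + (maxY - minY) < DispersionPx;
        bool longEnough = window[^1].TimeSec - window[0].TimeSec >= MinDurationSec;
        if (!compact || !longEnough)
            return null;

        float cx = 0f, cy = 0f;
        foreach (var g in window) { cx += g.X; cy += g.Y; }
        return (cx / window.Count, cy / window.Count);
    }
}
```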

Such new techniques include voice control [5] and head tracking [6]. In recent years, while trying various input techniques, researchers have aimed to find out which technique is the most accurate, most immersive, and most convenient for users. Here, we explain the benefits of using eye movements as a game controller. Eye tracking techniques have been compared with mouse input, and the results have been reported; eye tracking technology has been shown to increase immersion and make games more fun for the player. Ivanchenko et al. [7] and Almansouri [8] show that eye tracking as game input is very precise. Jiménez-Rodríguez et al. [9] also compared mouse, keyboard, and gaze, with gaze used as a solo controller.

In contrast to these studies, which deal specifically with the comparison between mouse and eye control in terms of precision and effectiveness, many studies have focused on investigating the game experience. The authors of [10] focused on evaluating immersion and user experience; their research showed that, compared with mouse input, players were more immersed in the gaze-controlled game. In [11], the authors obtained a reliable questionnaire evaluation and high scores for feelings of flow and immersion in gaze-controlled play; compared with the study by Modi and Singh [10], this result needed further investigation. Human-computer interaction (HCI), the way people engage with the computer, considers user actions on three different levels: physical, cognitive, and emotional. The emotional level is a newer topic that not only tries to make the interaction experience pleasant but also affects the user's further use of the machine [12].

However, to better understand the emotional level in HCI, that is, the user's involvement while using the machine, the user must be evaluated while operating a peripheral device under test emotions; using a peripheral device emotionally at the same time is difficult. Research [13] examined this setting: a primary task that requires continuous interaction and a secondary task that takes place on the periphery. Some HCI research has used emotions as input. In [14], Bernhaupt et al. designed an emotional flower game and used facial emotions as input: positive emotions (joy and surprise) made flowers grow, and negative emotions (disgust, anger, sadness, and fear) slowed their growth. Their game was intended for the workplace, and they found that it improved the player's emotional state while playing, although it did not affect people's general mood. Their work became foundational for the group of Lankes et al. [15], who redesigned the emotional flower game, deployed it in a mall, and examined the players' emotional feedback to add more contrast to the original work. Our approach combines existing input techniques (mouse, keyboard, and eye tracking) and adds facial emotions (joy, anger, and surprise) as inputs to the main game proposed here [16]. The user can choose one of the three input types (mouse, keyboard, or eye tracker). Popped balloons are collected, and the score increases. The facial emotion plays a peripheral role that helps the user control the speed of the balloons and collect more points in a given time [17].
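To make the peripheral role of facial emotion concrete, here is a minimal Unity C# sketch, assuming a separate emotion recognizer sets currentEmotion, of how a detected emotion could scale balloon speed while the primary input does the popping; the Emotion values and the speed factors are our assumptions, not the game's actual tuning.

```csharp
// Illustrative Unity sketch: a detected facial emotion scales balloon speed
// while the primary input (mouse, keyboard, or gaze) does the popping.
// The Emotion enum and the speed factors are assumed values.
using UnityEngine;

public enum Emotion { Neutral, Joy, Anger, Surprise }

public class BalloonMover : MonoBehaviour
{
    public float baseSpeed = 2f;                      // upward speed in units/second
    public Emotion currentEmotion = Emotion.Neutral;  // set by an emotion recognizer

    private float SpeedFactor()
    {
        switch (currentEmotion)
        {
            case Emotion.Joy:      return 0.7f; // slow the balloons, easier points
            case Emotion.Anger:    return 1.5f; // speed up, harder to hit
            case Emotion.Surprise: return 1.2f;
            default:               return 1.0f;
        }
    }

    void Update()
    {
        // Move the balloon upward, modulated by the peripheral emotion input.
        transform.Translate(Vector3.up * baseSpeed * SpeedFactor() * Time.deltaTime);
    }
}
```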

In addition, evaluation and effectiveness are important to us. We measure effectiveness by comparing the recorded results of different users, while the assessment is made by comparing the input parameters in two different categories (emotional and unemotional).

2. RELATED WORK

2.1. Review Stage

There is extensive research on evaluating emotions with the help of users, and research on emotions is not limited to facial expression, since the ability to recognize people's emotions has an impact on social interaction and human behavior [18]. Moreover, some researchers have worked on the body; Ghareb [19] argues that recognizing emotions through non-verbal communication can be achieved by sensing body expression. Our research on facial emotions, however, touches only a small portion of this area.

Ekman, as described in Chittaro and Sioni [20], defined facial expression as an example of what one can do with the face; his discussion focused on the set of facial expressions - happiness, surprise, anger, sadness, fear, and disgust - that are universal across cultures, while cultures determine the display rules. Emotion Recognition and its Application in Software Engineering [21] examines a number of scenarios to assess the possibility of applying emotion recognition strategies in four areas: software programming, website personalization, education, and games. Video games are environments that can dynamically respond to the emotions diagnosed in the current player. Extensive investigations have worked on specific setups for capturing emotions; as discussed in Ekman et al. [22], an open-source EVG (Emotion Evoking Game) was built, and a first, formative evaluation found typical differences in the facial expressions of surprise, joy, and disappointment. There is also considerable research that has focused on emotions in different approaches [18], [23], [24]. In addition to emotion, the evaluation of peripheral devices is a trending area that has led to extensive discussion, with numerous findings focused on particular types of peripheral evaluation. Some of the systems in the literature were evaluated on the basis of studies with test subjects [25], [26]; another, learnable system [27] used distinct modalities (tangible, tactile, and hands-free). For peripheral interaction, however, all peripheral works ultimately remain simple interactions, unless the focus is on controlling an audio player. We collect the points using a user-friendly method as an easy way to evaluate the different technologies in the game. Besides examining the contrast of user emotions at certain points in the game, our system focuses on evaluating the input devices (keyboard, mouse, and gaze); several studies have examined the usefulness of gaze as an input system [28], [29], [30], [31], [32], [33]. The comparison of the normal mouse with gaze was examined in Almansouri [8] across age groups (young, middle-aged, and elderly), rating the eye higher for the middle-aged and the elderly, while Jacob [34] evaluated the operability of gaze against the menu selection technique of a developed web browser. Roose and Veinott [3], Ivanchenko et al. [7], Sibert and Jacob [35], and Murata et al. [36] show how gaze can be combined with different input methods. The authors of [9], however, worked on using gaze as a solo input and studied the performance variations of gaze, mouse, and keyboard on a similar task in the game. Today, the most common form of eye tracker is the desktop “corneal reflection” unit. These systems report the location of the user's gaze as a screen coordinate on a monitor, to decide where the user is looking; they track one or both eyes with a digital camera equipped with an infrared (IR) filter. The game described previously configured eye input using the corneal reflection, which must be captured by a camera positioned near the eye.
Because the user's corneal surface is roughly spherical, the location of the corneal reflection remains constant as the user's eyes move relative to the head; the position of the pupil relative to this reflection therefore gives the orientation of the user's eye. A calibration sequence is used to map eye movements to screen coordinates. There are also portable systems that are useful for ubiquitous computing scenarios; these use the same method as desktop systems but record the gaze as a coordinate in a digital camera mounted on the user's head [37]. Several studies have investigated the performance of keyboard and mouse as inputs compared with the eye. By testing eye input versus mouse input in three different PC games, Smith and Graham [40] concluded that eye tracking can give the player a more immersive experience, for instance when using the eye tracker in first-person shooters. Each test participant was asked to play the same game using three specific input configurations: (1) mouse, keyboard, and eye tracker; (2) mouse and keyboard only; or (3) a console gamepad. The results were not exactly encouraging, suggesting that overall performance with the eye tracker was well below the other two. However, Isokoski and Martin attributed these results to the players' greater experience and expertise with the typical input methods; when an alternative input is offered for playing the game, further training is needed. Other studies came to comparable results. The authors of [39] created a simple game in which the participant was asked to remove 25 balloons that moved across the screen at different speeds; the participant would move the mouse pointer or the eye tracker cursor over a balloon and remove it with a mouse click. Two conditions were tested: with and without a time limit for completing the task. The results confirmed that, without the time limit, the accuracy and the time to complete the task were worse when using the eye tracker than when using a mouse [40]. Performance was based solely on the percentage of balloons that the player removed. The paper [43] ended with clearly contrary results, and players also mentioned that the eye tracker was exceptionally fun to use. These contradictory research results suggest that practice and the approach to designing fair gameplay are key factors in achieving a consistently satisfactory outcome [44], [45].
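As a rough sketch of the calibration idea mentioned above (not Tobii's implementation), the following C# class maps the pupil-to-corneal-reflection vector to screen coordinates with an affine model; in practice, the six coefficients would be fitted by least squares from the points shown during the calibration sequence.

```csharp
// Sketch of gaze calibration: an assumed affine map from the pupil-to-corneal-
// reflection vector (camera coordinates) to screen coordinates.
using UnityEngine; // Vector2

public class GazeCalibration
{
    // screenX = ax*ex + bx*ey + cx ; screenY = ay*ex + by*ey + cy
    private readonly float ax, bx, cx, ay, by, cy;

    // Coefficients are assumed to come from a per-user least-squares fit
    // over the calibration targets; they are passed in here for brevity.
    public GazeCalibration(float ax, float bx, float cx,
                           float ay, float by, float cy)
    {
        this.ax = ax; this.bx = bx; this.cx = cx;
        this.ay = ay; this.by = by; this.cy = cy;
    }

    public Vector2 ToScreen(Vector2 eyeVector)
    {
        return new Vector2(ax * eyeVector.x + bx * eyeVector.y + cx,
                           ay * eyeVector.x + by * eyeVector.y + cy);
    }
}
```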

2.2. Designing a Proposed Game

The evaluation of peripheral interaction requires at least two tasks: a primary task, which must be the focus of the participant's attention, and a secondary task, which is carried out in the periphery. This secondary task is normally a given: the task supported by the peripheral device being evaluated.

2.2.1. Designing the primary task

The players are asked to play with the keyboard first, then with keyboard and mouse where possible, after which the results and timing are calculated for each player: how many balls were hit and how many points were collected, with the timing for each technique. The study then concentrated on eye tracking for each ball movement and on how the speed of the ball can be controlled so as to hit as many balls as possible. Most of the players collected full points, but more time was needed.
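A minimal Unity C# sketch of how these per-technique measurements could be logged follows; InputMode, TrialLogger, and the log format are illustrative assumptions rather than the study's actual instrumentation.

```csharp
// Illustrative per-technique trial logger: balls hit, points, and elapsed
// time per input mode. Names and structure are assumed, not from the paper.
using System.Collections.Generic;
using UnityEngine;

public enum InputMode { Keyboard, KeyboardMouse, EyeTracker }

public class TrialLogger : MonoBehaviour
{
    private readonly Dictionary<InputMode, (int balls, int points, float seconds)>
        results = new Dictionary<InputMode, (int, int, float)>();

    private InputMode mode;
    private int balls, points;
    private float startTime;

    public void StartTrial(InputMode m)
    {
        mode = m; balls = 0; points = 0; startTime = Time.time;
    }

    public void RecordHit(int pointValue)
    {
        balls++; points += pointValue;
    }

    public void EndTrial()
    {
        results[mode] = (balls, points, Time.time - startTime);
        Debug.Log($"{mode}: {balls} balls, {points} points, " +
                  $"{results[mode].seconds:F1}s");
    }
}
```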

3. METHODOLOGY

The methodology of this study is to observe the players' interaction with the game before and after using eye tracking as an input method. All players have gaming experience, and all were trained in using this game and the eye tracker device. The eye tracking techniques help the players interact more with the game. Observations were conducted, data were extracted from the players before and after using the eye tracker, and the difference between the results is illustrated using several statistical measurements.

3.1. Game Scenario

The game is designed and implemented around destroying balloons of different colors to collect points. The balloons are destroyed by three means of input: mouse, keyboard, and eye. The main objective of the game is to find the difference in efficiency between these inputs: mouse, keyboard, and eye focus.

3.1.1. Hardware requirements

The game was developed on a PC with the following specifications: Intel i7-2600 CPU at 3.6 GHz, 4 GB RAM, 512 GB hard disk, and an NVIDIA Quadro 600 graphics card. The Tobii Eye Tracker was used in this game; it is among the only devices capable of tracking both head and eye movements for game interaction, esports training, and streaming.

The Tobii 4C eye tracker is the hardware device that tracks eye movement in the game. Its driver first identifies the device, then detects the player's eyes; the eye movement is then read in the game and handled in the game source code.
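For concreteness, a minimal sketch of reading the gaze point in Unity follows; it assumes the Tobii Unity SDK (the Tobii.Gaming namespace) is installed, and the exact calls should be treated as illustrative if the SDK version differs.

```csharp
// Minimal sketch of reading the Tobii 4C gaze point in Unity,
// assuming the Tobii Unity SDK ("Tobii.Gaming") is available.
using Tobii.Gaming;
using UnityEngine;

public class GazeReader : MonoBehaviour
{
    void Update()
    {
        GazePoint gaze = TobiiAPI.GetGazePoint();
        if (gaze.IsRecent())                  // valid, fresh sample
        {
            Vector2 screenPos = gaze.Screen;  // gaze position in screen pixels
            Debug.Log($"Gaze at {screenPos}");
        }
    }
}
```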

Fig. 1 shows the Tobii 4C eye tracker.


Fig. 1. Tobii 4C eye tracker device.

3.1.2. Game design

The figures below explain the game interface and the game rules: how the player uses the game and how it interacts with the player. The final figure explains the interaction between the eye tracker and the ball movement to score points for the player.

The game was developed using Unity 2018.3 and C# (Visual Studio Ultimate 2012) on Microsoft Windows 10 64-bit. Fig. 2 shows the interface in which the user selects an input option for hitting the balloons and collecting points, Fig. 3 shows a user playing with mouse input, and Fig. 4 shows how the user can use eye movement to hit the balloons and score points.
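The following Unity C# sketch illustrates how the pointing inputs can share one code path: a screen point, whether the mouse cursor or the gaze position, is raycast into the scene, and a hit balloon is destroyed for points. The “Balloon” tag and the point value are our assumptions.

```csharp
// Illustrative shared code path for mouse and gaze pointing:
// raycast a screen point into the scene and pop a hit balloon.
using UnityEngine;

public class BalloonPopper : MonoBehaviour
{
    public int score;

    // Call with Input.mousePosition or the gaze point from the eye tracker.
    public void TryPop(Vector2 screenPoint)
    {
        Ray ray = Camera.main.ScreenPointToRay(screenPoint);
        if (Physics.Raycast(ray, out RaycastHit hit) &&
            hit.collider.CompareTag("Balloon"))
        {
            Destroy(hit.collider.gameObject);
            score += 10; // assumed point value per balloon
        }
    }
}
```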


Fig. 2. Users select input option for playing the game.


Fig. 3. Mouse input for users.

4. DATA COLLECTION

Data about the players were collected from 48 selected users. The players were undergraduate students from the computer science department, and all were trained to use the game. Each user played with mouse, eye, and eye-with-spacebar input to collect the points. Table 1 shows the times recorded using the mouse, the eye, and the combination of the two.

TABLE 1: Player time difference for different inputs


The results showed that eye input performed 45% better than mouse input and 66% better than the combined input. This indicates that eye input has acceptable values as an input for the players.

Linear Pearson correlation was used for this study. We used the Pearson correlation coefficient (PCC) to measure the linear correlation between the eye-control dataset and the mouse dataset, and a second correlation between the eye and eye-with-spacebar inputs. The measure divides the covariance of the two variables by the product of their standard deviations, yielding a result between –1 and 1; the covariance alone reflects only whether the variables are related, not how strongly.

Pearson correlation coefficient: ρ(x, y) = Σ[(xi − x̄)(yi − ȳ)] / (n·σx·σy) [46]
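As a self-contained check of how the coefficient is computed, the following C# sketch implements the population form of Pearson's r over two timing arrays; the sample values are invented for illustration and are not the study's data.

```csharp
// Population Pearson correlation coefficient over two equal-length arrays.
using System;
using System.Linq;

public static class Pearson
{
    public static double Coefficient(double[] x, double[] y)
    {
        double mx = x.Average(), my = y.Average();
        // Covariance divided by the product of standard deviations.
        double cov = x.Zip(y, (a, b) => (a - mx) * (b - my)).Sum() / x.Length;
        double sx = Math.Sqrt(x.Select(a => (a - mx) * (a - mx)).Sum() / x.Length);
        double sy = Math.Sqrt(y.Select(b => (b - my) * (b - my)).Sum() / y.Length);
        return cov / (sx * sy);
    }

    public static void Main()
    {
        double[] mouseTimes = { 12.1, 10.4, 15.2, 9.8, 11.0 }; // hypothetical seconds
        double[] eyeTimes   = { 11.5, 10.9, 14.0, 10.2, 10.7 };
        Console.WriteLine(Coefficient(mouseTimes, eyeTimes));  // r in [-1, 1]
    }
}
```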

Table 2 shows that the PCC [46] between mouse and eye input is 0.44, which indicates that the timing of eye input is related to that of the mouse and that eye input has acceptable values as an input.

TABLE 2: Pearson correlation between mouse and eye input


Table 3 shows that the PCC [46] between eye and eye-with-spacebar input is 0.54, which indicates that the timings of the two eye-based inputs are related and that both have acceptable values as inputs.

TABLE 3: Pearson correlation between eye and eye space input


Table 4 shows the regression statistics for the difference between the eye tracker and the mouse, generated using the SPSS statistical tool. All the results are significant for user timing compared with mouse input. These statistics indicate that eye input has slightly better performance than the mouse, which means that the game industry can use eye interaction techniques alongside mouse input. The P values show that eye movement has a significant effect on the user's interaction in the game relative to mouse input.

TABLE 4: Regression statistics for eye and mouse inputs

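For readers without SPSS, the following C# sketch shows the ordinary least-squares fit that such a regression table summarizes, using the standard closed-form slope and intercept; the data values are illustrative, not the study's measurements.

```csharp
// Ordinary least squares: fit y = intercept + slope * x over paired timings.
using System;
using System.Linq;

public static class SimpleRegression
{
    // Returns (slope, intercept) minimizing the sum of squared residuals.
    public static (double slope, double intercept) Fit(double[] x, double[] y)
    {
        double mx = x.Average(), my = y.Average();
        double sxy = x.Zip(y, (a, b) => (a - mx) * (b - my)).Sum();
        double sxx = x.Select(a => (a - mx) * (a - mx)).Sum();
        double slope = sxy / sxx;
        return (slope, my - slope * mx);
    }

    public static void Main()
    {
        double[] mouse = { 12.1, 10.4, 15.2, 9.8, 11.0 }; // hypothetical seconds
        double[] eye   = { 11.5, 10.9, 14.0, 10.2, 10.7 };
        var (b1, b0) = Fit(mouse, eye);
        Console.WriteLine($"eye = {b0:F2} + {b1:F2} * mouse");
    }
}
```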

Table 5 shows the regression statistics for the difference between eye tracker input and keyboard input; these results were also extracted from the SPSS statistical tool. All the results indicate that eye tracker input performs slightly worse in user timing than keyboard input, that is, eye tracker input is slightly slower than the keyboard. Even so, the game industry can use eye interaction techniques alongside keyboard input.

TABLE 5: Regression statistics for keyboard and eye tracker input


Descriptive statistics for the three game inputs (mean, standard error, median, standard deviation, sample variance, and confidence level) are given in Table 6. Fig. 4 illustrates the use of eye tracking: the ball speed changes according to eye focus. The results show better values for the eye tracker on some of the statistical factors. They indicate that users can use eye tracking, and the combination of eye tracking with the keyboard, to make the game more interactive and improve performance.

TABLE 6: Descriptive statistics for the three game input mouse, keyboard, and eye tracker


Fig. 4. Eye input for users.

Figs. 5-7 show the histograms of speed performance for the three different methods of game input: eye, mouse, and eye with keyboard.


Fig. 5. Players eye game input performance.


Fig. 6. Mouse game inputs performance.


Fig. 7. Eye and keyboard input performance.

5. RESULTS

An evaluation of Pearson correlation [46] and regression statistics was carried out on the performance measures for every game session to detect any large differences between the input modalities. Users performed well with the visual input; however, no sizable performance differences were found against either mouse or keyboard. For pointing tasks, for example, the user will frequently look at the target and then move the cursor only once a target is picked. With the eye pointer, by contrast, the cursor moves every time the user moves their eyes. This results in a large increase in the amount of feedback the player receives from the game, even when the player does not consciously carry out an explicit action.

Users also showed a strong preference for the eye tracker during play. We believe this is because of the reduced effort it takes to move the character when using the eyes. To complete the task, users had to make more than one cursor movement across the whole screen. When using the mouse, the user looks at the desired target and then acts explicitly with the mouse to move the cursor; with the eye pointer, simply looking at the desired spot shifted the cursor, removing the need for any hand motion. A participant commented, “I could explore freely with my gaze and only clicked the mouse when needed.” People naturally use eye movements to point when speaking with other people, and eye tracking allows the same visual cues to be extended into the digital world.

We believe that the difference in performance between the eye tracker and the mouse during the game is due to the latency that occurred when shooting at a target. Recall that it took about a second for a shot to reach the place at which it was fired. Users seemed to have a hard time “leading” the missiles by looking into the empty space in front of the moving target.

6. CONCLUSION

This paper has focused on introducing a new input for video games. Eye movement is one of the important inputs in video games: it increases the interactivity for the players and makes the game more interesting and challenging. A case study was conducted with 48 players (undergraduate BSc students) to test the game with different inputs: mouse, keyboard, mouse with keyboard, and eye movement. The outputs showed a significant effect on playing the game with respect to scoring points using eye input techniques, which adds value to using more inputs and makes the game more interactive. The game may slow down, but this depends on the game scenario. Finally, the results show a significant correlation among all the inputs: eye, mouse, and keyboard.

ACKNOWLEDGMENT

We would like to thank the University of Human Development for its usual support, the Head of the Computer Science Department at the University of Sulaimani, and all the students of the Computer Science Department who participated in playing the game.

REFERENCES

[1]. M. R. Mine. “Virtual environment interaction techniques.” UNC Chapel Hill Computer Science Technical Report, vol. 18, pp. 1-18, 1995.

[2]. B. Yuan, E. Folmer and F. C. Harris. “Game accessibility: A survey.” Universal Access in the Information Society, vol. 10, no. 1, pp. 81-100, 2011.

[3]. K. M. Roose and E. S. Veinott. “Understanding game roles and strategy using a mixed methods approach.” In: ACM Symposium on Eye Tracking Research and Applications. Association for Computing Machinery, New York, United States, pp. 1-5, 2021.

[4]. Z. Li, P. Guo and C. Song. “A review of main eye movement tracking methods.” Journal of Physics: Conference Series, vol. 1802, no. 4, 042066, 2021.

[5]. C. Biele. “Eye movement.” In: Human Movements in Human-Computer Interaction. Springer, Cham, pp. 23-37, 2022.

[6]. A. Goettker and K. R. Gegenfurtner. “A change in perspective: The interaction of saccadic and pursuit eye movements in oculomotor control and perception.” Vision Research, vol. 188, pp. 283-296, 2021.

[7]. D. Ivanchenko, K. Rifai, Z. M. Hafed and F. Schaeffel. “A low-cost, high-performance video-based binocular eye tracker for psychophysical research.” Journal of Eye Movement Research, vol. 14, no. 3, 3, 2021.

[8]. A. S. Almansouri. “Tracking eye movement using a composite magnet.” IEEE Transactions on Magnetics, vol. 58, no. 4, 3152085, 2022.

[9]. C. Jiménez-Rodríguez, L. Yélamos-Capel, P. Salvestrini, C. Pérez-Fernández, F. Sánchez-Santed and F. Nieto-Escámez. “Rehabilitation of visual functions in adult amblyopic patients with a virtual reality videogame: A case series.” Virtual Reality, vol. 2021, pp. 1-12, 2021.

[10]. N. Modi and J. Singh. “A review of various state of art eye gaze estimation techniques.” In: Advances in Computational Intelligence and Communication Technology. Springer, Germany, pp. 501-510, 2021.

[11]. L. E. Nacke, S. Stellmach, D. Sasse and C. A. Lindley. “Gameplay experience in a gaze interaction game.” arXiv, vol. 2010, pp. 49-54.

[12]. K. Saroha, S. Sharma and G. Bhatia. “Human computer interaction: An intellectual approach.” International Journal of Computer Science and Management Studies, vol. 11, no. 2, 2011.

[13]. S. Bakker, E. Van Den Hoven and B. Eggen. “Evaluating peripheral interaction design.” Human-Computer Interaction, vol. 30, no. 6, pp. 473-506, 2015.

[14]. R. Bernhaupt, A. Boldt, T. Mirlacher, D. Wilfinger and M. Tscheligi. “Using emotion in games: Emotional flowers.” ACM International Conference Proceedings Series, vol. 203, pp. 41-48, 2007.

[15]. M. Lankes, S. Riegler, A. Weiss, T. Mirlacher, M. Pirker and M. Tscheligi. “Facial expressions as game input with different emotional feedback conditions.” In: ACE ’08: Proceedings of the 2008 International Conference on Advances in Computer Entertainment Technology. Association for Computing Machinery, New York, United States, pp. 253-256, 2008.

[16]. A. Covaci, G. Ghinea, C. H. Lin, S. H. Huang and J. L. Shih. “Multisensory games-based learning - lessons learnt from olfactory enhancement of a digital board game.” Multimedia Tools and Applications, vol. 77, no. 16, pp. 21245-21263, 2018.

[17]. Y. A. Sekhavat and P. Nomani. “A comparison of active and passive virtual reality exposure scenarios to elicit social anxiety.” International Journal of Serious Games, vol. 4, no. 2, pp. 3-15, 2017.

[18]. A. M. Darwesh, M. I. Ghareb and S. Karimi. “Towards a serious game for Kurdish language learning.” Journal of University of Human Development, vol. 1, no. 3, pp. 376-384, 2015.

[19]. M. I. Ghareb. “HTML5, future to solve cross-platform issue in serious game development.” Journal of University of Human Development, vol. 2, no. 4, pp. 443-450, 2016.

[20]. L. Chittaro and R. Sioni. “Affective computing vs. affective placebo: Study of a biofeedback-controlled game for relaxation training.” International Journal of Human-Computer Studies, vol. 72, no. 8-9, pp. 663-673, 2014.

[21]. A. Ahmed and M. Ghareb. “Design a mobile learning framework for students in higher education.” Journal of University of Human Development, vol. 3, no. 1, 288, 2017.

[22]. P. Ekman, W. V. Friesen and P. Ellsworth. Emotion in the Human Face. Elsevier, Netherlands, 1972.

[23]. A. Kołakowska, A. Landowska, M. Szwoch, W. Szwoch and M. R. Wróbel. “Emotion recognition and its applications.” Advances in Intelligent Systems and Computing, vol. 300, pp. 51-62, 2014.

[24]. N. Wang and S. Marsella. “Introducing EVG: An emotion evoking game.” Lecture Notes in Computer Science, vol. 4133, pp. 282-291, 2006.

[25]. A. Landowska and M. R. Wrobel. “Affective reactions to playing digital games.” In: Proceedings - 2015 8th International Conference on Human System Interaction. IEEE, United States, pp. 264-270, 2015.

[26]. W. Szwoch. “Model of emotions for game players.” In: Proceedings - 2015 8th International Conference on Human System Interaction. IEEE, United States, pp. 285-290, 2015.

[27]. S. Bakker, E. Van Den Hoven, B. Eggen and K. Overbeeke. “Exploring peripheral interaction design for primary school teachers.” In: Proceedings of the 6th International Conference on Tangible, Embedded and Embodied Interaction, vol. 1, no. 212, pp. 245-252, 2012.

[28]. S. Bakker, E. Van Den Hoven and B. Eggen. “FireFlies: Supporting primary school teachers through open-ended interaction design.” In: Proceedings of the 24th Australian Computer-Human Interaction Conference. Association for Computing Machinery, New York, United States, pp. 26-29, 2012.

[29]. D. Hausen, H. Richter, A. Hemme and A. Butz. “Comparing input modalities for peripheral interaction: A case study on peripheral music control.” Lecture Notes in Computer Science, vol. 8119, no. 3, pp. 162-179, 2013.

[30]. R. J. K. Jacob. “What you look at is what you get: Eye movement-based interaction techniques.” In: Proceedings of the ACM CHI ’90 Conference, pp. 11-18, 1990.

[31]. R. J. K. Jacob. “The use of eye movements in human-computer interaction techniques: What you look at is what you get.” ACM Transactions on Information Systems, vol. 9, pp. 152-169, 1991.

[32]. R. J. K. Jacob. “Eye movement-based human-computer interaction techniques: Toward non-command interfaces.” In: H. R. Hartson and D. Hix (eds.), Advances in Human-Computer Interaction, vol. 4. Hindawi, United Kingdom, pp. 151-190, 1993.

[33]. R. J. K. Jacob. “What you look at is what you get: Using eye movements as computer input.” Proceedings of Virtual Reality Systems, vol. 93, pp. 164-166, 1993.

[34]. R. J. K. Jacob. “Eye tracking in advanced interface design.” In: Virtual Environments and Advanced Interface Design. Oxford University Press, Oxford, pp. 258-288, 1995.

[35]. L. E. Sibert and R. J. K. Jacob. “Evaluation of eye gaze interaction.” In: Conference on Human Factors in Computing Systems Proceedings. Association for Computing Machinery, New York, United States, pp. 281-288, 2000.

[36]. A. Murata, T. Miyake and M. Moriwaka. “Effectiveness of the menu selection method for eye-gaze input system.” Japanese Journal of Ergonomics, vol. 47, no. 1, pp. 20-30, 2011.

[37]. P. Isokoski and B. Martin. “Eye tracker input in first person shooter games.” In: Proceedings of the 2nd Conference on Communication by Gaze Interaction. Association for Computing Machinery, New York, United States, pp. 78-81, 2006.

[38]. P. Isokoski, A. Hyrskykari, S. Kotkaluoto and B. Martin. “Gamepad and eye tracker input in FPS games: Data for the first 50 min.” In: Proceedings of the 3rd Conference on Communication by Gaze Interaction. COGAIN, Denmark, pp. 1-5, 2007.

[39]. A. T. Duchowski. Eye Tracking Methodology: Theory and Practice. Springer-Verlag, London, UK, 2003.

[40]. J. D. Smith and T. C. N. Graham. “Use of eye movements for video game control.” In: Proceedings of the 2006 ACM SIGCHI International Conference on Advances in Computer Entertainment Technology. ACE, Hollywood, CA, 2006.

[41]. J. Leyba and J. Malcolm. “Eye tracking as an aiming device in a computer game.” In: Coursework (CPSC 412/612 Eye Tracking Methodology and Applications by A. Duchowski). Clemson University, Clemson, SC, 2004.

[42]. P. Isokoski and B. Martin. “Eye tracker input in first person shooter games.” In: Proceedings of the 2nd Conference on Communication by Gaze Interaction: Communication by Gaze Interaction - COGAIN 2006: Gazing into the Future. COGAIN, Turin, Italy, pp. 78-81, 2006.

[43]. P. Isokoski, M. Joos, O. Spakov and B. Martin. “Gaze controlled games.” Universal Access in the Information Society, vol. 8, pp. 323-337, 2009.

[44]. E. Lacorte, G. Bellomo, S. Nuovo, M. Corbo, N. Vanacore and P. Piscopo. “The use of new mobile and gaming technologies for the assessment and rehabilitation of people with ataxia: A systematic review and meta-analysis.” Cerebellum, vol. 20, no. 3, pp. 361-373, 2021.

[45]. M. Dorr, L. Pomarjanschi and E. Barth. “Gaze beats mouse: A case study.” PsychNology Journal, vol. 7, pp. 197-211, 2009.

[46]. J. Benesty, J. Chen, Y. Huang and I. Cohen. “Pearson correlation coefficient.” In: Noise Reduction in Speech Processing. Springer, Berlin, Heidelberg, pp. 1-4, 2009.