
Dynamic eye-tracking on large screens: a 3D printed adjustable guide rail platform

Published online by Cambridge University Press:  27 August 2025

Shivam Acharya
Affiliation:
Pennsylvania State University, USA
Lingyun He*
Affiliation:
Pennsylvania State University, USA
Farnaz Tehranchi
Affiliation:
Pennsylvania State University, USA

Abstract:

This paper presents a design solution to the problem of using eye trackers with large screens. Traditional eye trackers are limited to commercial, smaller-sized displays. However, as larger screens become increasingly popular and essential for a variety of tasks, their impact on user performance and behaviour requires further investigation. This work introduces an adjustable guide rail system that allows an eye tracker to move along with the user's head position. Testing shows that the system is robust, accurate, and functional under varying real-world conditions, making it well suited to Human-Computer Interaction and User Experience research. The guide rail design is easy to manufacture and incorporates 3D printed parts, making it easily reproducible and open to customization.

Information

Type
Article
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives licence (http://creativecommons.org/licenses/by-nc-nd/4.0/), which permits non-commercial re-use, distribution, and reproduction in any medium, provided the original work is unaltered and is properly cited. The written permission of Cambridge University Press must be obtained for commercial re-use or in order to create a derivative work.
Copyright
© The Author(s) 2025

1. Introduction

Eye tracking, defined as the accurate measurement of a person's gaze orientation, is an important tool for understanding human visual attention, cognitive processes, and interaction input. Gaze is the external, observable index of human visual attention (Zhiwei & Qiang, 2005). Eye-tracking technology primarily relies on cameras and infrared (IR) illuminators. These devices track a glint in the eye by capturing the eye's reflection of infrared light, using what is known as the Pupil Centre Corneal Reflection (PCCR) method (Guestrin & Eizenman, 2006). Eye-tracking devices can be divided into two broad categories: intrusive trackers and non-intrusive trackers. Intrusive trackers require users to wear special devices, such as reflective contact lenses or VR headsets. These trackers are highly accurate, maintain a fixed distance from the eye, and allow a wide range of head movement (David et al., 2023; Namnakani et al., 2023). For non-intrusive trackers, the sensor unit is fixed at a distance, and systems can be divided into short-distance (50-70 cm), medium-distance (100-200 cm), and long-distance (more than 300 cm) devices depending on their operating distance (Cho & Kim, 2013). Such devices are suitable for large-scale applications because the user does not need to wear any device (Namnakani et al., 2023). One of the most popular eye-tracking approaches is the screen-based system. By tracking how users interact with visual information on the screen, researchers are able to optimize content, enhance the user experience, and improve visual communication (Byrne et al., 2023; David et al., 2023). Eye-tracking applications also extend beyond interface design to security domains such as deception detection (Li et al., 2024; Wu et al., 2018) using physiological data (Hypšová et al., 2024) and phishing attack vulnerability assessment (Hussein, 2023).
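To make the PCCR idea concrete, the sketch below (not drawn from any specific tracker's internals) maps the glint-to-pupil-centre vector to screen coordinates through a polynomial fitted from calibration samples; the feature set and calibration-pair format are illustrative assumptions.

```python
import numpy as np

# Minimal PCCR-style sketch: the vector from the corneal glint to the pupil
# centre is mapped to screen coordinates with a polynomial fitted during
# calibration. Illustrative only; not the GP3 HD's internal algorithm.

def features(glint, pupil):
    dx, dy = pupil[0] - glint[0], pupil[1] - glint[1]
    return np.array([1.0, dx, dy, dx * dy, dx**2, dy**2])

def fit_mapping(samples):
    """samples: list of ((glint_xy, pupil_xy), screen_xy) calibration pairs."""
    A = np.array([features(g, p) for (g, p), _ in samples])
    B = np.array([s for _, s in samples])
    coeffs, *_ = np.linalg.lstsq(A, B, rcond=None)   # 6x2 mapping matrix
    return coeffs

def gaze_point(coeffs, glint, pupil):
    """Estimated (x, y) point of gaze on the screen for one eye image."""
    return features(glint, pupil) @ coeffs
```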

Figure 1. (a) Head yaw, roll and pitch movements significantly affect eye-trackers' accuracy on larger screens (Rozali et al., 2022), (b) Gazepoint GP3 HD eye-tracker, and (c) LG 47-inch LED HDTV used for testing

Modern eye-tracking technology, despite its rapid advancement, still faces several technical bottlenecks. The primary challenge stems from measurement uncertainties caused by the distance between tracking devices and users' eyes. Lighting variations, head movements, and individual differences in eye anatomy all affect tracking performance (David et al., 2023; Namnakani et al., 2023). Current technology can only support head movements within a limited range, and struggles in particular to track vertical and depth movements effectively. To address these challenges, researchers are actively developing machine learning-based algorithms to enhance system stability (Byrne, 2012; Byrne et al., 2023; David et al., 2023), while coordinate transformation techniques have proven effective in related tracking domains (Jiang et al., 2024). State-of-the-art technology combines depth cameras and pan-tilt-zoom sensors to expand tracking range and improve head and eye movement capture capabilities (Cho & Kim, 2013; Cullipher et al., 2018; Schwind et al., 2015).

This research proposes a novel eye-tracking solution specifically designed for large-screen applications. Eye tracking on large screens is critical to understanding how people interact with complex setups such as air traffic control interfaces (Wang et al., 2021). It provides valuable insights to optimize layouts, improve usability, and ensure ergonomic designs. Existing, cost-effective eye-tracking products are typically limited to displays of up to 27 inches, beyond which trackers lose accuracy or cannot detect users' eyes (Tobii, n.d.). Human vision is commonly divided into three zones: focus, the field of vision, and the peripheral field of vision. The field of vision is the portion of the visual environment that is visible to us in a three-dimensional view, and its horizontal extent is approximately 60° (Rutkowski & May, 2017). With screens and trackers operating 24-25 inches away from the human eye, a 27-inch screen is the maximum size that can be accommodated within this horizontal field of view. To address this limitation of current products, we have designed an adjustable guide rail that achieves broader coverage by allowing the tracking unit to move along a track. This research focuses on the key design elements of this rail system.
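The arithmetic behind this limit can be checked directly: the horizontal span covered by a 60° field of view at the recommended 24-25-inch viewing distance is roughly 29 inches, which accommodates the ~23.5-inch width of a 27-inch 16:9 display but not the ~41-inch width of the 47-inch screen used later in this work. A minimal sketch of the calculation, assuming 16:9 aspect ratios:

```python
import math

def visible_width(distance_in, fov_deg=60.0):
    """Horizontal span covered by a symmetric field of view at a given distance."""
    return 2.0 * distance_in * math.tan(math.radians(fov_deg / 2.0))

def screen_width(diagonal_in, aspect=(16, 9)):
    """Width of a display from its diagonal size and aspect ratio."""
    w, h = aspect
    return diagonal_in * w / math.hypot(w, h)

d = 25.0  # viewing distance in inches
print(f"Covered width at {d} in: {visible_width(d):.1f} in")   # ~28.9 in
print(f"27-inch screen width:  {screen_width(27):.1f} in")     # ~23.5 in (fits)
print(f"47-inch screen width:  {screen_width(47):.1f} in")     # ~41.0 in (exceeds coverage)
```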

While current eye-trackers on the market use fixed designs, and previous research has employed pan-tilt-zoom cameras to enhance sensor-unit adaptability (Cho & Kim, 2013), our approach further explores the possibilities of a moving camera. This design not only overcomes the limitations of static cameras but also contributes to the development of more versatile eye-tracking solutions. Through this innovation, we can support larger display screens while providing users with a greater range of head movement (Byrne et al., 2023; David et al., 2023).

This paper outlines a classical 5-stage design thinking approach.

Empathize – Understanding the User and the Problem Space: This stage focuses on gaining a deep understanding of eye tracking and its applications. We spent multiple weeks reading about and testing various screen-based eye-tracking technologies, identifying their limitations. Our findings revealed that static trackers struggle with accuracy on large screens due to restricted tracking range. While many commercial trackers claim high accuracy, real-world testing showed significant gaps, especially in large display setups. This phase also made it clear that industries like gaming, research, and assistive technology have different tracking needs, reinforcing the need for a flexible solution. As designers and engineers, we often default to conventional, one-size-fits-all solutions when confronted with challenges. This approach, though efficient in some cases, can overlook the unique needs and complexities of specific problems. The 5 stages of Design Thinking provide a structured framework that encourages deeper exploration. By empathizing with users, we can uncover insights that standard solutions are missing, allowing for more targeted and effective interventions. This process emphasizes the importance of understanding the problem in its entirety, leading to innovative solutions that are both user-centered and contextually relevant.

Define – Clearly Identifying the Main Problem: This step involves articulating the specific challenge faced in large-screen, non-intrusive eye tracking. The Introduction and Related Work sections define the problem, explaining why existing trackers fail and why a solution is necessary. We analyze prior research to establish the need for a dynamic tracking approach. A major takeaway was that most previous studies focused on small screens and static setups, meaning there was little guidance on designing for large screens with movement. Clearly defining the problem early also helped us avoid unnecessary design detours, as some of our initial ideas did not fully address the main tracking issue.

Ideate – Exploring Possible Solutions: The ideation phase focuses on brainstorming and testing concepts to address the problem. The Methodology section up to sub-section 3.4 details our experiments with the eye tracker’s real-world range and movement feasibility. We explored whether a moving tracker could improve accuracy and tested various implementation methods. One of the biggest lessons here was how useful quick, low-cost testing can be—before even building a prototype, simple setups with manual movement helped validate the idea. We also realized that some potential solutions looked great in theory but did not hold up in real-world testing, reinforcing the need for an iterative approach.

Prototype – Developing a Working Model: This step involves creating a functional prototype based on the chosen concept. Sub-sections 3.5 and 3.6 describe our process of designing and manufacturing a 3D-printed PLA guide rail system to enable controlled tracker movement; the guide rail assembly is illustrated in Figure 4, and a motor control mechanism is anticipated for future iterations. A key challenge here was balancing stability and smooth movement: early versions either introduced too much vibration or restricted flexibility. We went through multiple iterations, adjusting materials, positioning, and alignment to refine the system and get it working reliably.

Test – Evaluating Performance and Validating Accuracy: The testing phase assesses whether the prototype effectively improves eye-tracking accuracy. Our calibration test confirmed that a moving tracker enhances coverage and accuracy compared to a static system. However, additional tests are needed to evaluate its applicability in research and industrial settings. One lesson from this stage was that testing should be planned alongside prototyping, since we initially underestimated the complexity of calibrating a moving tracker. We also realized that factors like ease of setup, recalibration speed, and software integration are just as important as raw accuracy when designing something for real-world use.

2. Related work

Eye-tracking research has evolved significantly, encompassing various technological approaches and applications. This section reviews key developments in low-cost solutions, applications across different fields, and challenges specific to screen-based eye tracking. Recent years have seen progress in developing affordable eye tracking alternatives. Fuhl et al. (2016) conducted comprehensive evaluations of low-cost systems, developing an algorithm that demonstrated superior performance compared to existing methods, while noting the inevitable accuracy trade-offs compared to premium devices. A significant breakthrough came from Papoutsaki et al. (2016), who developed WebGazer, a webcam-based solution enhanced through user interaction. Building on this work, Steil et al. (2019) demonstrated that cost-effective solutions could maintain acceptable data quality while significantly improving accessibility.

The fundamental architecture of screen-based eye tracking involves specialized hardware and software systems designed to monitor user gaze during screen interaction (Jacob & Karn, 2003; Poole & Ball, 2006). Most modern systems employ the Pupil Center Corneal Reflection (PCCR) technique, which combines IR illumination and camera capture to calculate precise gaze points (Holmqvist et al., 2011). The accuracy of these systems heavily depends on proper calibration procedures throughout tracking sessions (Papoutsaki et al., 2016).

The increasing prevalence of large displays has introduced new challenges to eye-tracking technology (Lee et al., 2013). Current tracking systems face significant limitations, typically supporting screens only up to 27 inches, beyond which they encounter issues such as dead zones or compromised tracking accuracy (Using an eye tracker on a larger screen than recommended, 2016). Head movement poses a particular challenge, affecting both calibration stability and tracking precision (Zhiwei & Qiang, 2005). Researchers have proposed various solutions to address these limitations, including dynamic head mapping algorithms (Zhu & Ji, 2007), multi-camera configurations (Sheng-Wen & Jin, 2004), and innovative Pan-Tilt-Zoom camera systems (Cho et al., 2012). Persistent challenges remain, particularly in large-screen applications; our proposed work therefore builds upon these foundations while addressing the specific challenges of large-screen eye tracking through a mechanical solution. Most existing large-screen eye-tracking systems that use pan-tilt cameras attempt to compensate for the limitations of static trackers by dynamically adjusting the sensor's position. These setups rely on motorized PTZ cameras that physically move to keep the user's eyes within the tracking range as they shift their gaze. Cho & Kim (2013) proposed a long-range gaze tracking system using a PTZ camera, allowing continuous eye tracking even with significant head movements. Ohno & Mukawa (2004) explored a stereo-camera setup to triangulate gaze position, providing a wider tracking range. However, PTZ-based setups still struggle to compensate for large head movements, particularly in close-range interactions. To maintain tracking accuracy, these systems require the user to sit farther away from the screen, often at a distance that is impractical for real-world applications where users need to interact closely with on-screen content. This limitation makes PTZ cameras less effective for setups where natural head movements and close-range screen interaction are essential.

3. Designing an eye-tracking device

This section describes the design and implementation of our proposed dynamic eye-tracking system for large screens. We first analyze current market limitations, then propose our solution's components and testing methodology in detail. The ideation phase focused on figuring out how to expand the tracker's effective range without losing accuracy. Early tests involved manually repositioning the tracker to different points along the screen to see if localized calibration improved tracking. This led to the idea of a moving platform that could dynamically shift the tracker’s position to match different sections of the screen. Various movement mechanisms were considered, including motorized rails, articulated arms, and fixed-position multi-tracker setups, but each had trade-offs in complexity, cost, and usability. The final concept prioritized a simple, mechanically controlled guide rail that allowed precise repositioning without the need for excessive recalibration or expensive additional hardware.

3.1. System requirements analysis

Given the challenge of a limited viewing window, it became evident that covering the entire screen would require either multiple trackers or a single tracker capable of moving. Most commercial eye-trackers are static and optimized for 24-27-inch screens, with recommended viewing distances of 24-25 inches (David et al., 2023). When used with larger displays, these devices face significant challenges in maintaining accuracy, particularly because of the increased head movements required to view the entire screen (Namnakani et al., 2023). Our preliminary testing revealed that accuracy decreased significantly at the screen edges because head yaw, roll and pitch movements affect eye visibility (see Figure 1a).

3.2. Hardware components

For our design prototype, we selected the Gazepoint GP3 HD eye-tracker (shown in Figure 1b) due to its compact size (125 g), high accuracy (0.5-1 degree), and robust performance (Gazepoint Control User Manual Rev 2.0, 2019). Testing was conducted on an LG 47-inch LED HDTV (model 47LB5900, shown in Figure 1c), providing sufficient screen area beyond the tracker's optimal 24-inch window.

3.3. Screen setup

The screen used for the experiment is an LG 47-inch Class LED HDTV. Placing the tracker below the center of the screen and conducting multiple calibration cycles showed that the GP3 HD failed to calibrate in most iterations. In the two iterations that succeeded, with the head held still and the eyes strained toward the screen corners, tracking worked only within the 24-inch area above the device. Beyond this window, accuracy decreased significantly on the right and left sides of the screen, as head-yaw movements caused one eye to become less visible. Accuracy also diminished towards the top of the screen because head pitch movements limited eye visibility. Our software solution uses virtual display technology through Amyuni Technologies' secondary desktop package and OBS Studio for precise calibration (Siegle, 2024). This setup enables the projection of calibrated 24-inch windows onto specific screen segments, which can be individually calibrated, as illustrated in Figure 2. This allows us to check the tracker calibration for different sections of the screen.
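For completeness, gaze samples from Gazepoint Control can also be read programmatically over its Open Gaze API, a TCP/XML interface that by default listens on port 4242. The sketch below is a minimal client written under that assumption; the exact record fields (e.g. FPOGX/FPOGY) should be verified against the Open Gaze API documentation rather than taken from this paper.

```python
import re
import socket

HOST, PORT = "127.0.0.1", 4242   # assumed default Open Gaze API address of Gazepoint Control

with socket.create_connection((HOST, PORT)) as s:
    # Ask the server to stream fixation point-of-gaze records (per Open Gaze API docs).
    s.sendall(b'<SET ID="ENABLE_SEND_POG_FIX" STATE="1" />\r\n')
    s.sendall(b'<SET ID="ENABLE_SEND_DATA" STATE="1" />\r\n')
    buf = b""
    while True:
        chunk = s.recv(4096)
        if not chunk:            # connection closed by Gazepoint Control
            break
        buf += chunk
        while b"\r\n" in buf:
            line, buf = buf.split(b"\r\n", 1)
            m = re.search(rb'FPOGX="([^"]+)" FPOGY="([^"]+)"', line)
            if m:
                # Normalised gaze coordinates (0-1) within the calibrated window.
                x, y = float(m.group(1)), float(m.group(2))
                print(x, y)
```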

Figure 2. Window projection on a large screen can be adjusted using Gazepoint Control: (a) The native Gazepoint control software, (b) Open Broadcaster Software (OBS Studio), and (c) Small screen projection on a large screen

3.4. Design evolution and testing

Initial testing of our guide rail system showed that parallel horizontal movement alone was insufficient for accurate eye tracking when covering the lower half of the screen (Figure 3a): the tracker failed to maintain proper eye contact when it moved only parallel to the screen. We therefore reoriented the tracker to face the user (Figure 3b) and designed the motion to maintain a constant distance from the user's eyes (Brand et al., 2021). For the lower half of the 47-inch screen, the optimal configuration positioned the tracker at a height of 9 cm with a pitch of 32 degrees, allowing it to face the user directly in the test environment; this circular path, with the user's eyes as the centre of rotation, enabled tracking across the entire lower half. A height of 9 inches with a pitch of 22 degrees accurately tracked the upper half of the screen. The circular path required for the tracker movement is illustrated as the tracker path in Figure 3c; it is determined by the human field of view and by iterating the tracker's distance from the screen for the dimensions of the test environment.
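The tracker path can be described as an arc of constant radius centred on the user's eyes, with the mount always turned back toward the user. The sketch below illustrates this geometry; the eye position, radius, mount height, and sampled yaw angles are placeholders for the test-environment dimensions, not measured values from the study.

```python
import math

def tracker_pose(eye_xz, radius_in, yaw_deg, height_in, pitch_deg):
    """Pose of the tracker on an arc centred at the user's eyes.

    eye_xz:    (x, z) of the eye midpoint in the horizontal plane, in inches
    radius_in: constant eye-to-tracker distance along the arc
    yaw_deg:   angular position along the arc (0 = directly in front of the user)
    height_in, pitch_deg: mount height and upward tilt toward the eyes
    """
    ex, ez = eye_xz
    x = ex + radius_in * math.sin(math.radians(yaw_deg))
    z = ez + radius_in * math.cos(math.radians(yaw_deg))
    # The tracker yaws back toward the user so the eyes stay centred in its camera.
    return {"x": x, "z": z, "height": height_in, "pitch": pitch_deg, "face_yaw": -yaw_deg}

# Illustrative sweep for the lower half of the screen (9 cm ~ 3.5 in height, 32 deg pitch).
for yaw in (-20, -10, 0, 10, 20):
    print(tracker_pose((0.0, 0.0), radius_in=25.0, yaw_deg=yaw, height_in=3.5, pitch_deg=32))
```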

Figure 3. Design evolution: (a) Tracker position testing, (b) Tracker inclined to face the user, and (c) Tracker position and dynamic path based on field of view

3.5. Guide rail system design

The final design incorporates several innovative elements: (1) T-section base: follows the curved trajectory shown in Figure 4a and labelled as the tracker path. (2) Compliant mechanism: uses flexible beam-groove mechanisms (Ma & Chen, 2015) for vertical adjustment; a compliant snap-fit mechanism holds the pieces in place. (3) Torque hinge integration: provides precise pitch-angle control and supports the weight of the tracker to maintain eye contact.
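As a rough design check for the compliant snap-fit, a common first-order cantilever snap-fit relation (maximum strain ε ≈ 1.5·t·y/L² for a constant rectangular cross-section) can be compared against PLA's allowable strain. The dimensions below are illustrative placeholders, not the printed part's actual geometry.

```python
def snapfit_max_strain(thickness_mm, deflection_mm, length_mm):
    """First-order maximum strain at the root of a constant-section cantilever snap-fit."""
    return 1.5 * thickness_mm * deflection_mm / length_mm**2

# Placeholder dimensions; PLA's allowable strain is often taken to be around 1-2 %.
eps = snapfit_max_strain(thickness_mm=2.0, deflection_mm=1.5, length_mm=18.0)
print(f"max strain ~ {eps * 100:.2f} %")   # ~1.39 % for these placeholder values
```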

Figure 4. CAD model showing: (a) Guide rail base, Sleeve with compliant mechanisms and compliant snap-fit buckle, (b) Horizontal Rail and Final assembly

3.6. Manufacturing implementation

The prototype was manufactured using PLA 3D printing, chosen for its ability to produce complex geometries and its appropriate mechanical properties, such as tensile strength (Mayyas, 2022). Printing parameters were carefully considered to optimize the performance of the compliant mechanisms and to prevent layer delamination (Nugroho et al., 2018; van der Borg et al., 2023). The integrated system successfully tracks gaze across the entire 47-inch display by combining mechanical movement with precise calibration capabilities. Figure 5 shows the completed prototype of the tracking assembly.

Figure 5. Final prototype demonstration showing dynamic tracking capabilities

4. Analysis and findings

The objective in developing this device is to track gaze accurately across all screen regions. We developed a comprehensive testing approach to quantify tracking accuracy across different screen sections.

4.1. Validation methodology

While the Gazepoint software provides native calibration capabilities, it lacks quantitative accuracy metrics. We therefore developed a secondary validation test using virtual display technology (Siegle, 2024) to divide the 47-inch screen into six 24-inch sections (as shown in Figure 6a). For each section, users were positioned 25-26 inches from the screen and completed a 5-point calibration process.
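The section layout follows from the screen and window geometry: on an assumed 1920×1080 panel, a 24-inch 16:9 window is roughly 981×552 pixels, so three columns and two rows (with some horizontal overlap) cover the 47-inch display. A sketch of this layout computation, under those resolution and aspect-ratio assumptions:

```python
import math

SCREEN_DIAG_IN, SCREEN_RES = 47.0, (1920, 1080)   # assumed 16:9 panel resolution
WINDOW_DIAG_IN = 24.0

def wh_from_diag(diag_in, aspect=(16, 9)):
    """Width and height of a display from its diagonal size."""
    w, h = aspect
    d = math.hypot(w, h)
    return diag_in * w / d, diag_in * h / d

screen_w_in, _ = wh_from_diag(SCREEN_DIAG_IN)
px_per_in = SCREEN_RES[0] / screen_w_in
win_w, win_h = (round(v * px_per_in) for v in wh_from_diag(WINDOW_DIAG_IN))

# 3 columns x 2 rows of 24-inch windows; columns overlap since three windows exceed the width.
cols = [0, (SCREEN_RES[0] - win_w) // 2, SCREEN_RES[0] - win_w]
rows = [0, SCREEN_RES[1] - win_h]
sections = [(x, y, win_w, win_h) for y in rows for x in cols]
for i, (x, y, w, h) in enumerate(sections, 1):
    print(f"section {i}: x={x} y={y} w={w} h={h}")
```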

4.2. Testing protocol

The designed validation test required users to fixate on a red cross following a 5-point pattern (Figure 6b), maintaining focus for 10 seconds at each position. We defined Areas of Interest (AOIs) around each fixation point (Figure 6c): an inner AOI for precise targeting and an outer AOI for acceptable accuracy (Punde et al., 2017). For testing purposes, only the co-authors participated in this protocol to evaluate the feasibility of the proposed system, and an iterative testing process was followed to validate accuracy. The test scenario was designed to resemble the calibration phase of the eye-tracker.
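The five-point pattern and AOI layout can be expressed compactly: the sketch below generates centre and corner targets within one calibrated window and square inner/outer AOIs around each target. The margins and AOI sizes are illustrative assumptions, not the values used in the study.

```python
def five_point_pattern(win_w, win_h, margin=0.1):
    """Centre plus four corner targets within one calibrated window, in pixels."""
    lo, hi = margin, 1.0 - margin
    pts = [(0.5, 0.5), (lo, lo), (hi, lo), (lo, hi), (hi, hi)]
    return [(round(px * win_w), round(py * win_h)) for px, py in pts]

def aoi_rects(cx, cy, inner_px=100, outer_px=220):
    """Square inner and outer AOIs centred on a fixation target (sizes are placeholders)."""
    def rect(size):
        return (cx - size // 2, cy - size // 2, size, size)
    return rect(inner_px), rect(outer_px)

# Example for a ~981x552-pixel calibrated window.
for cx, cy in five_point_pattern(981, 552):
    print((cx, cy), aoi_rects(cx, cy))
```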

Figure 6. (a) Division of the 47-inch screen into six 24-inch calibration sections used for the validation methodology, (b) the five-point validation test pattern used in the testing protocol, and (c) the inner and outer AOIs defined for each fixation point

4.3. Accuracy analysis

Accuracy was measured using two primary metrics: Fixation Precision Ratio (Number of fixations in inner AOI versus total fixations) and Time Precision Ratio (Viewing duration in inner AOI versus total viewing time). Results across all six screen sections demonstrated consistently high accuracy. The comprehensive results (Table 1) indicate that our dynamic tracking system maintains high accuracy across the entire 47-inch screen, with most measurements achieving precision ratios above 0.9. The results suggest that the system provides reliable gaze-tracking data across the entire screen area, making it suitable for large-screen applications.
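Both ratios are straightforward to compute from a list of fixations once the inner AOI of each target is known. The sketch below shows the calculation under an assumed fixation record format (screen-pixel coordinates plus duration); it is not the analysis script used for Table 1.

```python
from dataclasses import dataclass

@dataclass
class Fixation:
    x: float          # fixation coordinates in screen pixels
    y: float
    duration_s: float

def in_rect(f, rect):
    x0, y0, w, h = rect
    return x0 <= f.x <= x0 + w and y0 <= f.y <= y0 + h

def precision_ratios(fixations, inner_aoi):
    """Fixation Precision Ratio and Time Precision Ratio for one fixation target."""
    inner = [f for f in fixations if in_rect(f, inner_aoi)]
    fix_ratio = len(inner) / len(fixations)
    time_ratio = sum(f.duration_s for f in inner) / sum(f.duration_s for f in fixations)
    return fix_ratio, time_ratio

# Example: fixations recorded while the user held a 10-second gaze on one target.
fxs = [Fixation(955, 530, 2.4), Fixation(961, 538, 6.8), Fixation(1040, 600, 0.8)]
print(precision_ratios(fxs, inner_aoi=(910, 490, 100, 100)))   # (0.667, 0.92)
```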

Table 1. Accuracy of the left, centre, and right sections for the top and bottom test results

5. Conclusions and future work

In this work, our primary focus was on assessing the feasibility and viability of our approach, specifically by testing the calibration phase. Since a failure at this stage would render further research impractical, we prioritized this aspect. However, we recognize the importance of broader testing and plan to design a study incorporating behavioural research questions to explore our design further.

Following the 5-stage design thinking process, we saw firsthand that design thinking is not strictly linear; we had to revisit earlier stages multiple times based on new findings. One of the biggest takeaways was the importance of validating core assumptions early instead of waiting until the prototyping stage to determine feasibility. This project also reinforced the need for a user-centered approach, since considering different industries helped refine our final design decisions. For anyone following a similar process, iterating quickly and testing often will help avoid wasted effort and lead to a more effective final solution.

We designed a prototype of an adjustable platform for the Gazepoint GP3 HD tracker that demonstrates the potential for large-screen applications while highlighting several areas for future development. Our work addresses the need for accurate and non-intrusive eye tracking on large screens, a challenge current static systems fail to effectively overcome. Focusing on a user-centered design ideology, the solution addresses adaptability and user comfort in diverse scenarios. Through iterative testing, the prototype demonstrated accurate performance, validating the concept of dynamic gaze tracking. Our approach not only resolves existing limitations but also aligns with the design principles of scalability and modularity, ensuring that the system can evolve with future technological advancements. The result contributes to areas such as education, accessibility, and gaming, offering a robust foundation for more interactive and inclusive user experiences.

This study focused on validating the mechanical feasibility of the guide rail design. While motorization was considered for future iterations to enable dynamic adjustments, compliant structures were used for position holding to simplify initial testing. By proving the guide rail's effectiveness, future developments can integrate motorization, building on this foundation to optimize performance and enable precise position tracking. Future iterations could incorporate automation through stepper motors for precise motion control (Bendjedia et al., 2012), alongside a central wide-angle camera for face detection and tracker positioning, building on existing pan-tilt-zoom camera approaches (Cho & Kim, 2013). The persistent challenge of head movement-induced accuracy degradation in static eye trackers (Hermens, 2015) could be addressed through dynamic head mapping techniques (Zhu & Ji, 2007) or recent innovations in Kappa-angle-based automatic calibration systems (Liu et al., 2023).
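As one possible direction, a future motorized version could map the face position reported by a central wide-angle camera to a target position along the arc and then to stepper steps. The sketch below outlines that mapping; the step count, microstepping, drive ratio, and camera field of view are all placeholder values rather than a specified design.

```python
import math

STEPS_PER_REV = 200          # typical 1.8-degree stepper (placeholder)
MICROSTEPPING = 16           # driver microstepping factor (placeholder)
RAIL_DRIVE_RATIO = 4.0       # motor revolutions per revolution of the arc (placeholder)

def face_offset_to_arc_angle(face_x_px, frame_w_px, camera_hfov_deg=90.0):
    """Approximate user yaw relative to the camera's optical axis from the face position."""
    norm = (face_x_px - frame_w_px / 2) / (frame_w_px / 2)   # -1 .. 1 across the frame
    return norm * camera_hfov_deg / 2

def arc_angle_to_steps(angle_deg):
    """Convert a target arc angle into a signed stepper step count."""
    revs = (angle_deg / 360.0) * RAIL_DRIVE_RATIO
    return round(revs * STEPS_PER_REV * MICROSTEPPING)

# Face detected slightly left of centre in a 1280-px-wide frame -> small negative move.
target = arc_angle_to_steps(face_offset_to_arc_angle(face_x_px=540, frame_w_px=1280))
print(f"command stepper to move {target} steps")
```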

While alternatives like glasses-based tracking (Niehorster et al., 2020) and VR headsets (Clay et al., 2019) can address head movement issues with large screens, and VR environments offer controlled research conditions (Blascovich et al., 2002; Li et al., 2021), screen-based tracking remains essential in specific industries. New methods have also been developed that use VR headsets to collect eye movements and transfer 2D experiments into 3D, collecting more accurate data and sidestepping the large-screen and multi-screen environment problem (Bagherzadeh & Tehranchi, 2024a, 2024b), but their applications still need to be investigated.

This research provides a foundation for future developments in dynamic eye-tracking systems, particularly for applications where traditional static trackers prove insufficient. The findings suggest that, despite emerging alternatives, screen-based systems maintain their importance in specific industrial applications, validating further development of dynamic tracking solutions. Sectors such as air traffic control and mining rely on traditional display interfaces because of safety requirements and existing infrastructure, showing the continued relevance of our dynamic tracking solution. Because the solution is based on 3D printing, it is easy to manufacture, and its printed parts make it easily reproducible and open to customization.

References

Bagherzadeh, A., & Tehranchi, F. (2024a). Computer-based experiments in VR: A virtual reality environment to conduct experiments, collect participants data and cognitive modeling in VR. Proceedings of the 22nd International Conference on Cognitive Modelling, Tilburg, Netherlands.
Bagherzadeh, A., & Tehranchi, F. (2024b). Extending VRAT: From 3D eye tracking visualization to enabling ACT-R to interact with virtual reality environments. Proceedings of the 17th International Conference on Social Computing, Behavioral-Cultural Modeling & Prediction and Behavior Representation in Modeling and Simulation, Pittsburgh, PA.
Bendjedia, M., Ait-Amirat, Y., Walther, B., & Berthon, A. (2012). Position control of a sensorless stepper motor. IEEE Transactions on Power Electronics, 27(2), 578-587. https://doi.org/10.1109/TPEL.2011.2161774
Blascovich, J., Loomis, J., Beall, A. C., Swinth, K. R., Hoyt, C. L., & Bailenson, J. N. (2002). Immersive virtual environment technology as a methodological tool for social psychology. Psychological Inquiry, 13(2), 103-124. https://doi.org/10.1207/S15327965PLI1302_01
Brand, J., Diamond, S. G., Thomas, N., & Gilbert-Diamond, D. (2021). Evaluating the data quality of the Gazepoint GP3 low-cost eye tracker when used independently by study participants. Behavior Research Methods, 53(4), 1502-1514. https://doi.org/10.3758/s13428-020-01504-2
Byrne, M. D. (2012). Unified theories of cognition. Wiley Interdisciplinary Reviews: Cognitive Science, 3(4), 431-438.
Byrne, S. A., Castner, N., Kastrati, A., Płomecka, M. B., Schaefer, W., Kasneci, E., & Bylinskii, Z. (2023). Leveraging eye tracking in digital classrooms: A step towards multimodal model for learning assistance. Proceedings of the 2023 Symposium on Eye Tracking Research and Applications, Tübingen, Germany. https://doi.org/10.1145/3588015.3589197
Cho, D. C., & Kim, W. Y. (2013). Long-range gaze tracking system for large movements. IEEE Transactions on Biomedical Engineering, 60(12), 3432-3440. https://doi.org/10.1109/TBME.2013.2266413
Cho, D. C., Yap, W. S., Lee, H., Lee, I., & Kim, W. Y. (2012). Long range eye gaze tracking system for a large screen. IEEE Transactions on Consumer Electronics, 58(4), 1119-1128. https://doi.org/10.1109/TCE.2012.6414976
Clay, V., König, P., & Koenig, S. (2019). Eye tracking in virtual reality. Journal of Eye Movement Research, 12. https://doi.org/10.16910/jemr.12.1.3
Compare Tobii screen-based eye trackers. (n.d.). Retrieved May 20, from https://www.tobii.com/products/eye-trackers/screen-based/compare-eye-trackers
Cullipher, S., Hansen, S. J. R., & VandenPlas, J. R. (2018). Eye tracking as a research tool: An introduction. In Eye Tracking for the Chemistry Education Researcher (Vol. 1292, pp. 1-9). American Chemical Society. https://doi.org/10.1021/bk-2018-1292.ch001
David, E., Gutiérrez, J., Vo, M. L.-H., Coutrot, A., Silva, M. P. D., & Callet, P. L. (2023). The Salient360! toolbox: Processing, visualising and comparing gaze data in 3D. Proceedings of the 2023 Symposium on Eye Tracking Research and Applications, Tübingen, Germany. https://doi.org/10.1145/3588015.3588406
Fuhl, W., Tonsen, M., Bulling, A., & Kasneci, E. (2016). Pupil detection for head-mounted eye tracking in the wild: An evaluation of the state of the art. Machine Vision and Applications, 27(8), 1275-1288. https://doi.org/10.1007/s00138-016-0776-4
Gazepoint Control User Manual Rev 2.0. (2019). Clemson University. http://andrewd.ces.clemson.edu/courses/cpsc412/manuals/Gazepoint%20Control.pdf
Guestrin, E. D., & Eizenman, M. (2006). General theory of remote gaze estimation using the pupil center and corneal reflections. IEEE Transactions on Biomedical Engineering, 53(6), 1124-1133. https://doi.org/10.1109/TBME.2005.863952
Holmqvist, K., Nyström, M., Andersson, R., Dewhurst, R., Jarodzka, H., & van de Weijer, J. (2011). Eye tracking: A comprehensive guide to methods and measures.
Hussein, N. (2023). Eye-tracking in association with phishing cyber attacks: A comprehensive literature review. Learning, 10(11).
Hypšová, P., Seitl, M., Popelka, S., & Dostál, D. (2024). Infrared thermal imaging and eye-tracking for deception detection: A laboratory study. Current Psychology, 43(43), 33239-33251.
Jacob, R. J. K., & Karn, K. S. (2003). Eye tracking in human-computer interaction and usability research: Ready to deliver the promises. In J. Hyönä, R. Radach, & H. Deubel (Eds.), The Mind's Eye (pp. 573-605). North-Holland. https://doi.org/10.1016/B978-044451020-4/50031-1
Jiang, T., Liu, L., Jiang, J., Zheng, T., Jin, Y., & Xu, K. (2024). Trajectory tracking using frenet coordinates with deep deterministic policy gradient. arXiv preprint arXiv:2411.13885.
Lee, H. C., Lee, W. O., Cho, C. W., Gwon, S. Y., Park, K. R., Lee, H., & Cha, J. (2013). Remote gaze tracking system on a large display. Sensors, 13(10), 13439-13463.
Li, F., Lee, C. H., Feng, S., Trappey, A., & Gilani, F. (2021, May 5-7). Prospective on eye-tracking-based studies in immersive virtual reality. 2021 IEEE 24th International Conference on Computer Supported Cooperative Work in Design (CSCWD).
Li, P., Abouelenien, M., Mihalcea, R., Ding, Z., Yang, Q., & Zhou, Y. (2024). Deception detection from linguistic and physiological data streams using bimodal convolutional neural networks. 2024 5th International Conference on Information Science, Parallel and Distributed Systems (ISPDS).
Liu, Y., Deng, G., Xu, Z., Li, Y., Zheng, Y., Zhang, Y., & Liu, Y. (2023). Jailbreaking ChatGPT via prompt engineering: An empirical study. arXiv, abs/2305.13860.
Ma, F., & Chen, G. (2015). Modeling large planar deflections of flexible beams in compliant mechanisms using chained beam-constraint-model. Journal of Mechanisms and Robotics, 8(2). https://doi.org/10.1115/1.4031028
Mayyas, M. (2022). Design characterization of 3D printed compliant gripper. Meccanica, 57(3), 723-738. https://doi.org/10.1007/s11012-022-01474-z
Namnakani, O., Sinrattanavong, P., Abdrabou, Y., Bulling, A., Alt, F., & Khamis, M. (2023). GazeCast: Using mobile devices to allow gaze-based interaction on public displays. Proceedings of the 2023 Symposium on Eye Tracking Research and Applications, Tübingen, Germany. https://doi.org/10.1145/3588015.3589663
Niehorster, D. C., Hessels, R. S., & Benjamins, J. S. (2020). GlassesViewer: Open-source software for viewing and analyzing data from the Tobii Pro Glasses 2 eye tracker. Behavior Research Methods, 52(3), 1244-1253. https://doi.org/10.3758/s13428-019-01314-1
Nugroho, A., Ardiansyah, R., Isna, L., & Larasati, I. (2018). Effect of layer thickness on flexural properties of PLA (PolyLactid Acid) by 3D printing. Journal of Physics: Conference Series, 1130, 012017. https://doi.org/10.1088/1742-6596/1130/1/012017
Ohno, T., & Mukawa, N. (2004). A free-head, simple calibration, gaze tracking system that enables gaze-based interaction. Proceedings of the 2004 Symposium on Eye Tracking Research & Applications, San Antonio, Texas. https://doi.org/10.1145/968363.968387
Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., & Hays, J. (2016). WebGazer: Scalable webcam eye tracking using user interactions. Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence, New York, NY, USA.
Poole, A., & Ball, L. (2006). Eye tracking in human-computer interaction and usability research: Current status and future prospects. In Encyclopedia of Human Computer Interaction (pp. 211-219).
Punde, P. A., Jadhav, M. E., & Manza, R. R. (2017, October 5-6). A study of eye tracking technology and its applications. 2017 1st International Conference on Intelligent Systems and Information Management (ICISIM).
Rozali, R., Fadilah, S., Mohd Shariff, A. R., Mohd Zaini, K., Karim, F., Abd Wahab, M. H., & Shibghatullah, A. (2022). Driver drowsiness detection and monitoring system (DDDMS). International Journal of Advanced Computer Science and Applications, 13. https://doi.org/10.14569/IJACSA.2022.0130691
Rutkowski, P., & May, C. A. (2017). The peripheral and central Humphrey visual field – Morphological changes during aging. BMC Ophthalmology, 17(1), 127. https://doi.org/10.1186/s12886-017-0522-3
Schwind, V., Pohl, N., & Bader, P. (2015). Accuracy of a low-cost 3D-printed head-mounted eye tracker. In Mensch und Computer 2015 – Tagungsband (pp. 259-262). De Gruyter. https://doi.org/10.1515/9783110443929-028
Sheng-Wen, S., & Jin, L. (2004). A novel approach to 3-D gaze tracking using stereo cameras. IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), 34(1), 234-245. https://doi.org/10.1109/TSMCB.2003.811128
Siegle, D. (2024). Enhancing student video and YouTube streaming with Open Broadcaster Software Studio. Gifted Child Today, 47(1), 65-73. https://doi.org/10.1177/10762175231205905
Steil, J., Hagestedt, I., Huang, M. X., & Bulling, A. (2019). Privacy-aware eye tracking using differential privacy. Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications, Denver, Colorado. https://doi.org/10.1145/3314111.3319915
Using an eye tracker on a larger screen than recommended. (2016). Tobii Help Center. https://help.tobii.com/hc/en-us/articles/209528789-Using-an-eye-tracker-on-a-larger-screen-than-recommended
van der Borg, G., Warner, H., Ioannidis, M., van den Bogaart, G., & Roos, W. H. (2023). PLA 3D printing as a straightforward and versatile fabrication method for PDMS molds. Polymers, 15(6), 1498.
Wang, Y., Wang, L., Lin, S., Cong, W., Xue, J., & Ochieng, W. (2021). Effect of working experience on air traffic controller eye movement. Engineering, 7(4), 488-494.
Wu, Z., Singh, B., Davis, L., & Subrahmanian, V. (2018). Deception detection in videos. Proceedings of the AAAI Conference on Artificial Intelligence.
Zhiwei, Z., & Qiang, J. (2005, June 20-25). Eye gaze tracking under natural head movements. 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'05).
Zhu, Z., & Ji, Q. (2007). Novel eye gaze tracking techniques under natural head movement. IEEE Transactions on Biomedical Engineering, 54(12), 2246-2260. https://doi.org/10.1109/TBME.2007.895750