
Advancing indoor positioning systems: innovations, challenges, and applications in mobile robotics

Published online by Cambridge University Press:  27 June 2025

Rushikesh A. Deshmukh
Affiliation:
Department of Electronics Engineering, Ramdeobaba University, Nagpur, India
Meghana A. Hasamnis
Affiliation:
Department of Electronics Engineering, Ramdeobaba University, Nagpur, India
Madhusudan B. Kulkarni*
Affiliation:
Department of Electronics and Communication Engineering, Manipal Institute of Technology, Manipal Academy of Higher Education (MAHE), Manipal, India
Manish Bhaiyya*
Affiliation:
Department of Chemical Engineering, Russell Berrie Nanotechnology Institute, Technion, Israel Institute of Technology, Haifa, Israel
*
Corresponding authors: Madhusudan B. Kulkarni; Email: madhusudan.kulkarni@manipal.edu, Manish Bhaiyya; Email: bhaiyya.manush@gmail.com

Abstract

Indoor positioning systems (IPS) are essential for mobile robot navigation in environments where global positioning systems (GPS) are unavailable, such as hospitals, warehouses, and intelligent infrastructure. While current surveys often limit themselves to specific technologies or omit practical, application-specific details, this review summarizes IPS developments directed specifically towards mobile robotics. It examines and compares a breadth of approaches spanning non-radio frequency, radio frequency, and hybrid sensor fusion systems, through the lens of performance metrics that include accuracy, delay, scalability, and cost. Distinctively, this work explores emerging innovations, including synthetic aperture radar (SAR), federated learning, and privacy-aware AI, which are reshaping the IPS landscape. The motivation stems from the increasing complexity and dynamic nature of indoor environments, where high-precision, real-time localization is essential for safety and efficiency. This literature review provides a new conceptual, cross-domain pathway for research and implementation of IPS in mobile robotics, addressing both technical and application-related challenges in healthcare, industry, and smart cities. The findings equip early-career researchers, industry practitioners, and stakeholders to integrate IPS with AI and IoT safely and at scale, delivering societal, human, and economic benefits.

Information

Type
Review Article
Creative Commons
Creative Commons License - CC BY
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© The Author(s), 2025. Published by Cambridge University Press

1. Introduction

In an increasingly automated and intelligent world, mobile robots are transforming industries, enhancing services, and redefining human-machine interactions, from optimizing logistics in warehouses to providing critical assistance in healthcare scenarios. These adaptable machines drive substantial advancements across many fields [Reference Semborski and Idzkowski1, Reference Misaros, Stan, Donca and Miclea2]. The indoor positioning system (IPS) underpins all of this. Unlike outdoor robots that rely on global navigation satellite systems (GNSS), mobile robots operating indoors face serious challenges: they must navigate complex, dynamic, and often cluttered environments with precise and reliable performance, which compounds the need for scalable and effective IPS solutions. The steady evolution of these systems therefore holds great promise for intelligent automation in industries where traditional navigation systems would fail [Reference Misra, Agrawal, Misra, Hussain and Di Sia3, Reference Tong, Liu and Zhang4].

Indoor positioning systems represent a group of technologies that estimate the location and orientation of a mobile robot with respect to its environment. They increase robots' autonomy and enable a variety of applications in manufacturing, retail, public safety, and smart infrastructure. However, the path to reliable indoor positioning is filled with challenges [Reference Basiri, Lohan, Moore, Winstanley, Peltola, Hill, Amirian and Figueiredo e Silva5, Reference Hailu, Guo, Si, Li and Zhang6]. Signal obstruction, multi-path effects, variations in environmental conditions, and differences in deployment scenarios together create an extremely challenging problem space for researchers and engineers alike. Such constraints drove IPS from simple systems based on visual markers or radio frequency tags toward sophisticated solutions that fuse multiple sensors, machine-learning algorithms, and advanced computational models [Reference Tang, Zhou, Zhong, Liu and Li7, Reference Fan, Sun, Sun, Wu and Zhuang8]. One of IPS's most attractive features is its interdisciplinary nature, bridging robotics, computer vision, signal processing, and artificial intelligence (AI). For example, Light Detection and Ranging (LiDAR) and camera-based simultaneous localization and mapping (SLAM) systems utilize the latest vision algorithms to create detailed maps of their environment so that robots can traverse it very precisely [Reference Chghaf, Rodriguez and Ouardi9, Reference Zhu, Li and Zhang10]. In addition, radio-frequency technologies such as Ultra-Wideband (UWB) and Radio-Frequency Identification (RFID) have given rise to low-cost, scalable solutions that are becoming common in logistics and retail environments.
However, each of these approaches carries its own advantages and disadvantages, providing constant motivation for innovation to resolve the problems that remain open [Reference Al-Okby, Junginger, Roddelkopf and Thurow11].

Indoor localization is more demanding than outdoor localization due to the generally dynamic and unpredictable nature of indoor environments. While GPS signals operate comparatively smoothly in open spaces, their effectiveness indoors is severely restricted by walls, ceilings, and other structural hindrances [Reference Ullah, Adhikari, Khan, Anwar, Ahmad and Bai12Reference Ullah, Su, Zhu, Zhang, Choi and Hou14]. The higher density of objects and human activity in indoor environments also requires Indoor Positioning Systems (IPS) to account for constant changes in the surroundings. Hence, real-time accuracy becomes an essential requirement, because any delay or error in localization may cascade into problems in navigation and task execution. Examples in healthcare justify the demand for high precision. In a hospital, mobile robots delivering medications or sterilized equipment must navigate winding corridors and avoid colliding with patients and staff. In industrial settings, robots maneuvering between racks in aisles need centimeter-level accuracy for proper work and safety. These examples show that IPS is not merely a facilitator of robotic mobility but an interdependent component in the success of many applications [Reference Peng, Liu, Wang, Xiang and Chen15Reference Lee, Chu, Sie, Lin, Hong and Huang17].

The evolution of IPS merges technological advances from various domains. Vision-based systems, for example, have undergone a remarkable transformation by incorporating deep learning models capable of semantic segmentation and scene recognition. Neural networks allow a robot to build a map of its environment and to identify and categorize the objects inside it, providing a degree of contextual awareness [Reference Alotaibi, Alatawi, Binnouh, Duwayriat, Alhmiedat and Alia18, Reference Ni, Chen, Tang, Shi, Cao and Shi19]. This capability is highly valuable when interacting with specific objects, as in industrial pick-and-place tasks. Radio-based systems, on the other hand, have tapped into innovations in signal processing and hardware miniaturization. Cost-effective UWB and Bluetooth systems can be produced in high volumes and deployed in many environments, such as retail shops and airports. Although scalability and inexpensive installation make them attractive, many fall short in settings with high electromagnetic activity, which interferes with their operation and degrades signals [Reference Che, Ahmed, Lazaridis, Sureephong and Alade20]. To overcome these limitations, researchers are developing hybrid systems in which radio-based localization works together with vision or inertial measurement units (IMUs) to boost accuracy and reliability. New technologies, such as synthetic aperture radar (SAR) and LiDAR, are expanding the scope of what IPS might accomplish. While SAR is traditionally used in aerospace and defense, it has found new applications in mobile robotics, supporting high-resolution mapping in low-visibility environments [Reference Yin, Xu, Lu, Chen, Xiong, Shen, Stachniss and Wang21]. LiDAR continues to underpin indoor navigation, providing complex 3D maps that robots can navigate precisely. However, the prohibitive cost and computational complexity of LiDAR sensors remain a key barrier to wide adoption, particularly in cost-sensitive applications [Reference Khan, Cheng, Uchiyama, Ali, Asshad and Kiyokawa22Reference Di Stefano, Chiappini, Gorreja, Balestra and Pierdicca25].

Although new technologies have pushed the limits of IPS performance, this development is not without its challenges. One of the main issues is balancing accuracy, scalability, and cost [Reference Gerwen, Geebelen, Wan, Joseph, Hoebeke and De Poorter26, Reference Long, Xiang, Lei, Li, Hu and Dai27]. High-precision systems like LiDAR and SAR frequently entail large expenses and heavy processing demands, rendering them impractical for many applications. Conversely, less expensive solutions, often based on Wi-Fi or Bluetooth, suffer from poor accuracy and reliability, especially in environments with heavy interference or mobile obstacles [Reference Kim Geok, Zar Aung, Sandar Aung, Thu Soe, Abdaziz, Pao Liew, Hossain, Tso and Yong28]. A further challenge is the integration of multiple sensing modalities. Hybrid systems, which blend visual, inertial, and radio-based sensors, hold great promise for mitigating the limitations of each individual technology. Nevertheless, seamless integration relies on sophisticated sensor fusion algorithms that process and reconcile data from highly disparate sources in real time [Reference He and Chan29]. This is a nontrivial task, particularly because each sensor type has its own noise characteristics, biases, and latency issues.

The role of machine intelligence in overcoming these challenges cannot be overstated. Machine learning algorithms, notably reinforcement learning and federated learning, introduce new capabilities to the IPS field. Such systems can leverage past data to adapt to changing environments, improving their robustness and accuracy over time. With federated learning, multiple collaborating machines can jointly improve localization ability while preserving data privacy, a key consideration in IPS applications such as healthcare and retail [Reference Villacrés, Zhao, Braun and Li30Reference Na, Rouček, Ulrich, Pikman, Krajník, Lennox and Arvin32]. The evolution of IPS will open further opportunities for economic change across industries and redefine how humans relate to robots [Reference Gonçalves, de Caldas Filho, Martins, Kfouri, Dutra, Albuquerque and de Sousa33]. The synergy between IPS and the greater IoT ecosystem could lead to possibilities we might not have imagined before, where robots, innovative technologies, and even humans collaborate [Reference Kumar, Abhishek, Ghalib, Shankar and Cheng34Reference Farahsari, Farahzadi, Rezazadeh and Bagheri36]. In smart cities, IPS would enable delivery robots to cooperatively find optimal routes based on current traffic and environmental data; similar ideas extend to healthcare, where IPS, along with wearable devices, could support advanced patient monitoring and assistance [Reference Andreu-Perez, Leff, Ip and Yang37, Reference Guk, Han, Lim, Jeong, Kang, Lim and Jung38]. Nonetheless, much work remains. IPS implementations must be established safely and ethically, protecting data privacy and security. Moreover, to encourage the broadest possible industry adoption and smooth coexistence among systems, standards and benchmarks must be developed to evaluate IPS performance.

In contrast to existing literature, which predominantly offers technology-specific or siloed perspectives, this review provides a comprehensive and integrative synthesis of Indoor Positioning Systems (IPS) explicitly tailored to the needs of mobile robotics. Prior works, such as Rekkas et al. [Reference Rekkas, Iliadis, Sotiroudis, Boursianis, Sarigiannidis, Plets, Joseph, Wan, Christodoulou, Karagiannidis and Goudos39] focus narrowly on AI methodologies in Visible Light Positioning (VLP) systems, overlooking other modalities like SLAM or SAR, while Liu et al. [Reference Liu, Guo and Wei40] discuss indoor VLC systems in a generic context without technical depth on mobile robot adaptability or cross-modal fusion. Similarly, Panigrahi et al. [Reference Panigrahi and Bisoy41] offer a structured review of localization strategies using SLAM and probabilistic methods but exclude recent advancements in AI, edge computing, or federated architectures. Tan et al. [Reference Kim Geok, Zar Aung, Sandar Aung, Thu Soe, Abdaziz, Pao Liew, Hossain, Tso and Yong28] address RF-based IPS techniques but do not cover non-RF solutions and lack a robotic-centric outlook. Huang et al. and Yin et al. explore multiple IPS methods for mobile robots. Yet, their discussions are limited to classic methods and do not evaluate emerging technologies like synthetic aperture radar (SAR), real-time sensor fusion, or the implications of privacy-aware AI systems [Reference Yin, Xu, Lu, Chen, Xiong, Shen, Stachniss and Wang21, Reference Huang, Junginger, Liu and Thurow42]. Other articles, including those by Ullah et al. and Solanes & Gracia (2025), emphasize broad themes such as trajectory control and localization theory but provide neither empirical performance comparisons nor application-specific breakdowns in healthcare or public infrastructure [Reference Ullah, Adhikari, Khan, Anwar, Ahmad and Bai12, Reference Solanes and Gracia43]. 
This review fills these critical gaps by (i) unifying non-RF, RF, hybrid, and AI-powered IPS technologies under one framework; (ii) offering comparative performance metrics such as accuracy, latency, cost, and scalability; (iii) highlighting underexplored technologies like SAR, federated learning, and cross-modal sensor fusion; and (iv) mapping their real-world applicability in healthcare, industrial automation, education, and smart cities. By addressing both the technical and application layers, this work delivers a uniquely balanced, forward-thinking roadmap for innovation in mobile robot localization, offering relevance to both early-career researchers and industry professionals. The review is structured into thematic sections that build on each other logically and systematically introduce the IPS technologies applicable to mobile robotics. Section 2 defines and discusses standardized performance metrics and benchmarking tools for evaluating IPS, establishing an objective basis for comparison. Section 3 comprehensively categorizes IPS technologies, starting with non-RF categories such as IMUs, LiDAR, infrared, VLC, and SLAM. Section 4 discusses RF-based IPS methods, namely Wi-Fi, Bluetooth, RFID, and UWB, with accompanying discussion of hybrid systems. Finally, Section 5 identifies major ongoing issues and research gaps, such as accuracy-versus-cost trade-offs, the difficulty of sensor fusion, and privacy.

2. Performance metrics and benchmarking in IPS

Standardized performance metrics are critical for assessing and comparing different IPS technologies. Terms such as “accuracy,” “reliability,” and “efficiency” are frequently referenced in IPS literature; however, they are often left undefined or used inconsistently. This section clarifies the essential metrics used to evaluate IPS performance and describes how they are measured and interpreted in practice. (1) Accuracy is one of the most important metrics in assessing IPS performance. It refers to how closely an estimated position corresponds to the ground truth (actual position). It is generally expressed in meters or centimeters and computed as the Euclidean distance between estimated and true coordinates. In many real-world applications, such as industrial warehouses, sub-meter accuracy is considered sufficient; however, sub-10-centimeter accuracy is often required for medical robots, UAVs, and other precision healthcare tasks. (2) Precision is closely associated with accuracy but concerns the repeatability of position estimates under the same or similar environmental conditions. It is generally measured by the variance or standard deviation of the position estimates. High precision means that the estimates are stable; in static or semi-dynamic environments, even a little drift can compound into significant errors over time [Reference Pascacio, Casteleyn, Torres-Sospedra, Lohan and Nurmi44, Reference Elsanhoury, Makela, Koljonen, Valisuo, Shamsuzzoha, Mantere, Elmusrati and Kuusniemi45]. (3) Reliability is the capability of the system to provide accurate localization consistently over a duration of time and a range of conditions. It is usually expressed as the percentage of operating time during which the system localizes within a set error window (e.g., <50 cm error); for example, a system that stays within the window 80% of the time has a reliability of 80%. (4) Latency (response time) refers to the delay from when a positioning request is made until a valid position estimate is obtained. Latency is expressed in milliseconds; for robotic navigation systems it is usually expected to be <100 ms for safe and efficient real-time operation. Infrared and VLC systems provide lower latency than cloud-based AI-enhanced IPS. (5) Scalability is an important consideration, especially where multi-user support or a much larger coverage area is necessary. It denotes the IPS’s ability to maintain performance as the number of tracked objects or the covered area increases. Scalability can be quantified objectively by monitoring performance degradation under increasing load, or assessed qualitatively by analyzing the system architecture [Reference Sandamini, Maduranga, Tilwari, Yahaya, Qamar, Nguyen and Ibrahim46]. (6) A final salient metric is coverage area: the maximum extent of indoor space in which the IPS can operate reliably, typically expressed in square meters. Coverage area varies widely between IPS technologies: some solutions suit very confined spaces like hospital rooms, while others can service much larger areas such as warehouses, freight terminals, and shopping malls. Many IPS also have additional, technology-specific performance characteristics that are often overlooked in routine evaluations.
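The definitions of accuracy, precision, and reliability above map directly onto simple computations over logged position fixes. A minimal sketch in Python (function names and the 0.5 m error window are illustrative, not taken from any specific IPS toolkit):

```python
import numpy as np

def accuracy_errors(estimates, ground_truth):
    """Euclidean distance between each position estimate and the ground truth."""
    return np.linalg.norm(np.asarray(estimates, float) - np.asarray(ground_truth, float), axis=1)

def precision(estimates):
    """Spread (standard deviation) of repeated estimates around their own mean."""
    pts = np.asarray(estimates, float)
    return float(np.linalg.norm(pts.std(axis=0)))

def reliability(errors, threshold=0.5):
    """Fraction of fixes whose error stays inside the threshold (e.g. <0.5 m)."""
    return float((np.asarray(errors) < threshold).mean())

# Four position fixes against a static ground-truth point at (2.0, 3.0)
est = [(2.1, 3.0), (1.9, 3.1), (2.0, 2.9), (2.6, 3.0)]
errs = accuracy_errors(est, (2.0, 3.0))
rel = reliability(errs)   # 3 of 4 fixes fall within 0.5 m -> 0.75
```

In this toy log the last fix misses the 0.5 m window, so the run scores 75% reliability even though its typical error is around 10 cm, which is exactly why accuracy and reliability must be reported separately.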
Researchers and practitioners can assess the performance of indoor positioning systems consistently and comparably using a selection of benchmarks and datasets. These include the EvAAL framework (Evaluation of Ambient Assisted Living systems), which provides benchmark test environments; UJIIndoorLoc, a dataset widely used for Wi-Fi fingerprinting; and several room-scale datasets for spatial awareness that are popular for evaluating camera-based systems. In robotics IPS, SLAMBench (benchmarking for simultaneous localization and mapping) and ROS-based (Robot Operating System) mapping platforms are becoming standard benchmarks for evaluating real-time performance, accuracy, and energy consumption in scale-configurable environments [Reference Li, Tang, Kim and Smith47]. By defining and framing the performance metrics of accuracy, precision, reliability, latency, scalability, energy consumption, and coverage, this section conveys the technical complexity of evaluating IPS. Researchers and practitioners can thus weigh and select positioning systems relevant to their environments and application-specific situations. A summary of these performance metrics is provided in Table I.

3. Classification of indoor positioning systems

This section classifies IPS into non-radio-frequency, radio frequency, and hybrid systems. This classification highlights the technological diversity in IPS and the varied capabilities and limitations of each family. The section compares LiDAR to visual SLAM and Wi-Fi to UWB, contrasting their performance across different environments. Furthermore, it elaborates on hybrid systems that combine several technologies into one, gaining accuracy and efficiency, and shows how and where they work in real applications. This extensive classification provides a foundation for understanding the complex landscape and prospects of IPS technologies.

3.1. Non-radio frequency methods

This section focuses on non-radio-frequency methods, technologies that do not rely on radio signals for indoor positioning. Such methods occupy a conspicuous role in scenarios whereby the radio frequency-based systems experience difficulties – either due to interference or regulation restrictions.

3.1.1. Inertial measurement units (IMU)

An IMU lets a robot sense its own motion, much as a blindfolded person feels their way around a room. IMUs are electronic devices fitted within robots that can sense and interpret motion without external help, such as signals from radio waves or light. An IMU comprises two main components: an accelerometer, which measures acceleration, such as how fast the robot speeds up going forward or backward, and a gyroscope, which senses whether the robot tilts or turns. Some units also include a magnetometer, which acts as a compass to determine directional orientation [Reference Samatas and Pachidis48, Reference Brossard, Barrau, Bonnabel and Dead-Reckoning49]. Together, these provide the information robots need to navigate indoor spaces. The principle is simple to understand: an accelerometer senses whether a robot is moving forward, backward, upward, or downward, much like the feeling a person gets from the acceleration and deceleration of a vehicle [Reference Guo, Yang, Wu, Dong, Wu and Li50, Reference Hurwitz, Cohen and Klein51].
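The dead-reckoning idea behind an IMU can be sketched as a double numerical integration of acceleration into position. A simplified one-dimensional illustration (real systems also subtract gravity, correct sensor bias, and use the gyroscope for orientation, all omitted here):

```python
def dead_reckon_1d(accels, dt, v0=0.0, x0=0.0):
    """Integrate accelerometer samples twice: acceleration -> velocity -> position."""
    v, x = v0, x0
    positions = []
    for a in accels:
        v += a * dt      # first integration: velocity
        x += v * dt      # second integration: position
        positions.append(x)
    return positions

# Constant 1 m/s^2 forward acceleration sampled at 10 Hz for one second
track = dead_reckon_1d([1.0] * 10, dt=0.1)   # final position ~0.55 m
```

Even this noise-free example overshoots the analytic position (0.55 m versus the exact 0.5 m for constant 1 m/s² over 1 s) because of the coarse integration step, hinting at why uncorrected IMU drift compounds so quickly in practice.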

Figure 1. (A) The development of infrastructure-less navigation for healthcare logistics, taken from ref. [Reference Ramdani, Panayides, Karamousadakis, Mellado, Lopez, Christophorou, Rebiai, Blouin, Vellidou and Koutsouris52], with the permission of IEEE. (B) Binocular vision and IMU-based system for GPS-denied environments, taken from ref. [Reference Cheng, Dai, Peng and Nong53], Copyright Sage Publication. (C) Indoor mobile robots for navigation positioning, replicated from ref. [Reference Yan, Guo, Yu, Xu, Cheng and Jiang54], Copyright Sage Publication. (D) IMU system-based indoor robots for infrastructure-independent localization, taken from ref. [Reference Ibrahim and Moselhi55], Copyright Elsevier. (E) Low- and medium-cost IMUs for automated guided vehicles for cost-effective navigation in industrial applications, taken from ref. [Reference Cramer, Cramer, de Schepper, Aerts, Kellens and Demeester56], Copyright Elsevier. (F) IMU-based system for trajectories in GPS-denied environments, taken from ref. [Reference Cole, Bozkurt and Lobaton57], Copyright MDPI.

By combining sophisticated perception capabilities, IMUs are changing the playing field for indoor navigation by tackling the complex technical challenges of localization. The ENDORSE project by Ramdani et al. is a prime example (Figure 1A): it harnesses the accuracy of SLAM and fuses wireless sensors to achieve infrastructure-less robotic navigation in hospital scenarios, using a dynamic modular architecture built on HLAA-compliant cloud infrastructure to carry out modular, hybrid tasks such as UV sanitization or diagnostics [Reference Ramdani, Panayides, Karamousadakis, Mellado, Lopez, Christophorou, Rebiai, Blouin, Vellidou and Koutsouris52]. Building on this, Cheng et al. (Figure 1B) proposed a system combining binocular vision with IMU data, where an asynchronous Kalman filter fuses visual corner detection with inertial data to reduce drift, ensuring high-precision navigation even in low-texture environments [Reference Cheng, Dai, Peng and Nong53]. This evolution continues with Yan et al. (Figure 1C), who integrated LiDAR and IMU data using Kalman filtering to enhance positioning accuracy in dynamic or occluded spaces [Reference Yan, Guo, Yu, Xu, Cheng and Jiang54]. Shifting focus to rugged construction sites, Ibrahim et al. (Figure 1D) introduced a jerk-based IMU localization approach that uses triple jerk integration and barometric sensors for precise, infrastructure-free tracking [Reference Ibrahim and Moselhi55]. Cramer et al. (Figure 1E) benchmarked low-cost IMUs for AGVs in scalable industrial applications, demonstrating performance comparable to premium IMUs [Reference Cramer, Cramer, de Schepper, Aerts, Kellens and Demeester56]. The use of low-cost IMUs is now being extended to more unconventional applications: Cole et al. (Figure 1F) applied IMUs to biobotic insects in disaster robotics, with accurate path reconstruction performed via machine learning. Together, these cases illustrate the evolution and versatility of IMUs in contemporary navigation systems [Reference Cole, Bozkurt and Lobaton57].
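Several of the systems above rely on Kalman filtering to fuse drift-prone inertial predictions with an absolute position source. A minimal scalar update step illustrates the principle (the variances and measurements below are invented for illustration; the cited works use full multi-state filters):

```python
def kalman_update(x_pred, p_pred, z, r):
    """One scalar Kalman correction: blend a predicted position (variance
    p_pred) with a measurement z (variance r). The gain k weights whichever
    source is currently more trustworthy."""
    k = p_pred / (p_pred + r)          # Kalman gain
    x_new = x_pred + k * (z - x_pred)  # corrected state estimate
    p_new = (1.0 - k) * p_pred         # uncertainty shrinks after the update
    return x_new, p_new

# IMU dead reckoning predicts 5.0 m (variance 0.4); a beacon reads 5.6 m (variance 0.1)
x, p = kalman_update(5.0, 0.4, 5.6, 0.1)   # gain 0.8 -> x ~5.48 m, p ~0.08
```

Because the inertial prediction here is four times noisier than the measurement, the gain of 0.8 pulls the estimate most of the way toward the beacon reading, which is exactly how these hybrid systems suppress accumulated IMU drift.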

From healthcare to industrial automation and disaster response, these case studies underscore the transformative potential of IMUs when integrated with complementary technologies. Together, they vividly picture how precise indoor navigation systems reshape diverse industries. To consolidate the insights gained from the diverse applications of IMUs across various domains, Table II provides a comprehensive comparative analysis of the discussed case studies. This table captures each study’s unique contributions, strengths, limitations, and overarching trends, offering a clear perspective on the evolution and versatility of IMU-based systems. The table complements the detailed narratives by summarizing the technical nuances and practical applications, ensuring a holistic understanding of these innovative approaches.

Table II. Comparative analysis of IMU-based localization and application case studies.

3.1.2. Visible light communication (VLC)

VLC holds transformative potential for indoor mobile robotics, providing reliable, high-speed communication and precise positioning, both essential for autonomous navigation and operation. VLC uses LED lights as transmitters, modulating their intensity to encode data, which is received by robots equipped with photodetectors or image sensors. This dual functionality of LEDs for illumination and communication makes VLC an energy-efficient and cost-effective solution for enhancing the capabilities of indoor robotics [Reference Luo, Fan and Li64]. VLC is becoming a very powerful technology for high-precision indoor positioning, especially where traditional RF systems are limited. Li et al. (Figure 2A) designed a VLC system using smart LED lamps with Bluetooth controls and LED-ID algorithms, achieving centimeter-level (sub-2.14 cm) accuracy while supporting robot speeds of up to 20 km/h, making it well suited to fast-paced, dynamic indoor environments like office spaces [Reference Li, Yan, Huang, Chen and Liu65]. Guan et al. (Figure 2B) connected VLC to the Robot Operating System (ROS) on TurtleBot3 robots, combining video tracking and double-lamp positioning with an enhanced Camshift-Kalman algorithm; the system resolved position to an accuracy of 1 cm, processing an update every 0.4 s, demonstrating the feasibility of integrating VLC with ROS for more sophisticated robots [Reference Guan, Chen, Wen, Tan, Song and Hou66]. Machine learning is the next area of optimization. Tran and Ha [Reference Tran and Ha67] (Figure 2C) achieved a 78.26% reduction in processing time and a 52.55% increase in accuracy using noise reduction and dual-function machine learning algorithms. In a related study with partially different machine learning methods, Guo et al. introduced a two-layer fusion network that further improves localization by integrating multiple fingerprints and classifiers, even under LED power variations [Reference Guo, Hu, Elikplim and Li68]. Turning to practice in the health sector, Murai et al. (Figure 2D) outfitted the HOSPI robot with VLC-based LED mapping to support safe hospital navigation and avoid pathway hazards such as stairs [Reference Murai, Sakai, Kawano, Matsukawa, Kitano, Honda and Campbell69]. For nuclear power plants, Xie et al. (Figure 2E) present a VLC navigation system using radiation-shielded LEDs and a dispersion-calibrated algorithm, achieving accuracy within a few centimeters in high-radiation environments [Reference Xie, Huang and Wu70]. These case studies illustrate the breadth of applications and advantages of VLC in indoor robotic systems, from precise navigation to robust communication in challenging environments. These conclusions are supplemented by a tabular comparative analysis consolidating the findings and the technological contributions that can be derived from them.
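RSS-based VLC positioning of the kind used in these systems typically inverts a Lambertian channel model to turn received optical power into a distance estimate. A simplified sketch under strong assumptions (vertical emitter and receiver axes, known LED power and photodiode area; all numbers are illustrative):

```python
import math

def led_distance(p_rx, p_tx, area, m=1, h=2.5):
    """Invert a simplified Lambertian channel model to estimate the distance
    to a ceiling LED. Assumes emitter and receiver axes are both vertical, so
    cos(irradiance angle) = cos(incidence angle) = h / d:
        P_rx = P_tx * (m+1) * area / (2*pi*d^2) * (h/d)^(m+1)
    """
    num = p_tx * (m + 1) * area * h ** (m + 1)
    return (num / (2 * math.pi * p_rx)) ** (1.0 / (m + 3))

# Round trip: compute the power received from an LED 3 m away, then recover the distance
d_true, h, m, area, p_tx = 3.0, 2.5, 1, 1e-4, 1.0
p_rx = p_tx * (m + 1) * area / (2 * math.pi * d_true**2) * (h / d_true) ** (m + 1)
d_est = led_distance(p_rx, p_tx, area, m, h)   # ~3.0 m
```

Distances recovered this way from two or more LEDs with known ceiling coordinates can then be intersected to yield a position fix, which is the geometric core of the double-lamp and LED-ID schemes cited above.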

Figure 2. (A) VLP system for mobile robots for dynamic indoor environments, taken from ref. [Reference Li, Yan, Huang, Chen and Liu65], Copyright Hindawi. (B) VLC-based localization system for indoor navigation, taken from ref. [Reference Guan, Chen, Wen, Tan, Song and Hou66], Copyright arXiv. (C) Two-layer fusion network spanning industrial automation and smart buildings, taken from ref. [Reference Guo, Hu, Elikplim and Li68], Copyright IEEE. (D) VLC-based autonomous delivery robot to improve hospital safety and navigation, taken from ref. [Reference Murai, Sakai, Kawano, Matsukawa, Kitano, Honda and Campbell69], Copyright IEEE. (E) VLC-based positioning system for mobile robots in nuclear power plants, taken from ref. [Reference Xie, Huang and Wu70], Copyright arXiv.

Moving beyond the experimental setups and design achievements of these case studies, Table III consolidates the contributions of VLC across its varied fields of application.

Table III. Comprehensive analysis of VLC-based indoor robotics systems.

3.2. Non-radio frequency methods

Infrared systems have found extensive application in indoor robotics for communication, navigation, and detection of obstacles. They are based on invisible infrared light waves, generated from IR LEDs or similar equipment, and received by photodiodes or infrared cameras. The IR emitter, therefore, sends some light pulses, modulated to carry information regarding the distance of objects, their positions, or command instructions. The receiver detects the pulses and translates them into electrical signals, which are then processed to decode the transmitted data [Reference Kim Geok, Zar Aung, Sandar Aung, Thu Soe, Abdaziz, Pao Liew, Hossain, Tso and Yong28]. In navigation, IR systems often rely on triangulation to determine the robot’s position. By measuring the time it takes for infrared signals from multiple emitters to reach the robot, its location can be calculated with high precision. For obstacle detection, IR sensors emit light and measure the time it takes for the reflected signal to return or the strength of the reflection, helping the robot estimate the distance to nearby objects and avoid collisions. Additionally, infrared communication allows robots to exchange data in environments where radio frequencies might cause interference. The working principle of IR systems highlights their utility in indoor robotics for precise navigation, obstacle detection, and secure communication. These fundamental capabilities form the basis for various innovative applications across diverse environments, from small-scale setups to large, dynamic spaces [Reference Lanza, Carriero, Buijs, Mortellaro, Pizzi, Sciacqua, Biondetti, Angileri, Ianniello, Ierardi and Carrafiello75, Reference Fu, Corradi, Menciassi and Dario76].
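As a minimal, hypothetical sketch of the reflected time-of-flight principle described above (the function name and numbers are illustrative, not drawn from any cited system), the round-trip travel time of an IR pulse maps directly to obstacle distance:

```python
# Illustrative sketch: obstacle distance from a reflected IR pulse's
# round-trip time-of-flight (values are hypothetical).
C = 299_792_458.0  # speed of light, m/s

def obstacle_distance(round_trip_s: float) -> float:
    """Distance to a reflecting obstacle: the pulse travels out and
    back, so the one-way distance is half the total path length."""
    return C * round_trip_s / 2.0
```

A 20 ns round trip, for example, corresponds to an obstacle roughly 3 m away; low-cost IR rangefinders often infer distance from reflection intensity instead, since such short intervals are difficult to time cheaply.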

For instance, Raharijaona et al. developed a minimalistic indoor localization system using flickering infrared LEDs and bio-inspired sensors. By utilizing amplitude-modulated infrared signals, the system achieves azimuth and elevation angle estimation with an accuracy of 2 cm at a 2 m range and a sampling frequency of 100 Hz. The compact design (10 cm³ in size, weighing 6 g, and consuming just 0.4 W) supports low-cost, energy-efficient operation, as shown in Figure 3 (A). The sensor demonstrated robustness to diverse lighting conditions, including darkness and flickering light, making it suitable for GPS-denied environments like indoor robotic applications. Its Arduino-compatible demodulator further emphasizes its accessibility and practical use in trajectory tracking [Reference Raharijaona, Mawonou, Nguyen, Colonnier, Boyron, Diperi and Viollet77]. Building on the theme of dynamic indoor positioning, Awad et al. introduced a collaborative approach to localize access points (APs) using a swarm of autonomous robots. By collecting non-uniformly distributed RSSI samples, the system efficiently estimates AP locations without prior knowledge of the environment. Tests confirmed its precision and reduced reliance on manual labor, demonstrating scalability and robustness for complex indoor settings. This solution provides a cost-effective way to address issues like rogue APs in wireless networks, as shown in Figure 3 (B) [Reference Awad, Naserllah, Omar, Abu-Hantash and Al-Taj78]. Extending to industrial environments, Cretu-Sîrcu et al. compared ultrasonic (GoT) and UWB technologies for indoor localization (as shown in Figure 3 (C)). Static tests showed localization errors of 0.3–0.6 m, while dynamic tests with a robot moving at 0.5 m/s revealed GoT’s superior accuracy of 0.1–0.2 m, compared to Pozyx’s 0.3–0.4 m.
Although UWB excelled in mixed LoS/NLoS conditions, GoT was particularly effective for mobile robotics, meeting industrial accuracy requirements [Reference Crețu-Sîrcu, Schiøler, Cederholm, Sîrcu, Schjørring, Larrad, Berardinelli and Madsen79]. Finally, Qi and Liu presented a high-accuracy ultrasonic indoor positioning system (UIPS) based on wireless sensor networks. Using time-of-flight measurements and synchronized ultrasonic beacons, the system achieved a maximum localization error of 10.2 mm and a precision of 0.61 mm under line-of-sight conditions (as shown in Figure 3 (D)). Its cost-effective, robust design ensures suitability for dynamic, cluttered spaces, making it ideal for industrial and healthcare applications requiring high precision [Reference Qi and Liu80]. These studies illustrate the versatility and advancements in indoor localization technologies across varied applications and environments. These advancements in indoor localization demonstrate the growing diversity of techniques and technologies tailored to meet specific application needs, from robotics and industrial automation to healthcare and public spaces.
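The beacon-based positioning principle underlying such ultrasonic systems can be sketched in a few lines: with known distances to three fixed beacons, subtracting the circle equations pairwise yields a small linear system for the receiver's 2D position. This is a generic illustration under idealized, noise-free assumptions, not the UIPS algorithm itself:

```python
# Generic 2D trilateration sketch (idealized, noise-free distances).
def trilaterate(b0, b1, b2, d0, d1, d2):
    """b0..b2: (x, y) beacon positions; d0..d2: measured distances.
    Subtracting the circle equations pairwise gives two linear
    equations, solved here with Cramer's rule."""
    (x0, y0), (x1, y1), (x2, y2) = b0, b1, b2
    a1, b1c = 2 * (x1 - x0), 2 * (y1 - y0)
    c1 = d0**2 - d1**2 + x1**2 - x0**2 + y1**2 - y0**2
    a2, b2c = 2 * (x2 - x0), 2 * (y2 - y0)
    c2 = d0**2 - d2**2 + x2**2 - x0**2 + y2**2 - y0**2
    det = a1 * b2c - a2 * b1c  # zero if the beacons are collinear
    return ((c1 * b2c - c2 * b1c) / det, (a1 * c2 - a2 * c1) / det)
```

In a real ultrasonic deployment the distances would come from time-of-flight multiplied by the speed of sound (about 343 m/s at room temperature), and an over-determined least-squares fit over more than three beacons would absorb measurement noise.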

Figure 3. (A) Indoor localization system using flickering infrared LEDs and bio-inspired sensors suitable for GPS-denied environments like indoor robotic applications, taken from ref. [Reference Raharijaona, Mawonou, Nguyen, Colonnier, Boyron, Diperi and Viollet77], Copyright MDPI. (B) Swarm of autonomous robots for complex indoor settings, taken from ref. [Reference Awad, Naserllah, Omar, Abu-Hantash and Al-Taj78], Copyright MDPI. (C) Mobile robotics based on ultrasonic and UWB technologies for indoor localization, taken from ref. [Reference Crețu-Sîrcu, Schiøler, Cederholm, Sîrcu, Schjørring, Larrad, Berardinelli and Madsen79], copyright MDPI. (D) High-accuracy ultrasonic indoor positioning system (UIPS) based on wireless sensor networks, taken from ref. [Reference Qi and Liu80], copyright MDPI.

A detailed comparative analysis of selected IR-based indoor localization systems is provided in Table IV below. This comparison highlights the focus, strengths, limitations, key techniques, and applications of each system, offering insights into overarching trends shaping the development of these innovative solutions.

Table IV. Comparative study of indoor localization systems based on IR.

3.2.1. Light detection and ranging (LiDAR)

LiDAR technology uses laser light to measure distances and create detailed maps of indoor environments. It emits laser pulses that reflect off objects or surfaces and measures the time taken for the light to return. The measured time-of-flight data are then used to compute the distance to the object, helping the robot build a picture of its surroundings. LiDAR systems sweep or rotate across a broad area to collect millions of data points, which are integrated to create a 2D or 3D map of the environment. Within indoor robotics, LiDAR is paramount for navigation, obstacle detection, and mapping [Reference Sun, Zhao, Hu, Gao and Yu89]. Robots with LiDAR can precisely identify walls, furniture, and other objects, allowing them to move safely and plan efficient paths in dynamic environments. For example, a delivery robot in a hospital could use LiDAR to navigate through crowded hallways and avoid obstacles, such as people or carts. Furthermore, LiDAR supports SLAM algorithms that enable robots to build and update maps dynamically while keeping track of where they are within those maps [Reference Qi, Wang, Liao, Zhang, Yang and Wei90].
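The sweep-and-integrate process just described can be illustrated with a short, hypothetical sketch (not taken from any specific LiDAR driver): each sample in a rotation pairs a beam bearing with a measured range, which converts directly into a Cartesian point in the map frame:

```python
import math

# Illustrative sketch: converting one LiDAR sweep into 2D map points.
def sweep_to_points(robot_xy, samples):
    """robot_xy: robot position in the map frame.
    samples: iterable of (bearing_rad, range_m) pairs from one rotation."""
    rx, ry = robot_xy
    # Polar (bearing, range) -> Cartesian (x, y), offset by the robot pose.
    return [(rx + r * math.cos(a), ry + r * math.sin(a)) for a, r in samples]
```

A real scan contributes thousands of such points per rotation; stacking sweeps from successive robot poses is what yields the 2D occupancy map, and adding beam elevation extends the same arithmetic to 3D.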

LiDAR technology has become foundational in indoor robotics due to its precision, low-light operability, and capability to navigate complex layouts. Despite limitations such as poor performance on reflective or transparent surfaces and high costs, LiDAR remains crucial for accurate mapping and autonomous indoor navigation. A key example is the real-time LiDAR-based SLAM system developed by Zhang et al. (Figure 4A), which utilizes scan-to-map matching and adaptive loop closure to enhance mapping consistency and reduce drift. Its integrated probabilistic data association ensures reliable localization even in dynamic environments [Reference Jiang, Wang, Yi, Zhang and Lv91]. Building on this, Wang et al. (Figure 4B) introduced a solution for improving LiDAR-based feature extraction using a weighted parallel ICP algorithm, which increases convergence speed and robustness, especially in structured indoor environments [Reference Wang, Peng, Ravankar and Ravankar92]. Building on previous multi-sensor systems, Yilmaz and Temeltas (Figure 4C) created Self-Adaptive Monte Carlo Localization for smart AGVs incorporating 2D/3D LiDARs; their ellipse-based energy model reduces sensitivity to asymmetrical sensor placement in industrial factories [Reference Yilmaz and Temeltas93]. Liu et al. (Figure 4D) further enhanced LiDAR localization by fusing data from IMU, odometry, and 3D LiDAR through an Extended Kalman Filter and PL-ICP, delivering accurate localization without GNSS [Reference Liu, Wang, Wu, Wei, Ren and Zhao94]. For UAVs, Kumar et al. (Figure 4E) integrated horizontally and vertically mounted LiDARs with IMUs to achieve 3D indoor navigation, which is useful in confined, dynamic environments like pipelines and disaster zones [Reference Kumar, Patil, Patil, Park and Chai95]. Lastly, Li et al. [Reference Li, Guan, Gao, Du, Wu, Guang and Cong24] created a hybrid indoor-outdoor navigation framework combining GNSS, INS, and LiDAR.
This system seamlessly transitions between navigation environments with Hector SLAM and Kalman filtering. These studies collectively showcase LiDAR’s adaptability and essential role in enabling robust, accurate, and context-sensitive indoor localization solutions across robotics and autonomous systems.

Figure 4. (A) LiDAR-based SLAM system for autonomous robots, taken from ref. [Reference Jiang, Wang, Yi, Zhang and Lv91], copyright Frontiers. (B) LiDAR-based robust pose estimation in clean and perturbed environments, taken from ref. [Reference Wang, Peng, Ravankar and Ravankar92], copyright MDPI. (C) Self-adaptive Monte Carlo Localization algorithm tailored for smart automated guided vehicle position tracking and kidnapping scenarios, taken from ref. [Reference Yilmaz and Temeltas93], copyright Elsevier. (D) LiDAR localization method leveraging multi-sensing data from IMU, odometry, and 3D LiDAR for complex indoor spaces, taken from ref. [Reference Liu, Wang, Wu, Wei, Ren and Zhao94], copyright MDPI. (E) LiDAR and IMU integration for UAV indoor navigation, taken from ref. [Reference Kumar, Patil, Patil, Park and Chai95], copyright MDPI.

Such case studies illustrate the wide-ranging versatility of LiDAR technology in solving many localization and navigation problems. From single-sensor performance to multi-sensor integration, and from land-based robots to UAVs, LiDAR’s versatility highlights its importance in advancing autonomous robotics. Each study builds on the last, showing a progressive refinement of techniques to improve reliability, accuracy, and computational efficiency in indoor robotics applications. Table V consolidates the strengths, weaknesses, and main trends of these studies.

Table V. LiDAR research overview.

While LiDAR, VLC, and IR systems have demonstrated significant utility in indoor positioning, their suitability varies based on deployment needs. LiDAR offers centimeter-level precision and excels in 3D mapping, but its high hardware cost and computational demands limit its scalability in low-cost applications. In contrast, VLC systems provide high localization accuracy and dual use for lighting and communication. Still, they are susceptible to ambient lighting and require line-of-sight, making them less robust in dynamic environments. IR systems are cost-effective and energy-efficient, with moderate accuracy, but suffer from limited range and poor performance in environments with signal occlusion or thermal interference. Table VI presents a side-by-side comparison of these technologies using core evaluation metrics relevant to indoor robotic navigation.

3.2.2. Visual simultaneous localization and mapping (SLAM)

Visual Simultaneous Localization and Mapping (SLAM) is a technology that enables robots to build a map of their surroundings while simultaneously determining their location within that map. It relies on visual data captured by cameras, such as monocular, stereo, or RGB-D cameras, to extract environmental information. The process involves detecting and tracking key features, such as edges, corners, or textures, in consecutive frames of the camera feed. The robot’s camera captures images as it moves through the environment. Key features from these images are identified and matched across frames to estimate the robot’s movement and orientation (pose). Using these pose estimations, the robot continuously updates its position and integrates new observations into the map. Advanced algorithms, like Bundle Adjustment and Loop Closure Detection, refine the map to reduce errors caused by drift or repeated patterns [Reference Panigrahi and Bisoy41Reference Wang, Wang, Li, Ho, Cheng, Yan, Meng and Meng104].
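The pose-update step described above can be sketched as repeated composition of frame-to-frame motion estimates, here in SE(2) for brevity (a simplified, hypothetical illustration; real Visual SLAM estimates full 3D poses and refines them with Bundle Adjustment and loop closure):

```python
import math

# Simplified sketch: chaining frame-to-frame motion estimates into a
# global SE(2) pose, as a feature-tracking front end would.
def compose(pose, delta):
    """pose = (x, y, theta) in the map frame;
    delta = (dx, dy, dtheta) estimated in the robot's local frame."""
    x, y, th = pose
    dx, dy, dth = delta
    # Rotate the local motion into the map frame, then translate and turn.
    return (x + dx * math.cos(th) - dy * math.sin(th),
            y + dx * math.sin(th) + dy * math.cos(th),
            th + dth)

pose = (0.0, 0.0, 0.0)
for step in [(1.0, 0.0, math.pi / 2)] * 4:  # drive a 1 m square
    pose = compose(pose, step)
```

After four quarter-turns the pose returns to the start (up to numeric error); with noisy per-frame estimates it would not, and that accumulated drift is precisely what Loop Closure Detection corrects.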

Visual SLAM has become a cornerstone of autonomous indoor navigation, enabling robots to map and traverse unfamiliar environments. Its evolution through multi-sensor integration and advanced algorithms has led to diverse real-world applications. Roy et al. (Figure 5A) [Reference Roy, Tu, Sheu, Chieng, Tang and Ismail100] presented an exploration-based SLAM (e-SLAM) framework using only LiDAR sensors, covering mapping, localization, and path planning with a generalized Voronoi algorithm, and achieving robust, effective navigation with minimal hardware. Beyond navigation, SLAM is also being applied to environmental sensing: Yang et al. [Reference Yang, Liu, Wang, Cao and Li105] (Figure 5B) deployed a SLAM-equipped robot to monitor indoor CO2 levels, combining spatial mapping with temporal sensing to achieve a source detection accuracy of 1.83 m. This approach is more economical and flexible than static sensors for environmental monitoring. For visually sparse and repetitive environments, Chen (Figure 5C) [Reference Chen, Zhu, Wang and Liu106] proposed STCM-SLAM, fusing stereo vision and IMU data. By leveraging forward-backward optical flow and nonlinear optimization, this system outperformed ORB-SLAM2 and OKVIS in trajectory accuracy, proving effective in complex, low-texture settings. Singh et al. (Figure 5D) introduced a socially aware SLAM using adaptive neural networks for human-robot interaction. Tested at Chandigarh University, the system respected social norms and reduced the number of mapping iterations to support safe navigation around humans [Reference Singh, Kapoor, Thakur, Sharma, Nayyar, Mahajan and Shah107]. Finally, Wang et al. (Figure 5E) focused on 3D navigation in uneven terrains using RGB-D cameras and an enhanced RRT algorithm.
Their OctoMap-based framework distinguished between slopes and staircases, ensuring safe movement through cluttered and physically complex indoor spaces [Reference Wang, Wang, Li, Ho, Cheng, Yan, Meng and Meng104]. These case studies highlight Visual SLAM’s adaptability from precise mapping and environmental sensing to socially intelligent and terrain-aware navigation, showcasing its transformative role in indoor robotics. Table VII provides a detailed comparative summary of various implementations of SLAM, outlining their domain of focus, strengths, shortcomings, techniques used, applications, and trends.

Figure 5. (A) SLAM framework that relies exclusively on LiDAR sensors for indoor mobile robot navigation, taken from ref. [Reference Roy, Tu, Sheu, Chieng, Tang and Ismail100], copyright MDPI. (B) Indoor environmental monitoring, taken from ref. [Reference Yang, Liu, Wang, Cao and Li105], copyright Elsevier. (C) STCM-SLAM for precise pose estimation, taken from ref. [Reference Chen, Zhu, Wang and Liu106], copyright IEEE. (D) SLAM-based navigation systems for environments populated with humans, taken from ref. [Reference Singh, Kapoor, Thakur, Sharma, Nayyar, Mahajan and Shah107], copyright IEEE. (E) SLAM-based 3D OctoMap navigation system for complex 3D environments, taken from ref. [Reference Wang, Wang, Li, Ho, Cheng, Yan, Meng and Meng104], copyright MDPI.

Table VII. Summarizing key details like focus, strengths, limitations, key techniques, applications, and overarching trends.

3.2.3. Comparative performance across scenarios

The positioning technologies reviewed above have evolved into diverse systems, each suited to particular environmental conditions and applications. IMU, VLC, infrared systems, LiDAR, and Visual SLAM each offer distinct advantages and limitations depending on the scenario. Before selecting the appropriate solution for a specific use case, every technology must be assessed against positioning accuracy, cost-effectiveness, robustness, and scalability. Table VIII below provides a detailed comparison of these technologies, contrasting their strengths and weaknesses in different settings, and illustrates the importance of environmental conditions, application needs, and system integration in ensuring the effectiveness of each positioning methodology.

4. Radio frequency methods

RF methods have become essential for enabling wireless communication between robots and effective navigation. RF signals from Wi-Fi, Bluetooth, and RFID help robots locate objects, map environments, and maintain connections in real time. Such methods have proven especially valuable where traditional sensors, such as cameras or LiDAR, cannot be applied. Using RF technology, robots can operate seamlessly in complex indoor environments [Reference Kim Geok, Zar Aung, Sandar Aung, Thu Soe, Abdaziz, Pao Liew, Hossain, Tso and Yong28].

4.1. Wi-Fi-based indoor mobile robots

Wi-Fi-based indoor mobile robots depend on wireless network signals for navigating, localizing, and communicating inside indoor spaces. The robots use Wi-Fi signal triangulation or fingerprinting to anchor their position [Reference He and Chan29]. To determine its position, the robot integrates the Received Signal Strength Indicator (RSSI) of several Wi-Fi access points in the building. By matching these measurements against a radio map of the environment built during calibration, the robot can estimate its actual position with reasonable accuracy. Beyond positioning, Wi-Fi plays a vital role in real-time communication: connected to a Wi-Fi network, the robot can send and receive status updates, sensor readings, or instructions. This communication allows operators to control the robot remotely from another device and enables multiple robots to cooperate as a group, sharing information in real time. Wi-Fi also allows connection to the cloud, giving the robot access to advanced computing resources or enabling data sharing for purposes such as complex decision-making or machine learning [Reference Singh, Choe and Punmiya121, Reference Upadhyay, Rawat, Deb, Muresan and Unguresan122].
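A toy version of the fingerprinting step described above (all coordinates and RSSI values are invented for illustration) matches a live RSSI vector to the nearest calibration-time fingerprint:

```python
# Toy fingerprint map: calibration point -> RSSI (dBm) per access point.
FINGERPRINTS = {
    (0.0, 0.0): [-40, -70, -80],
    (5.0, 0.0): [-65, -45, -75],
    (0.0, 5.0): [-70, -72, -42],
}

def locate(rssi):
    """Return the calibration point whose stored fingerprint is closest
    (in squared Euclidean distance) to the observed RSSI vector."""
    return min(FINGERPRINTS,
               key=lambda p: sum((a - b) ** 2
                                 for a, b in zip(FINGERPRINTS[p], rssi)))
```

Production systems refine this with k-nearest-neighbor averaging, probabilistic models, or the deep networks discussed below, since raw RSSI fluctuates with occupancy and interference.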

Indoor localization has witnessed transformative progress by integrating Wi-Fi, machine learning, and robotics, tackling long-standing accuracy, adaptability, and scalability challenges. Shu et al. (Figure 6A) proposed a multimodal localization approach combining 3D point cloud data with Wi-Fi fingerprinting to estimate 6-DoF robot poses. By reducing search complexity and mitigating noise in large-scale settings (650+ million points), their fusion method proves highly effective in intricate indoor environments [Reference Shu, Chen and Zhang123]. Furthermore, Ayyalasomayajula et al. (Figure 6B) proposed DLoc, which combines deep learning with MapFind, a self-mapping platform. This combination produces a large-scale labeled dataset for training, improving accuracy and reducing manual input while tolerating multipath errors and sparse maps [Reference Ayyalasomayajula, Arun, Wu, Sharma, Sethi, Vasisht and Bharadia124]. Turning to passive systems, Chan et al. (Figure 6C) created an entropy-optimized passive Wi-Fi localization system using genetic algorithms to determine optimal placement of Wi-Fi sniffers. Their system achieved 2.2-m accuracy while providing a cheaper, device-free option for real-time tracking [Reference Chan, Chao and Wu125]. To improve data collection, Lin et al. (Figure 6D) presented a hybrid deep learning approach using supervised, semi-supervised, and unsupervised learning with robot-collected RSSI data, enabling incremental learning and adaptability in obstacle-rich settings [Reference Lin, Gan, Jiang, Xue and Liang126]. Finally, Kharmeh et al. (Figure 6E) presented a low-cost robotic solution for generating automatic 3D Wi-Fi radio maps.
Combining SLAM and data fusion, this scalable, low-cost architecture automates the collection and mapping of Wi-Fi radio maps, significantly reducing labor and energy use in large-scale deployments [Reference Kharmeh, Natsheh, Sulaiman, Abuabiah and Tarapiah127]. Together, these studies reflect how multimodal integration, AI, and robotic automation reshape indoor localization systems, making them more precise, scalable, and adaptive to dynamic environments. Table IX consolidates these approaches and the trends driving innovation in this field.

Figure 6. (A) A multimodal approach combining 3D point clouds and Wi-Fi signals to achieve pose estimation for mobile robots, taken from ref. [Reference Shu, Chen and Zhang123], copyright IEEE. (B) Deep learning-based system that pairs neural networks with MapFind, an autonomous mapping platform, taken from ref. [Reference Ayyalasomayajula, Arun, Wu, Sharma, Sethi, Vasisht and Bharadia124]. (C) Wi-Fi-based indoor positioning system, taken from ref. [Reference Chan, Chao and Wu125], copyright MDPI. (D) Wi-Fi RSSI-based indoor robots for obstacle-rich environments, taken from ref. [Reference Lin, Gan, Jiang, Xue and Liang126], copyright MDPI. (E) 3D Wi-Fi localization using low-cost robots for large-scale deployments, taken from ref. [Reference Kharmeh, Natsheh, Sulaiman, Abuabiah and Tarapiah127], copyright MDPI.

Table IX. Comparative analysis of Wi-Fi-based indoor localization techniques.

4.2. Radio frequency identification (RFID)-based indoor mobile robots

RFID-based indoor mobile robots navigate and perform tasks within specific indoor environments using Radio Frequency Identification (RFID) technology. In this system, RFID tags are placed at critical locations as markers or waypoints containing unique identification information for the robot to read with an RFID reader. The robot scans these tags while moving to determine where it is and verify that it is going to the right place. The robot’s RFID reader emits signals to detect nearby tags, and the information is processed alongside other sensors, such as cameras or ultrasonics, to avoid obstacles and issue commands. This method efficiently tracks the robot’s position and guides its movement without GPS, suiting indoor applications like warehouses, hospitals, and offices [Reference Motroni, Buffi and Nepa138, Reference Oguntala, Abd-Alhameed, Jones, Noras, Patwary and Rodriguez139]. Building on the advancements in indoor navigation, Demiral et al. [Reference Álvarez-Aparicio, Guerrero-Higueras, Rodríguez-Lera, Clavero, Rico and Matellán140] presented a modular RFID-guided robot prototype for structured environments. Using strategically placed RFID tags and auxiliary sensors such as gyros and ultrasonic detectors, the system enables autonomous pathfinding through shortest-path algorithms, as shown in Figure 7 (A). This practical, cost-effective solution sets a foundation for more advanced navigation systems in emergency and service applications. Extending these principles, Wu et al. [Reference Wu, Tao, Gong, Yin and Ding141] developed a standalone RFID-based navigation method using phase-difference modeling (shown in Figure 7 (B)). This innovative approach eliminates the need for additional sensors or reference tags, achieving precise localization with a distance accuracy of 4.04 cm, showcasing RFID’s potential for unstructured navigation.
Taking precision navigation further, Kammel et al. [Reference Kammel, Kogel, Gareis and Vossiek142] introduced a hybrid system that integrates UHF RFID and odometry for centimeter-level localization, shown in Figure 7 (C). This system proves its robustness in warehouse environments by addressing odometry drift and multipath interference through iterative Kalman filtering. Building on this, Shangguan and Jamieson [Reference Shangguan and Jamieson143], tackled sorting closely spaced RFID-tagged items in dense environments. Their MobiTagbot system leverages synthetic aperture radar techniques to achieve nearly 100% accuracy, making it a breakthrough for libraries and supply chains, as shown in Figure 7 (D). Similarly, DiGiampaolo and Martinelli [Reference Digiampaolo and Martinelli144], focused on robotic localization in shelves (shown in Figure 7 (E)), combining odometry and RFID signal analysis to achieve high precision (∼10 cm error) in cluttered scenarios like metallic storage racks. Beyond single-robot applications, cooperative approaches have been explored. Seco and Jiménez [Reference Seco and Jiménez145] proposed a smartphone-based localization system that integrates RFID tags, pedestrian dead reckoning (PDR), and map data, reducing errors from 6.1 m to 1.6 m through collaborative tracking. This complements the low-cost HF RFID system by Mi and Takahashi [Reference Mi and Takahashi146], which optimizes sparse tag placement and achieves millimeter-level accuracy, broadening the use cases for service robots in public facilities. Meanwhile, Ye and Peng [Reference Ye and Peng147] improved WiFi-based fingerprinting for robot navigation by refining grid-based points and adaptive correction, achieving accuracy within 0.4 m for dynamic indoor tasks. Closing the loop, Da Mota et al. [Reference Da Mota, Rocha, Rodrigues, De Albuquerque and De Alexandria148] integrated Petri nets with RFID tags for structured navigation in labyrinth-like spaces, while Kassim et al. 
[Reference A., Yasuno, Suzuki, H. and M.149], extended RFID’s reach to assist visually impaired individuals, combining tactile paving and digital compasses for inclusive indoor mobility. These studies illustrate RFID’s versatility, spanning precision robotics, collaborative systems, and accessible technologies. The diverse applications and innovations in RFID-based indoor robotics demonstrate the adaptability and precision of this technology across various fields. The advancements highlighted above showcase various approaches tailored for specific use cases, from structured navigation to unstructured environments, and single-robot systems to collaborative networks. Table X comprehensively compares the reviewed studies to illustrate these findings further, detailing their focus, strengths, limitations, key techniques, and applications.
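The shortest-path planning over RFID waypoints mentioned in these studies can be illustrated with a standard Dijkstra search over a tag graph (tag names and edge costs below are invented; the cited systems' actual planners may differ):

```python
import heapq

# Illustrative Dijkstra search over a graph of RFID tag waypoints.
def shortest_path(graph, start, goal):
    """graph: {tag: {neighbor_tag: travel_cost}}. Returns (cost, path)."""
    queue = [(0.0, start, [start])]  # priority queue ordered by cost
    visited = set()
    while queue:
        cost, tag, path = heapq.heappop(queue)
        if tag == goal:
            return cost, path
        if tag in visited:
            continue
        visited.add(tag)
        for nxt, step in graph.get(tag, {}).items():
            if nxt not in visited:
                heapq.heappush(queue, (cost + step, nxt, path + [nxt]))
    return float("inf"), []

TAGS = {"A": {"B": 1.0, "C": 4.0},
        "B": {"A": 1.0, "C": 1.5},
        "C": {"A": 4.0, "B": 1.5}}
```

Here the robot would route A -> B -> C (cost 2.5) rather than take the direct A -> C edge (cost 4.0); in practice edge costs come from measured inter-tag distances or traversal times.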

Figure 7. (A) RFID-guided robot prototype for structured environments, taken from ref. [Reference Álvarez-Aparicio, Guerrero-Higueras, Rodríguez-Lera, Clavero, Rico and Matellán140]. (B) RFID-based standalone navigation method, taken from ref. [Reference Wu, Tao, Gong, Yin and Ding141], copyright IEEE. (C) RFID and odometry for centimeter-level localization robustness in warehouse environments, taken from ref. [Reference Kammel, Kogel, Gareis and Vossiek142], copyright IEEE. (D) RFID-tagged items in dense environments, taken from ref. [Reference Shangguan and Jamieson143]. (E) RFID-based indoor robot for detecting items localized in shelves, taken from ref. [Reference Digiampaolo and Martinelli144], copyright IEEE.

Table X. Comparison of RFID-related research papers.

4.3. Ultra-wideband (UWB) and Bluetooth-based indoor mobile robots

UWB and Bluetooth-based indoor mobile robots use advanced wireless technologies to determine their location and navigate within indoor spaces. UWB operates by sending very short radio pulses across a wide frequency range. These pulses travel to multiple fixed anchors in the environment, and the time for the signal to travel to and from them is measured. Using this “time-of-flight” data, the robot can calculate its precise position with high accuracy, often within a few centimeters. On the other hand, Bluetooth technology, particularly Bluetooth Low Energy (BLE), works by detecting signal strength (RSSI) from beacons placed around the area. The robot can estimate its position by analyzing these signal strengths and sometimes combining them with other methods like triangulation [Reference Alarifi, Al-Salman, Alsaleh, Alnafessah, Al-Hadhrami, Al-Ammar and Al-Khalifa151, Reference Gu and Ren152]. Together, these technologies can complement each other. UWB provides high precision, ideal for tasks requiring fine control, while Bluetooth offers cost-effective and energy-efficient positioning for broader navigation. Using these systems, the robot can map its surroundings, avoid obstacles, and move efficiently to complete its tasks in warehouses, hospitals, or smart homes.
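The time-of-flight ranging to fixed anchors described above is commonly realized with two-way ranging, so that tag and anchor need no shared clock; a simplified single-sided version (timing values below are illustrative) looks like:

```python
# Simplified single-sided two-way ranging (TWR) sketch for UWB.
C = 299_792_458.0  # speed of light, m/s

def twr_distance(t_round_s: float, t_reply_s: float) -> float:
    """t_round: initiator's measured poll-to-response round trip;
    t_reply: responder's reported internal turnaround delay.
    Subtracting the turnaround leaves twice the one-way flight time."""
    return C * (t_round_s - t_reply_s) / 2.0
```

Single-sided TWR is sensitive to clock drift over the turnaround interval, which is why practical UWB stacks often use double-sided variants; the robot's position then follows from multilateration over ranges to several anchors.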

Recent developments in indoor localization have demonstrated the effectiveness of combining multiple sensing approaches to alleviate problems of accuracy, robustness, and adaptability. A noteworthy example is the work of Kok et al. [Reference Kok, Hol and Schon153], who presented a tightly coupled UWB-IMU fusion system through a maximum a posteriori (MAP) formulation. Their approach uses a heavy-tailed asymmetric distribution to filter UWB data for outliers, showing improved pose estimation compared to optical tracking, even in non-line-of-sight (NLOS) conditions. Expanding on this, Yao et al. [Reference Yao, Wu, Yao and Liao154] leveraged an Extended Kalman Filter (EKF) to fuse UWB and IMU data, effectively addressing inertial drift and UWB multipath effects. Their system delivered over 100% improvement in accuracy compared to traditional UWB-only approaches, proving the synergy of complementary technologies in real-world lab and simulation environments. Investigating Bluetooth-based solutions, Weinmann and Simske [Reference Weinmann and Simske155] introduced a Bluetooth 5.1 Angle of Arrival (AoA) system for autonomous robots and demonstrated a 0.12-m mean localization accuracy through beacon-based corrections. Its potential use in scenarios such as fire rescue and object retrieval under harsh indoor conditions is promising. Furthering the use of UWB, Juston and Norris [Reference Naheem, Elsharkawy, Koo, Lee and Kim156] developed an ad hoc mesh network localization system for mobile robots. Using UWB and odometry with an unscented Kalman filter, the decentralized setup enabled real-time adaptability and dynamic environment compatibility, allowing mobile agents to self-correct and synchronize locations without fixed infrastructure. In a unique application, Naheem et al.
[Reference Naheem, Elsharkawy, Koo, Lee and Kim156] created a lighter-than-air helium robot with a wearable UWB sensor for user-following and intent detection. The interactive system could successfully track pose in open indoor spaces, providing opportunities for applications in entertainment, guidance, and user awareness. These case studies illustrate the versatility of UWB, IMU, and Bluetooth technologies, especially when used with advanced filtering and control algorithms. They offer a promising path toward scalable, accurate, and adaptive indoor positioning systems, catering to diverse domains from robotics to human-interactive applications.
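The predict-correct structure behind these EKF-style UWB-IMU fusions can be shown in one dimension (the process and measurement noise values here are arbitrary illustrations, not tuned parameters from the cited systems):

```python
# One-dimensional Kalman filter sketch: IMU-driven prediction corrected
# by noisy UWB position fixes (noise variances are illustrative).
def kf_step(x, p, u, z, q=0.01, r=0.25):
    """x, p: position estimate and its variance;
    u: displacement predicted from integrated IMU data;
    z: UWB position measurement."""
    x_pred, p_pred = x + u, p + q        # predict: move and grow uncertainty
    k = p_pred / (p_pred + r)            # Kalman gain: trust in the UWB fix
    return x_pred + k * (z - x_pred), (1 - k) * p_pred

x, p = 0.0, 1.0
for u, z in [(0.5, 0.6), (0.5, 1.1), (0.5, 1.4)]:
    x, p = kf_step(x, p, u, z)
```

Each step shrinks the state variance while pulling the IMU's drifting dead-reckoning estimate toward the UWB fixes; a full EKF generalizes the same predict-update cycle to multidimensional, nonlinear motion and measurement models.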

These case studies collectively demonstrate how indoor positioning technologies, whether based on UWB, BLE, or hybrid sensor fusion, are evolving to meet the complex demands of real-world applications. From handling non-line-of-sight conditions to enabling real-time collaboration among mobile agents, these systems reflect both the promise and the intricacies of modern IPS research. To further contextualize the capabilities and trade-offs of various radio frequency-based IPS technologies, Table XI below presents a comparative analysis of commonly used RF methods. This table evaluates each technology across essential performance metrics such as accuracy, scalability, cost, latency, robustness, and environmental suitability, offering a concise yet informative summary for readers and practitioners exploring optimal IPS design strategies.

5. Research gaps and future directions

IPS are critical to enabling autonomous mobile robots to navigate and perform tasks effectively in complex indoor environments. While significant advancements have been made, many challenges still hinder the full realization of robust and scalable IPS solutions. This section identifies key research gaps and suggests actionable solutions for addressing them, paving the way for future innovations.

5.1. Signal interference and multipath effects – deep learning-based mitigation strategies

IPS that rely on RF technologies such as Wi-Fi, Bluetooth, and UWB face two fundamental radio problems: signal interference and multipath effects. These challenges arise from the inherent nature of indoor environments, which are typically filled with metallic objects, thick walls, moving people, and other electromagnetic barriers. When RF signals meet obstructions, they reflect, diffract, or scatter (often in combination), producing multipath propagation: signals reach the receiver through multiple paths with varying delays and attenuations, distorting the original signal. Additionally, electromagnetic interference from co-located devices such as smartphones, microwave ovens, routers, and even other localization systems can further degrade signal quality and reliability [Reference Kim Geok, Zar Aung, Sandar Aung, Thu Soe, Abdaziz, Pao Liew, Hossain, Tso and Yong28, Reference Yadav and Sharma157]. Because these effects vary with floor plans, materials, and ambient conditions, it is nearly impossible to develop a universal interference mitigation technique that is robust across all use cases. IPS therefore often require environment-specific calibration, limiting generalizability and plug-and-play deployment.

The degree to which these signal distortions jeopardize localization accuracy is significant. Multipath propagation can cause incorrect distance estimation, for example when delayed reflections are wrongly registered as part of the direct travel distance. Similarly, interference produces fluctuating RSSI (Received Signal Strength Indicator) readings, undermining the fingerprinting techniques that rely on them. These challenges are particularly severe in densely populated or mission-critical settings, such as hospitals, warehouses, and manufacturing floors, where robots must navigate accurately and make decisions promptly; small errors in location estimation can translate into failures, inefficiencies, or safety risks. Conventional signal processing methods have been widely employed. Techniques like Kalman and particle filters smooth noisy signal trajectories by predicting and correcting the robot's position over time. Channel State Information (CSI) filtering aims to recover a stable signal component from noisy multipath conditions, while frequency-hopping spread spectrum techniques exploit several frequencies to avoid persistent interference. Despite their usefulness, these model-based approaches assume relatively static conditions; they generalize poorly to highly dynamic or nonlinear indoor spaces, which limits their use over extended periods [Reference Khodarahmi and Maihami158].
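As a concrete illustration of how interference corrupts RSSI-based ranging, the sketch below applies the standard log-distance path-loss model and shows how a simple median filter suppresses an interference spike that would otherwise inflate the distance estimate. The reference power, path-loss exponent, and RSSI values are illustrative assumptions, not calibrated figures.

```python
import statistics

# Log-distance path-loss model: rssi = TX_REF - 10 * N_EXP * log10(d / 1 m).
# TX_REF (dBm at 1 m) and the path-loss exponent N_EXP are illustrative.
TX_REF, N_EXP = -40.0, 2.0

def rssi_to_distance(rssi_dbm):
    """Invert the path-loss model to estimate distance in meters."""
    return 10 ** ((TX_REF - rssi_dbm) / (10.0 * N_EXP))

def median_filtered_distance(rssi_window):
    """Median-filter a window of RSSI readings before ranging.

    The median discards interference spikes that would otherwise be
    converted into large distance errors.
    """
    return rssi_to_distance(statistics.median(rssi_window))

# A tag about 4 m away reads near -52 dBm, but interference injects
# one -75 dBm spike into the measurement window.
window = [-52.1, -51.8, -75.0, -52.3, -51.9]
naive = rssi_to_distance(sum(window) / len(window))   # mean is corrupted
robust = median_filtered_distance(window)             # median survives
print(round(naive, 2), round(robust, 2))
```

A single spike drags the mean-based estimate well past the true range, while the median-based estimate stays near 4 m, which is why robust statistics are a common preprocessing step before fingerprinting or trilateration.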

To overcome these shortcomings, deep learning (DL) shows clear potential for real-time signal correction and multipath mitigation in indoor positioning systems. DL models are data-driven: they can learn complex, nonlinear relationships between noisy input signals (such as raw CSI, ToA, and phase differences) and true position outputs. They handle high-dimensional input spaces and can be trained to find patterns in signal distortion that were previously impossible to model explicitly. More importantly, DL systems are flexible and can continually adapt to new environments or conditions, a property well suited to dynamic indoor spaces where conventional models ultimately fail. Performing denoising, feature extraction, classification, and regression jointly makes robust end-to-end positioning pipelines possible. Table XII summarizes the relative use of different DL models for signal correction and multipath mitigation [Reference Tai, Liu, Wang, Shan and He159, 160].

Table XII. Deep learning models for signal correction and multipath mitigation [Reference Zhang, Lee and Choi161, Reference Chen and Chang162].

Looking ahead, DL opens many new opportunities for indoor positioning systems. Hybrid deep learning frameworks could use CNNs to learn the spatial structure of signals and LSTMs to capture how those signals change over time; this joint modeling may yield more accurate and reliable positioning. Another challenge is to shrink models and accelerate inference so that they can run directly on a robot or other small device without access to a cloud server, putting decision-making on the device itself, although a sizeable data store is still needed for training. Furthermore, collecting enough training data is difficult in many environments, so research will explore transferring models trained in one building to a new space, much as people quickly adjust to the layout of unfamiliar surroundings. Lastly, shared datasets and benchmarking tools would let researchers compare and assess existing systems fairly and accelerate progress. With these developments, deep learning will make indoor navigation technology more capable, faster, and more reliable than ever [Reference Lutakamale, Myburgh and de Freitas163].
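As a drastically simplified, pure-Python stand-in for such learned correction models (real systems would use CNN/LSTM architectures on CSI data), the sketch below fits a linear corrector by gradient descent to synthetic, multipath-biased range measurements. All data and hyperparameters are invented for illustration; the point is only the data-driven principle of learning the mapping from distorted measurements back to true values.

```python
# A drastically simplified stand-in for the learned correction models
# discussed above: a linear model trained by gradient descent to map
# multipath-biased range measurements back to true ranges. Real systems
# would use CNN/LSTM architectures; everything here is synthetic.

def train_range_corrector(measured, true, lr=0.01, epochs=5000):
    """Fit corrected = w * measured + b by minimizing mean squared error."""
    w, b = 1.0, 0.0
    n = len(measured)
    for _ in range(epochs):
        grad_w = grad_b = 0.0
        for x, y in zip(measured, true):
            err = (w * x + b) - y
            grad_w += 2.0 * err * x / n
            grad_b += 2.0 * err / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Synthetic training data: multipath adds a systematic 0.5 m delay plus
# a 5% scale error to every measured range.
true_ranges = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
measured = [1.05 * r + 0.5 for r in true_ranges]

w, b = train_range_corrector(measured, true_ranges)
corrected = w * (1.05 * 3.5 + 0.5) + b   # correct an unseen 3.5 m range
print(round(corrected, 2))
```

Even this toy model recovers the systematic multipath bias from examples alone, which is the same principle that lets deeper networks correct distortions too complex to model by hand.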

5.2. Environmental variability and dynamic obstacles

The ever-changing nature of indoor environments – moving people, furniture, and equipment – challenges the ability of IPS to maintain consistent accuracy. Many systems cannot respond to rapid environmental changes, degrading their localization and navigation performance. Real-time scene understanding, combining semantic mapping with dynamic object identification using convolutional neural networks and reinforcement learning, may ameliorate these issues. Multi-sensor fusion techniques integrating vision systems, LiDAR, and inertial measurement units can provide more reliable localization, and predictive path planning algorithms can communicate with the navigation system to adjust navigation strategies dynamically while avoiding obstacles in real time [Reference Basiri, Lohan, Moore, Winstanley, Peltola, Hill, Amirian and Figueiredo e Silva5].
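As a simplified illustration of replanning around a dynamic obstacle, the sketch below runs a minimal grid-based A* planner twice: once on a free corridor and once after an obstacle appears on the original route. The grid, unit step costs, and Manhattan heuristic are illustrative assumptions, not a production planner.

```python
import heapq

# Minimal grid A* planner, illustrating how a navigation stack can
# replan when a dynamic obstacle appears. Grid size, unit costs, and
# the Manhattan heuristic are illustrative assumptions.

def astar(blocked, start, goal, size=5):
    """Return a path of grid cells from start to goal avoiding `blocked`."""
    frontier = [(0, start, [start])]
    seen = set()
    while frontier:
        _, cell, path = heapq.heappop(frontier)
        if cell == goal:
            return path
        if cell in seen:
            continue
        seen.add(cell)
        x, y = cell
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < size and 0 <= ny < size and (nx, ny) not in blocked:
                h = abs(goal[0] - nx) + abs(goal[1] - ny)  # admissible heuristic
                heapq.heappush(frontier,
                               (len(path) + h, (nx, ny), path + [(nx, ny)]))
    return None

start, goal = (0, 0), (4, 0)
free_path = astar(set(), start, goal)
# A person steps into the corridor, blocking cells on the straight route.
obstacle = {(2, 0), (2, 1)}
replanned = astar(obstacle, start, goal)
print(len(free_path), len(replanned))
```

When the straight corridor is clear the path is five cells long; once the obstacle appears, the planner detours around it, which is the core loop a predictive system would run each time its scene understanding reports a change.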

5.3. Scalability and deployment challenges

Scalability, in the context of IPS, can be understood as how well a system continues to perform when deployed in large spaces, crowded environments, or different building contexts without losing accuracy, speed, or dependability. Though research has significantly advanced IPS technology, scaling deployments in practice remains a frustrating barrier. One primary reason is that most IPS technologies do not behave the same way everywhere. For instance, a system tuned for a hospital with long corridors and wide-open wards will not work the same in a shopping mall with glass storefronts, multiple floors, and thick walls. IPS solutions are frequently designed around specific layouts or infrastructure (sensors, anchors, and beacons placed in just the right way), but in the real world every building is different, so each installation needs manual tuning, which is not feasible at scale [Reference Santoro, Nardello, Brunelli and Fontanelli164].

Cost and complexity are also issues. High-accuracy systems like LiDAR or radar can yield good results, but they tend to be costly, power-hungry, and bulky. Such systems may be fine for research labs or high-budget projects, but they are not practical for typical environments like schools, retail stores, or homes. Even more "affordable" solutions like Bluetooth or infrared sensors can be unreliable in crowded, noisy environments or when other devices occupy the same frequency band. Finally, there is the matter of computational load and real-time performance. Determining precise location and movement requires substantial processing. Offloading that data to the cloud demands a stable internet connection and introduces latency; an edge computing approach, in turn, requires suitable on-device hardware, which raises cost and drains battery life, a particular problem for small robots or mobile sensors. Inexpensive sensors, such as RGB-D cameras or passive infrared sensors, are not guaranteed to solve the problem either: though cheaper, they may fail to deliver reliable accuracy under changing lighting, with people walking through the scene, or amid heavy furniture and structural occlusions [Reference Asaad and Maghdid165, Reference Hayward, van Lopik, Hinde and West166].

Interoperability is another real and important bottleneck. IPS offerings from various vendors often do not play nicely together [Reference Brunello, Montanari and Saccomanno167], making it challenging to construct a cohesive system across a whole campus or smart building; without common standards, it is like mixing puzzle pieces from different boxes. So, with all this development and research, why haven't we solved scalability? IPS is not just about a clever algorithm: it must make sense of problems that cross physical spaces and boundaries, hardware, human behavior, financial constraints, and privacy laws, all of which vary significantly by location. Nonetheless, progress has been made. One avenue is cloud-edge hybrid models, where light processing is done on-device and heavier computation is offloaded to the cloud; another is modular system design, where functions such as mapping, tracking, and storage can be tailored to a specific environment; others include self-calibrating systems, tools that automate installation, and redundant sensors, all of which reduce deployment effort. In this light, scalability in IPS is not purely a technological problem but a contextually bound, real-world one. Solving it will take more than better hardware or smarter software; it requires systems that are flexible, cost-effective, easy to deploy and change, and resilient across countless unique and evolving indoor spaces [Reference Basiri, Lohan, Moore, Winstanley, Peltola, Hill, Amirian and Figueiredo e Silva5].

5.4. Accuracy in low-light and texture-less environments

Vision-based IPS struggle to support SLAM algorithms in environments where lighting is constrained and distinctive visual features are lacking. Such hindrances substantially restrict the applicability of IPS in areas such as warehouses or tunnels. Enriched sensing, such as infrared cameras integrated with thermal imaging, can keep vision-based systems operable in poor light. Generative adversarial networks could enhance image quality and extract features from texture-less areas, and combining visible light communication with infrared technology can further increase positioning accuracy in environments where visual identification is demanding [Reference Merveille, Jia, Xu and Fred168, Reference Ullah, Adhikari, Khan, Ahmad, Esposito and Choi169].

5.5. Long-term autonomy and adaptability

Most IPS solutions lack the ability to adapt to long-term changes in the environment, such as structural modifications or sensor degradation. This results in reduced reliability and increased maintenance requirements. Self-learning algorithms that continuously adapt to evolving environmental conditions can address these challenges. Autonomous map updating systems can integrate new data into existing maps without manual intervention. Incorporating redundancy mechanisms in sensor systems can ensure robustness against individual sensor failures [Reference Granig, Faller, Hammerschmidt and Zangl170].
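One way to realize such autonomous map updating is an occupancy representation with exponential forgetting, sketched below: each cell's log-odds value is nudged by new observations and decayed toward the unknown prior, so stale structure (such as moved furniture) fades without manual intervention. The update weights and decay rate are illustrative assumptions, not tuned values.

```python
import math

# Sketch of autonomous map updating: each map cell keeps a log-odds
# occupancy value, nudged by new observations and decayed toward the
# "unknown" prior so that stale structure is gradually forgotten.
# L_HIT, L_MISS, and DECAY are illustrative assumptions.

L_HIT, L_MISS, DECAY = 0.9, -0.7, 0.98

def update_cell(log_odds, observed_occupied):
    """One measurement update followed by decay toward the 0.5 prior."""
    log_odds += L_HIT if observed_occupied else L_MISS
    return log_odds * DECAY

def probability(log_odds):
    """Convert log-odds back to an occupancy probability."""
    return 1.0 / (1.0 + math.exp(-log_odds))

cell = 0.0                    # unknown: p = 0.5
for _ in range(10):           # a wall is repeatedly observed
    cell = update_cell(cell, True)
p_wall = probability(cell)
for _ in range(30):           # the wall is removed; free space is observed
    cell = update_cell(cell, False)
p_after = probability(cell)
print(round(p_wall, 2), round(p_after, 2))
```

The decay term is what distinguishes this from a plain occupancy grid: even without explicit free-space observations, confidence in old structure slowly relaxes toward "unknown," so the map stays consistent with a changing environment.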

5.6. Ethical and privacy concerns

IPS are quickly becoming part of every setting, from hospitals to factories, offices, and homes – all places where privacy must be protected. As these systems advance with AI and machine learning capabilities, their ethical and privacy implications deserve the same level of care as their technical abilities. This section explores practical privacy risks, ethical obligations, and the challenges of advanced methodologies such as federated learning (FL) when working with IPS-driven robotics.

5.6.1. Privacy risks in sensitive environments

IPS deployments in healthcare, elder care, and workplaces usually rely on real-time location information about people or mobile agents. Often the intent is safety or planning, such as dispatching a robot to refill supplies or tracking what a nurse is doing. Nevertheless, data that monitors people continuously becomes a surveillance concern depending on the context in which it is collected. To illustrate: in elder care homes, monitoring a patient's movements to detect a fall is beneficial, but monitoring done without a straightforward process, transparency, choice, or communication raises ethical issues. Furthermore, location information that is misinterpreted, misused, or leaked can have serious real-world ramifications, including stalking or harassment based on location, revelation of employee habits in the workplace, or a competitor in another organization observing behavior during a time-sensitive procedure. These situations underscore the need for end-to-end encryption of location data as a best practice, not only while data is in transit but also after it has been collected, sustaining the ethical commitment to, and confidence in, data security for those represented [Reference Zhongna Zhou, Wenqing Dai, Eggert, Giger, Keller, Rantz and Zhihai He171, Reference Woolrych172].

5.6.2. Ethical responsibility and informed consent

Many IPS deployments amount to what could be described as "invisible surveillance." Individuals under surveillance do not necessarily know: (1) that their location data is being recorded, (2) by whom, (3) how long it will be stored, and (4) whether it can be deleted. Proper informed consent cannot be buried in a lengthy privacy policy document; in-the-moment notifications in very simple language, with clear options to opt in or out, are necessary. Transparency and trust can also be fostered by giving users a dashboard or app to control their consent. An example from healthcare: if a patient walks into a hospital that uses an IPS for navigation and safety, a kiosk at the entrance or a wristband paired with an app could inform the patient about what data will be collected and let them control it. Such micro-consent formats give people genuine choice and align more closely with global data protection regulations [Reference Conway173, Reference Setty, Hunt and Ringrose174].

5.6.3. AI bias and fairness in IPS

AI/ML-powered IPS can inadvertently introduce bias, both in coverage and in accuracy. For example, a model trained on Western hospital environments may suffer when deployed in hospitals in India or Japan, whose rooms and hallways follow different physical designs and layouts. This can produce bad outcomes such as wrongly routed robots, poor localization performance, and higher error rates for certain populations or geographical locations. Bias can be further compounded when historical data reflecting pre-existing systemic biases is used for training. Ethical design of AI in IPS must therefore draw on diverse training data across its use cases and apply explicit fairness criteria. Furthermore, algorithms should be routinely audited for bias and periodically retrained on newly balanced datasets that are inclusive of geography, demographics, and the physical layout of the built environment [Reference Che, Ahmed, Lazaridis, Sureephong and Alade20, Reference Zakerabasali and Ayyoubzadeh175, Reference Biondi, Cagnoni, Capobianco, Franzoni, Lisi, Milani and Vallverdú176].

5.6.4. Advanced privacy-preserving techniques

Modern privacy engineering offers several promising options that can help organizations protect IPS data. The most relevant are: (1) Differential Privacy, which introduces mathematical noise into datasets so that aggregate patterns remain identifiable but individuals cannot be re-identified; (2) Homomorphic Encryption, which allows computations to occur while the data is encrypted, meaning raw data is never exposed; and (3) Blockchain-Based Audit Logs, which record every access or update to a dataset in a tamper-proof manner that holds the organization accountable. These technologies support compliance with privacy standards such as the GDPR (EU) and HIPAA (U.S.) while keeping the data available for useful IPS-based services.
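As a minimal illustration of the first of these techniques, the sketch below releases a differentially private occupancy count by adding Laplace noise with scale equal to the query sensitivity divided by the privacy budget epsilon. The scenario, epsilon value, and count are illustrative assumptions.

```python
import math
import random

# Sketch of differential privacy for IPS analytics: release the number of
# people in a zone with Laplace noise calibrated to the query sensitivity
# and a privacy budget epsilon. Scenario and parameters are illustrative.

def laplace_noise(scale, rng):
    """Draw Laplace(0, scale) noise via inverse transform sampling."""
    u = rng.random() - 0.5
    sign = -1.0 if u < 0 else 1.0
    return -scale * sign * math.log(1.0 - 2.0 * abs(u))

def private_count(true_count, epsilon, rng):
    """A counting query has sensitivity 1, so the noise scale is 1/epsilon."""
    return true_count + laplace_noise(1.0 / epsilon, rng)

rng = random.Random(42)
true_count = 37                     # people currently in a hospital ward
released = private_count(true_count, epsilon=0.5, rng=rng)
print(round(released, 1))           # perturbed around 37
```

A smaller epsilon means more noise and stronger privacy; averaged over many queries the released values remain statistically useful, while any single release no longer pins down an individual's presence.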

5.6.5. FL in IPS: opportunities and challenges

FL is often characterized as a privacy-preserving ML method because the data stays on edge devices (such as a robot or mobile agent) and only model updates are sent to a server. Raw location data never leaves the space in which it was generated, affording users more privacy. There are, however, several technical reasons why FL does not translate directly to IPS. (1) Non-IID Data: IPS devices deployed in different contexts (such as hospitals and malls) collect different types of data. This non-uniformity (a non-IID distribution across clients) can make federated models slow to converge or less effective. (2) Communication Overhead: FL entails regular exchange of model updates between edge devices and the server, demanding reliable and consistent connectivity. For continuously moving edge devices, such as robots in a warehouse, sustaining this communication may be impractical. (3) Device Limitations: IPS nodes are often embedded systems or robots without significant processing power or battery capacity, and on-device training further strains these limited resources. (4) Privacy Is Not Guaranteed: even though raw data is never shared, attacks on model updates, such as model inversion and gradient leakage, can still expose sensitive information.
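The basic FL loop underlying these trade-offs can be sketched in a few lines: each client refines a shared model on its private data, and the server averages the resulting weights (federated averaging). The toy linear model and non-IID client data below are illustrative assumptions, not a deployed system.

```python
# Minimal sketch of federated averaging (FedAvg) for a shared linear
# model y = w * x. Each client (robot) computes a local update from its
# own data; only weights, never raw data, reach the server.
# Data, learning rate, and round counts are illustrative assumptions.

def local_update(w, data, lr=0.05, steps=20):
    """A client refines the global weight on its private (x, y) samples."""
    for _ in range(steps):
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad
    return w

def fedavg_round(w_global, client_datasets):
    """Server averages the locally updated weights (equal-sized clients)."""
    updates = [local_update(w_global, d) for d in client_datasets]
    return sum(updates) / len(updates)

# Two robots in different buildings see non-IID data: the true weight
# is 2.0 in one environment and 2.4 in the other.
client_a = [(x, 2.0 * x) for x in (1.0, 2.0, 3.0)]
client_b = [(x, 2.4 * x) for x in (1.0, 2.0, 3.0)]

w = 0.0
for _ in range(5):   # five communication rounds
    w = fedavg_round(w, [client_a, client_b])
print(round(w, 2))
```

Note how the non-IID split (true weights 2.0 versus 2.4) forces the global model to settle on a compromise between the two environments, a small-scale version of the convergence issue described above.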

These challenges are not just theoretical; they have been observed in real-world pilot implementations.

Together, these observations confirm that FL offers promise for privacy in IPS but is not a plug-and-play solution. Its effectiveness depends on how well it is tailored to the deployment environment's computational limitations, connectivity constraints, and data characteristics. As such, hybrid approaches combining FL with secure aggregation, differential privacy, and blockchain-based audit trails may offer more viable, robust privacy protection for IPS in real-world deployments [Reference Bharati, Mondal, Podder and Prasath180, Reference Zhu, Zhu, Ren and Qin181].

5.6.6. Ethical outlook

Privacy in IPS cannot be a box-ticking exercise; it must be a fundamental underpinning of ethical acceptance and trust. Whether the system is an AI delivery robot in a hospital or a UWB-based retail tracking system, transparency, control, and data minimization should be guiding principles. FL is still in its infancy, and while it holds promise, it faces challenges in the IPS context that will require further research. The IPS community can work towards innovative, fair, secure, and respectful systems by integrating technological safeguards with human-centered consent mechanisms and continuous ethical audits. Ultimately, ethical design in IPS is not just about compliance with laws and regulations; it is about respecting the individual in every choice the system makes [Reference Truong, Sun, Wang, Guitton and Guo182, Reference Ali, Ullah, Al Shloul, Khan, Khan, Ghadi, Abdusalomov, Nasimov, Ouahada and Hamam183].

5.7. Integration with emerging technologies

Integration of IPS with emerging technologies such as the Internet of Things (IoT), 5G networks, and quantum sensing remains underexplored, restricting IPS from achieving its full potential in modern applications. IoT-compatible IPS solutions can leverage IoT devices for enhanced localization and context-aware navigation. The high bandwidth and low latency of 5G networks can be exploited to improve real-time positioning accuracy, and quantum sensors offer the potential for unprecedented accuracy and reliability in indoor positioning [Reference Ullah, Adhikari, Su, Palmieri, Wu and Choi184, Reference Ullah, Noor, Nazir, Ali, Ghadi and Aslam185].

5.8. Standardization and interoperability

The lack of standardized protocols and interoperability across different IPS technologies limits widespread adoption and integration. This fragmentation results in inefficiencies and compatibility issues in multi-system environments. Unified protocols developed through industry-wide collaboration can ensure compatibility between different IPS technologies. Open-source tools and frameworks can accelerate innovation and adoption. Interoperability frameworks that integrate multiple IPS technologies can enable seamless operations across diverse platforms [Reference Stavroulakis and Stamp186].

Addressing the outlined research gaps requires a multidisciplinary approach combining sensor technology advancements, artificial intelligence, and system integration. By overcoming these challenges, future IPS can achieve higher accuracy, robustness, and scalability, unlocking their full potential across diverse applications such as industrial automation, healthcare, and public safety. The proposed solutions address current limitations and pave the way for innovative applications, ensuring that IPS remains a cornerstone technology in the era of autonomous mobile robotics (as shown in Figure 8).

Figure 8. Challenges and future directions.

6. Conclusion

Based on the detailed exploration of IPS in mobile robotics, this manuscript highlights the field’s transformative advancements, challenges, and future potential. IPS technologies such as LiDAR, Visual SLAM, ultra-wideband, and hybrid systems enable robots to navigate complex indoor environments precisely. Despite advancements, issues like cost, signal interference, and dynamic environmental conditions persist. The manuscript emphasizes the role of interdisciplinary innovation, integrating artificial intelligence and the Internet of Things, to overcome these barriers. Practical applications span healthcare, industrial automation, and public safety, showcasing IPS as a cornerstone for advancing robotic autonomy. By addressing current gaps and prioritizing privacy and ethical considerations, this study provides a roadmap for researchers and industry stakeholders to foster innovation and redefine the capabilities of indoor localization systems.

Author contributions

Conceptualization, Rushikesh Deshmukh; Data curation, Rushikesh Deshmukh and Meghana Hasamnis; Formal analysis, Meghana Hasamnis; Investigation, Rushikesh Deshmukh; Methodology, Rushikesh Deshmukh; Project administration, Madhusudan Kulkarni; Resources, Meghana Hasamnis; Supervision, Manish Bhaiyya; Visualization, Madhusudan Kulkarni; Writing – original draft, Rushikesh Deshmukh; Writing – review & editing, Meghana Hasamnis, Madhusudan Kulkarni and Manish Bhaiyya.

Funding

This research received no external funding.

Institutional review board statement

Not applicable.

Informed consent statement

Not applicable.

Competing interests

The authors declare no conflict of interest.

References

Semborski, J. and Idzkowski, A., “A review on positioning techniques of mobile robots,” Robot. Syst. Appl. 4(1), 3043 (2024). doi: 10.21595/rsa.2024.23893.CrossRefGoogle Scholar
Misaros, M., Stan, O. P., Donca, I. C. and Miclea, L. C., “Autonomous robots for services—State of the art, challenges, and research areas,” Sensors 23(10), 4962 (2023). doi: 10.3390/s23104962.CrossRefGoogle ScholarPubMed
Misra, A., Agrawal, A. and Misra, V., “Robotics in Industry 4.0,” In: Hussain, C. M. and Di Sia, P., Handbook of Smart Materials, Technologies, and Devices: Applications of Industry 4.0 (Springer International Publishing, Cham, 2022) pp. 20212055. doi: 10.1007/978-3-030-84205-5_68.CrossRefGoogle Scholar
Tong, Y., Liu, H. and Zhang, Z., “Advancements in humanoid robots: A comprehensive review and future prospects,” IEEE/CAA J. Autom. Sin 11(2), 301328 (2024). doi: 10.1109/JAS.2023.124140.CrossRefGoogle Scholar
Basiri, A., Lohan, E. S., Moore, T., Winstanley, A., Peltola, P., Hill, C., Amirian, P. and Figueiredo e Silva, P., “Indoor location based services challenges, requirements and usability of current solutions,” Comput. Sci. Rev. 24, 112 (2017). doi: 10.1016/j.cosrev.2017.03.002.CrossRefGoogle Scholar
Hailu, T. G., Guo, X., Si, H., Li, L. and Zhang, Y., “Theories and methods for indoor positioning systems: A comparative analysis, challenges, and prospective measures,” Sensors 24(21), 6876 (2024). doi: 10.3390/s24216876.CrossRefGoogle ScholarPubMed
Tang, M., Zhou, B., Zhong, X., Liu, X. and Li, Q., “Enhanced indoor positioning through human-robot collaboration,” Urban Informat. 3(1), 7 (2024). doi: 10.1007/s44212-024-00037-9.CrossRefGoogle Scholar
Fan, Q., Sun, B., Sun, Y., Wu, Y. and Zhuang, X., “Data fusion for indoor mobile robot positioning based on tightly coupled INS/UWB,” J. Navig. 70(5), 10791097 (2017). doi: 10.1017/S0373463317000194.CrossRefGoogle Scholar
Chghaf, M., Rodriguez, S., Ouardi, A. E. and Camera, “LiDAR and multi-modal SLAM systems for autonomous ground vehicles: A survey,” J. Intell. Robot. Syst. 105(1), 2 (2022). doi: 10.1007/s10846-022-01582-8.CrossRefGoogle Scholar
Zhu, J., Li, H., Zhang, T. and Camera, “LiDAR, and IMU based multi-sensor fusion SLAM: A survey,” Tsinghua Sci. Technol. 29(2), 415429 (2024). doi: 10.26599/TST.2023.9010010.CrossRefGoogle Scholar
Al-Okby, M. F. R., Junginger, S., Roddelkopf, T. and Thurow, K., “UWB-based real-time indoor positioning systems: A comprehensive review,” Appl. Sci. 14(23), 143 (2024). doi: 10.3390/app142311005.CrossRefGoogle Scholar
Ullah, I., Adhikari, D., Khan, H., Anwar, M. S., Ahmad, S. and Bai, X., “Mobile robot localization: Current challenges and future prospective,” Comput. Sci. Rev. 53(January), 100651 (2024). doi: 10.1016/j.cosrev.2024.100651.CrossRefGoogle Scholar
Niloy, M. A. K., Shama, A., Chakrabortty, R. K., Ryan, M. J., Badal, F. R., Tasneem, Z., Ahamed, M. H., Moyeen, S. I., Das, S. K., Ali, M. F., Islam, M. R. and Saha, D. K., “Critical design and control issues of indoor autonomous mobile robots: A review,” IEEE Access 9, 3533835370 (2021). doi: 10.1109/ACCESS.2021.3062557.CrossRefGoogle Scholar
Ullah, I., Su, X., Zhu, J., Zhang, X., Choi, D. and Hou, Z., “Evaluation of localization by extended kalman filter, unscented kalman filter, and particle filter-based techniques,” Wirel. Commun. Mob. Comput 2020(1), 8898672–15 (2020). doi: 10.1155/2020/8898672.Google Scholar
Peng, T., Liu, Q., Wang, G., Xiang, Y. and Chen, S., “Multidimensional privacy preservation in location-based services,” Futur. Gener. Comput. Syst. 93, 312326 (2019). doi: 10.1016/j.future.2018.10.025.CrossRefGoogle Scholar
Patnaik, M., Mishra, S., Indoor Positioning System Assisted Big Data Analytics in Smart Healthcare,” In: Connected e-Health: Integrated IoT and Cloud Computing, Mishra, S., González-Briones, A., Bhoi, A.K., Mallick, P.K. and Corchado, M. (Springer International Publishing, Cham, 2022) pp. 393415. doi: 10.1007/978-3-030-97929-4_18.CrossRefGoogle Scholar
Lee, C. R., Chu, E. T. H., Sie, M. J., Lin, L. T., Hong, M. Z. and Huang, C. C., “Application of indoor positioning systems in nursing homes: Enhancing resident safety and staff efficiency,” Sensors 24(18), 1–19 (2024). doi: 10.3390/s24186099.
Alotaibi, A., Alatawi, H., Binnouh, A., Duwayriat, L., Alhmiedat, T. and Alia, O. M., “Deep learning-based vision systems for robot semantic navigation: An experimental study,” Technologies 12(9), 157 (2024). doi: 10.3390/technologies12090157.
Ni, J., Chen, Y., Tang, G., Shi, J., Cao, W. and Shi, P., “Deep learning-based scene understanding for autonomous robots: A survey,” Intell. Robot. 3(3), 374–401 (2023). doi: 10.20517/ir.2023.22.
Che, F., Ahmed, Q. Z., Lazaridis, P. I., Sureephong, P. and Alade, T., “Indoor positioning system (IPS) using ultra-wide bandwidth (UWB)—For industrial internet of things (IIoT),” Sensors 23(12), 1–28 (2023). doi: 10.3390/s23125710.
Yin, H., Xu, X., Lu, S., Chen, X., Xiong, R., Shen, S., Stachniss, C. and Wang, Y., “A survey on global LiDAR localization: Challenges, advances and open problems,” Int. J. Comput. Vis. 132(8), 3139–3171 (2024). doi: 10.1007/s11263-024-02019-5.
Khan, D., Cheng, Z., Uchiyama, H., Ali, S., Asshad, M. and Kiyokawa, K., “Recent advances in vision-based indoor navigation: A systematic literature review,” Comput. Graph. 104, 24–45 (2022). doi: 10.1016/j.cag.2022.03.005.
Kandalan, R. N. and Namuduri, K., “Techniques for constructing indoor navigation systems for the visually impaired: A review,” IEEE Trans. Hum.-Mach. Syst. 50(6), 492–506 (2020). doi: 10.1109/THMS.2020.3016051.
Li, N., Guan, L., Gao, Y., Du, S., Wu, M., Guang, X. and Cong, X., “Indoor and outdoor low-cost seamless integrated navigation system based on the integration of INS/GNSS/LIDAR system,” Remote Sens. 12(19), 1–21 (2020). doi: 10.3390/rs12193271.
Di Stefano, F., Chiappini, S., Gorreja, A., Balestra, M. and Pierdicca, R., “Mobile 3D scan LiDAR: A literature review,” Geomat. Nat. Hazards Risk 12(1), 2387–2429 (2021). doi: 10.1080/19475705.2021.1964617.
Gerwen, J. V.-V., Geebelen, K., Wan, J., Joseph, W., Hoebeke, J. and De Poorter, E., “Indoor drone positioning: Accuracy and cost trade-off for sensor fusion,” IEEE Trans. Veh. Technol. 71(1), 961–974 (2022). doi: 10.1109/TVT.2021.3129917.
Long, Z., Xiang, Y., Lei, X., Li, Y., Hu, Z. and Dai, X., “Integrated indoor positioning system of greenhouse robot based on UWB/IMU/ODOM/LIDAR,” Sensors 22(13), 4819 (2022). doi: 10.3390/s22134819.
Kim Geok, T., Zar Aung, K., Sandar Aung, M., Thu Soe, M., Abdaziz, A., Pao Liew, C., Hossain, F., Tso, C. P. and Yong, W. H., “Review of indoor positioning: Radio wave technology,” Appl. Sci. 11(1), 1–44 (2021). doi: 10.3390/app11010279.
He, S. and Chan, S. H. G., “Wi-Fi fingerprint-based indoor positioning: Recent advances and comparisons,” IEEE Commun. Surv. Tutor. 18(1), 466–490 (2016). doi: 10.1109/COMST.2015.2464084.
Villacrés, J. L. C., Zhao, Z., Braun, T. and Li, Z., “A particle filter-based reinforcement learning approach for reliable wireless indoor positioning,” IEEE J. Sel. Areas Commun. 37(11), 2457–2473 (2019). doi: 10.1109/JSAC.2019.2933886.
Sutera, E., Mazzia, V., Salvetti, F., Fantin, G. and Chiaberge, M., “Indoor point-to-point navigation with deep reinforcement learning and ultra-wideband,” In: Proc. 13th Int. Conf. Agents and Artificial Intelligence (ICAART) 1, 38–47 (2021). doi: 10.5220/0010202600380047.
Na, S., Rouček, T., Ulrich, J., Pikman, J., Krajník, T., Lennox, B. and Arvin, F., “Federated reinforcement learning for collective navigation of robotic swarms,” IEEE Trans. Cogn. Dev. Syst. 15(4), 2122–2131 (2023). doi: 10.1109/TCDS.2023.3239815.
Gonçalves, D. G., de Caldas Filho, F. L., Martins, L. M., Kfouri, G. D., Dutra, B. V., Albuquerque, R. D. and de Sousa, R. T., “IPS architecture for IoT networks overlapped in SDN,” In: 2019 Workshop on Communication Networks and Power Systems (WCNPS) (2019) pp. 1–6. doi: 10.1109/WCNPS.2019.8896297.
Kumar, A., Abhishek, K., Ghalib, M. R., Shankar, A. and Cheng, X., “Intrusion detection and prevention system for an IoT environment,” Digit. Commun. Netw. 8(4), 540–551 (2022). doi: 10.1016/j.dcan.2022.05.027.
Ali, M. U., Hur, S. and Park, Y., “Wi-Fi-based effortless indoor positioning system using IoT sensors,” Sensors (Switzerland) 19(7), 1496 (2019). doi: 10.3390/s19071496.
Farahsari, P. S., Farahzadi, A., Rezazadeh, J. and Bagheri, A., “A survey on indoor positioning systems for IoT-based applications,” IEEE Internet Things J. 9(10), 7680–7699 (2022). doi: 10.1109/JIOT.2022.3149048.
Andreu-Perez, J., Leff, D. R., Ip, H. M. D. and Yang, G.-Z., “From wearable sensors to smart implants – Toward pervasive and personalized healthcare,” IEEE Trans. Biomed. Eng. 62(12), 2750–2762 (2015). doi: 10.1109/TBME.2015.2422751.
Guk, K., Han, G., Lim, J., Jeong, K., Kang, T., Lim, E. K. and Jung, J., “Evolution of wearable devices with real-time disease monitoring for personalized healthcare,” Nanomaterials 9(6), 1–23 (2019). doi: 10.3390/nano9060813.
Rekkas, V. P., Iliadis, L. A., Sotiroudis, S. P., Boursianis, A. D., Sarigiannidis, P., Plets, D., Joseph, W., Wan, S., Christodoulou, C. G., Karagiannidis, G. K. and Goudos, S. K., “Artificial intelligence in visible light positioning for indoor IoT: A methodological review,” IEEE Open J. Commun. Soc. 4, 2838–2869 (2023). doi: 10.1109/OJCOMS.2023.3327211.
Liu, X., Guo, L. and Wei, X., “Indoor visible light applications for communication, positioning, and security,” Wirel. Commun. Mob. Comput. 2021, 1730655 (2021). doi: 10.1155/2021/1730655.
Panigrahi, P. K. and Bisoy, S. K., “Localization strategies for autonomous mobile robots: A review,” J. King Saud Univ. - Comput. Inf. Sci. 34(8), 6019–6039 (2022). doi: 10.1016/j.jksuci.2021.02.015.
Huang, J., Junginger, S., Liu, H. and Thurow, K., “Indoor positioning systems of mobile robots: A review,” Robotics 12(2), 1–28 (2023). doi: 10.3390/robotics12020047.
Solanes, J. E. and Gracia, L., “Mobile robots: Trajectory analysis, positioning and control,” Appl. Sci. 15(1), 355 (2025). doi: 10.3390/app15010355.
Pascacio, P., Casteleyn, S., Torres-Sospedra, J., Lohan, E. S. and Nurmi, J., “Collaborative indoor positioning systems: A systematic review,” Sensors 21(3), 1002 (2021). doi: 10.3390/s21031002.
Elsanhoury, M., Makela, P., Koljonen, J., Valisuo, P., Shamsuzzoha, A., Mantere, T., Elmusrati, M. and Kuusniemi, H., “Precision positioning for smart logistics using ultra-wideband technology-based indoor navigation: A review,” IEEE Access 10, 44413–44445 (2022). doi: 10.1109/ACCESS.2022.3169267.
Sandamini, C., Maduranga, M. W., Tilwari, V., Yahaya, J., Qamar, F., Nguyen, Q. N. and Ibrahim, S. R., “A review of indoor positioning systems for UAV localization with machine learning algorithms,” Electronics 12(7), 1533 (2023). doi: 10.3390/electronics12071533.
Li, S., Tang, Z., Kim, K. S. and Smith, J. S., “On the use and construction of Wi-Fi fingerprint databases for large-scale multi-building and multi-floor indoor localization: A case study of the UJIIndoorLoc database,” Sensors 24(12), 3827 (2024). doi: 10.3390/s24123827.
Samatas, G. G. and Pachidis, T. P., “Inertial measurement units (IMUs) in mobile robots over the last five years: A review,” Designs 6(1), 17 (2022). doi: 10.3390/designs6010017.
Brossard, M., Barrau, A. and Bonnabel, S., “AI-IMU dead-reckoning,” IEEE Trans. Intell. Veh. 5(4), 585–595 (2020). doi: 10.1109/TIV.2020.2980758.
Guo, F., Yang, H., Wu, X., Dong, H., Wu, Q. and Li, Z., “Model-based deep learning for low-cost IMU dead reckoning of wheeled mobile robot,” IEEE Trans. Ind. Electron. 71(7), 7531–7541 (2024). doi: 10.1109/TIE.2023.3301531.
Hurwitz, D., Cohen, N. and Klein, I., “Deep-learning-assisted inertial dead reckoning and fusion,” IEEE Trans. Instrum. Meas. 74, 1–9 (2025). doi: 10.1109/TIM.2024.3502825.
Ramdani, N., Panayides, A., Karamousadakis, M., Mellado, M., Lopez, R., Christophorou, C., Rebiai, M., Blouin, M., Vellidou, E. and Koutsouris, D., “A safe, efficient and integrated indoor robotic fleet for logistic applications in healthcare and commercial spaces: The ENDORSE concept,” In: 2019 IEEE International Conference on Mobile Data Management (MDM) (2019) pp. 425–430. doi: 10.1109/MDM.2019.000-8.
Cheng, L., Dai, Y., Peng, R. and Nong, X., “Positioning and navigation of mobile robot with asynchronous fusion of binocular vision system and inertial navigation system,” Int. J. Adv. Robot. Syst. 14(6), 1–16 (2017). doi: 10.1177/1729881417745607.
Yan, X., Guo, H., Yu, M., Xu, Y., Cheng, L. and Jiang, P., “Light detection and ranging/inertial measurement unit-integrated navigation positioning for indoor mobile robots,” Int. J. Adv. Robot. Syst. 17(2), 1–11 (2020). doi: 10.1177/1729881420919940.
Ibrahim, M. and Moselhi, O., “Inertial measurement unit based indoor localization for construction applications,” Autom. Constr. 71, 13–20 (2016). doi: 10.1016/j.autcon.2016.05.006.
Cramer, M., Cramer, J., de Schepper, D., Aerts, P., Kellens, K. and Demeester, E., “Benchmarking low-cost inertial measurement units for indoor localisation and navigation of AGVs,” Proc. CIRP 86, 204–209 (2020). doi: 10.1016/j.procir.2020.01.044.
Cole, J., Bozkurt, A. and Lobaton, E., “Localization of biobotic insects using low-cost inertial measurement units,” Sensors (Switzerland) 20(16), 1–40 (2020). doi: 10.3390/s20164486.
Chen, C., Lu, C. X., Wahlstrom, J., Markham, A. and Trigoni, N., “Deep neural network based inertial odometry using low-cost inertial measurement units,” IEEE Trans. Mob. Comput. 20(4), 1351–1364 (2021). doi: 10.1109/TMC.2019.2960780.
Lu, C., Uchiyama, H., Thomas, D., Shimada, A. and Taniguchi, R. I., “Indoor positioning system based on chest-mounted IMU,” Sensors (Switzerland) 19(2), 1–20 (2019). doi: 10.3390/s19020420.
Ceron, J. D., Kluge, F., Küderle, A., Eskofier, B. M. and López, D. M., “Simultaneous indoor pedestrian localization and house mapping based on inertial measurement unit and bluetooth low-energy beacon data,” Sensors (Switzerland) 20(17), 1–21 (2020). doi: 10.3390/s20174742.
El-Gohary, M. and McNames, J., “Human joint angle estimation with inertial sensors and validation with a robot arm,” IEEE Trans. Biomed. Eng. 62(7), 1759–1767 (2015). doi: 10.1109/TBME.2015.2403368.
Zhang, L., Zhou, T. and Lian, B., “Integrated IMU with faster R-CNN aided visual measurements from IP cameras for indoor positioning,” Sensors (Switzerland) 18(9), 3134 (2018). doi: 10.3390/s18093134.
Hislop, J., Isaksson, M., McCormick, J. and Hensman, C., “Validation of 3-space wireless inertial measurement units using an industrial robot,” Sensors 21(20), 1–13 (2021). doi: 10.3390/s21206858.
Luo, J., Fan, L. and Li, H., “Indoor positioning systems based on visible light communication: State of the art,” IEEE Commun. Surv. Tutor. 19(4), 2871–2893 (2017). doi: 10.1109/COMST.2017.2743228.
Li, X., Yan, Z., Huang, L., Chen, S. and Liu, M., “High-accuracy and real-time indoor positioning system based on visible light communication and mobile robot,” Int. J. Opt. 2020, 1–11 (2020). doi: 10.1155/2020/3124970.
Guan, W., Chen, S., Wen, S., Tan, Z., Song, H. and Hou, W., “High-accuracy robot indoor localization scheme based on robot operating system using visible light positioning,” IEEE Photonics J. 12(2), 1–16 (2020). doi: 10.1109/JPHOT.2020.2981485.
Tran, H. Q. and Ha, C., “Improved visible light-based indoor positioning system using machine learning classification and regression,” Appl. Sci. 9(6), 1048 (2019). doi: 10.3390/app9061048.
Guo, X., Hu, F., Elikplim, N. R. and Li, L., “Indoor localization using visible light via two-layer fusion network,” IEEE Access 7, 16421–16430 (2019). doi: 10.1109/ACCESS.2019.2895131.
Murai, R., Sakai, T., Kawano, H., Matsukawa, Y., Kitano, Y., Honda, Y. and Campbell, K. C., “A novel visible light communication system for enhanced control of autonomous delivery robots in a hospital,” In: 2012 IEEE/SICE International Symposium on System Integration (SII) (2012) pp. 510–516. doi: 10.1109/SII.2012.6427311.
Xie, H., Huang, L. and Wu, W., “Indoor positioning system based on visible light communication for mobile robot in nuclear power plant,” arXiv preprint arXiv:2011.07771 (2020).
Guan, W., Huang, L., Hussain, B. and Yue, C. P., “Robust robotic localization using visible light positioning and inertial fusion,” IEEE Sens. J. 22(6), 4882–4892 (2022). doi: 10.1109/JSEN.2021.3053342.
Guan, W., Huang, L., Wen, S., Yan, Z., Liang, W., Yang, C. and Liu, Z., “Robot localization and navigation using visible light positioning and SLAM fusion,” J. Light. Technol. 39(22), 7040–7051 (2021). doi: 10.1109/JLT.2021.3113358.
Li, G., Sun, S., Gao, Y., Li, A. and Zhu, K., “Research and development of indoor positioning technology based on visible light communication,” Urban Lifeline 1(1), 1–16 (2023). doi: 10.1007/s44285-023-00011-y.
Nguyen, Q. D. and Nguyen, N. H., “Mobile application for visible light communication systems: An approach for indoor positioning,” Photonics 11(4), 293 (2024). doi: 10.3390/photonics11040293.
Lanza, C., Carriero, S., Buijs, E. F. M., Mortellaro, S., Pizzi, C., Sciacqua, L. V., Biondetti, P., Angileri, S. A., Ianniello, A. A., Ierardi, A. M. and Carrafiello, G., “Robotics in interventional radiology: Review of current and future applications,” Technol. Cancer Res. Treat. 22 (2023). doi: 10.1177/15330338231152084.
Fu, G., Corradi, P., Menciassi, A. and Dario, P., “An integrated triangulation laser scanner for obstacle detection of miniature mobile robots in indoor environment,” IEEE/ASME Trans. Mechatron. 16(4), 778–783 (2011). doi: 10.1109/TMECH.2010.2084582.
Raharijaona, T., Mawonou, R., Nguyen, T. V., Colonnier, F., Boyron, M., Diperi, J. and Viollet, S., “Local positioning system using flickering infrared LEDs,” Sensors (Switzerland) 17(11), 1–16 (2017). doi: 10.3390/s17112518.
Awad, F., Naserllah, M., Omar, A., Abu-Hantash, A. and Al-Taj, A., “Collaborative indoor access point localization using autonomous mobile robot swarm,” Sensors (Switzerland) 18(2), 407 (2018). doi: 10.3390/s18020407.
Crețu-Sîrcu, A. L., Schiøler, H., Cederholm, J. P., Sîrcu, I., Schjørring, A., Larrad, I. R., Berardinelli, G. and Madsen, O., “Evaluation and comparison of ultrasonic and UWB technology for indoor localization in an industrial environment,” Sensors 22(8), 1–25 (2022). doi: 10.3390/s22082927.
Qi, J. and Liu, G. P., “A robust high-accuracy ultrasound indoor positioning system based on a wireless sensor network,” Sensors (Switzerland) 17(11), 2554 (2017). doi: 10.3390/s17112554.
Martín-Gorostiza, E., García-Garrido, M. A., Pizarro, D., Salido-Monzú, D. and Torres, P., “An indoor positioning approach based on fusion of cameras and infrared sensors,” Sensors (Switzerland) 19(11), 1–30 (2019). doi: 10.3390/s19112519.
Bernardes, E., Viollet, S. and Raharijaona, T., “A three-photo-detector optical sensor accurately localizes a mobile robot indoors by using two infrared light-emitting diodes,” IEEE Access 8, 87490–87503 (2020). doi: 10.1109/ACCESS.2020.2992996.
Gu, D. and Chen, K. S., “Design and performance evaluation of Wiimote-based two-dimensional indoor localization systems for indoor mobile robot control,” Meas. J. Int. Meas. Confed. 66, 95–108 (2015). doi: 10.1016/j.measurement.2015.01.009.
Wang, J. and Takahashi, Y., “Indoor mobile robot self-localization based on a low-cost light system with a novel emitter arrangement,” ROBOMECH J. 5(1) (2018). doi: 10.1186/s40648-018-0114-x.
Arbula, D. and Ljubic, S., “Indoor localization based on infrared angle of arrival sensor network,” Sensors (Switzerland) 20(21), 1–32 (2020). doi: 10.3390/s20216278.
Tsun, M. T. K., Lau, B. T. and Jo, H. S., “An improved indoor robot human-following navigation model using depth camera, active IR marker and proximity sensors fusion,” Robotics 7(1), 4 (2018). doi: 10.3390/robotics7010004.
Huang, Z., Zhu, J., Yang, L., Xue, B., Wu, J. and Zhao, Z., “Accurate 3-D position and orientation method for indoor mobile robot navigation based on photoelectric scanning,” IEEE Trans. Instrum. Meas. 64(9), 2518–2529 (2015). doi: 10.1109/TIM.2015.2415031.
AL-Forati, I. S. A., Rashid, A. and Al-Ibadi, A., “IR sensors array for robots localization using K-means clustering algorithm,” Int. J. Simul. Syst. Sci. Technol., 2–7 (2019). doi: 10.5013/ijssst.a.20.s1.12.
Sun, J., Zhao, J., Hu, X., Gao, H. and Yu, J., “Autonomous navigation system of indoor mobile robots using 2D LiDAR,” Mathematics 11(6), 1455 (2023). doi: 10.3390/math11061455.
Qi, X., Wang, W., Liao, Z., Zhang, X., Yang, D. and Wei, R., “Object semantic grid mapping with 2D LiDAR and RGB-D camera for domestic robot navigation,” Appl. Sci. 10(17), 5782 (2020). doi: 10.3390/app10175782.
Jiang, S., Wang, S., Yi, Z., Zhang, M. and Lv, X., “Autonomous navigation system of greenhouse mobile robot based on 3D LiDAR and 2D LiDAR SLAM,” Front. Plant Sci. 13, 1–18 (2022). doi: 10.3389/fpls.2022.815218.
Wang, Y. T., Peng, C. C., Ravankar, A. A. and Ravankar, A., “A single LiDAR-based feature fusion indoor localization algorithm,” Sensors (Switzerland) 18(4), 1294 (2018). doi: 10.3390/s18041294.
Yilmaz, A. and Temeltas, H., “Self-adaptive Monte Carlo method for indoor localization of smart AGVs using LIDAR data,” Robot. Auton. Syst. 122, 103285 (2019). doi: 10.1016/j.robot.2019.103285.
Liu, Y., Wang, C., Wu, H., Wei, Y., Ren, M. and Zhao, C., “Improved LiDAR localization method for mobile robots based on multi-sensing,” Remote Sens. 14(23), 6133 (2022). doi: 10.3390/rs14236133.
Kumar, G. A., Patil, A. K., Patil, R., Park, S. S. and Chai, Y. H., “A LiDAR and IMU integrated indoor navigation system for UAVs and its application in real-time pipeline classification,” Sensors (Switzerland) 17(6), 1268 (2017). doi: 10.3390/s17061268.
Nguyen, P. T. T., Yan, S. W., Liao, J. F. and Kuo, C. H., “Autonomous mobile robot navigation in sparse LiDAR feature environments,” Appl. Sci. 11(13), 5963 (2021). doi: 10.3390/app11135963.
Zhang, X., Lai, J., Xu, D., Li, H. and Fu, M., “2D LiDAR-based SLAM and path planning for indoor rescue using mobile robots,” J. Adv. Transp. 2020, 1–14 (2020). doi: 10.1155/2020/8867937.
Du, S., Chen, T., Lou, Z. and Wu, Y., “A 2D-LiDAR-based localization method for indoor mobile robots using correlative scan matching,” Robotica 43(2), 1–28 (2024). doi: 10.1017/S026357472400198X.
Ismail, H., Roy, R., Sheu, L. J., Chieng, W. H. and Tang, L. C., “Exploration-based SLAM (e-SLAM) for the indoor mobile robot using LiDAR,” Sensors 22(4), 1689 (2022). doi: 10.3390/s22041689.
Roy, R., Tu, Y. P., Sheu, L. J., Chieng, W. H., Tang, L. C. and Ismail, H., “Path planning and motion control of indoor mobile robot under exploration-based SLAM (e-SLAM),” Sensors 23(7), 3606 (2023). doi: 10.3390/s23073606.
Ye, Q., Shi, P., Xu, K., Gui, P. and Zhang, S., “A novel loop closure detection approach using simplified structure for low-cost LiDAR,” Sensors (Switzerland) 20(8), 2299 (2020). doi: 10.3390/s20082299.
Huang, Y. H. and Lin, C. T., “Indoor localization method for a mobile robot using LiDAR and a dual AprilTag,” Electronics 12(4), 1023 (2023). doi: 10.3390/electronics12041023.
Xu, S., Chou, W. and Dong, H., “A robust indoor localization system integrating visual localization aided by CNN-based image retrieval with Monte Carlo localization,” Sensors (Switzerland) 19(2), 249 (2019). doi: 10.3390/s19020249.
Wang, C., Wang, J., Li, C., Ho, D., Cheng, J., Yan, T., Meng, L. and Meng, M. Q., “Safe and robust mobile robot navigation in uneven indoor environments,” Sensors (Switzerland) 19(13), 1–20 (2019). doi: 10.3390/s19132993.
Yang, Y., Liu, J., Wang, W., Cao, Y. and Li, H., “Incorporating SLAM and mobile sensing for indoor CO2 monitoring and source position estimation,” J. Clean. Prod. 291, 125780 (2021). doi: 10.1016/j.jclepro.2020.125780.
Chen, C., Zhu, H., Wang, L. and Liu, Y., “A stereo visual-inertial SLAM approach for indoor mobile robots in unknown environments without occlusions,” IEEE Access 7, 185408–185421 (2019). doi: 10.1109/ACCESS.2019.2961266.
Singh, K. J., Kapoor, D. S., Thakur, K., Sharma, A., Nayyar, A., Mahajan, S. and Shah, M. A., “Map making in social indoor environment through robot navigation using active SLAM,” IEEE Access 10, 134455–134465 (2022). doi: 10.1109/ACCESS.2022.3230989.
Chewu, C. C. E. and Kumar, V. M., “Autonomous navigation of a mobile robot in dynamic indoor environments using SLAM and reinforcement learning,” IOP Conf. Ser. Mater. Sci. Eng. 402(1), 012022 (2018). doi: 10.1088/1757-899X/402/1/012022.
Lin, J., Peng, J., Hu, Z., Xie, X. and Peng, R., “ORB-SLAM, IMU and wheel odometry fusion for indoor mobile robot localization and navigation,” Acad. J. Comput. Inf. Sci. 3(1), 131–141 (2020). doi: 10.25236/AJCIS.2020.030114.
Lee, G., Moon, B.-C., Lee, S. and Han, D., “Fusion of the SLAM with Wi-Fi-based positioning methods for mobile robot-based learning data collection, localization, and tracking in indoor spaces,” Sensors 20(18), 5182 (2020). doi: 10.3390/s20185182.
Yan, Y. P. and Wong, S. F., “A navigation algorithm of the mobile robot in the indoor and dynamic environment based on the PF-SLAM algorithm,” Cluster Comput. 22(s6), 14207–14218 (2019). doi: 10.1007/s10586-018-2271-3.
Shamseldin, T., Manerikar, A., Elbahnasawy, M. and Habib, A., “SLAM-based pseudo-GNSS/INS localization system for indoor LiDAR mobile mapping systems,” In: 2018 IEEE/ION Position, Location and Navigation Symposium (PLANS) (2018) pp. 197–208. doi: 10.1109/PLANS.2018.8373382.
Ren, J., Wu, T., Zhou, X., Yang, C., Sun, J., Li, M., Jiang, H. and Zhang, A., “SLAM, path planning algorithm and application research of an indoor substation wheeled robot navigation system,” Electronics 11(12), 1838 (2022). doi: 10.3390/electronics11121838.
Zhao, J., Liu, S. and Li, J., “Research and implementation of autonomous navigation for mobile robots based on SLAM algorithm under ROS,” Sensors 22(11), 4172 (2022). doi: 10.3390/s22114172.
Lin, H. Y. and Yeh, M. C., “Drift-free visual SLAM for mobile robot localization by integrating UWB technology,” IEEE Access 10, 93636–93645 (2022). doi: 10.1109/ACCESS.2022.3203438.
Dai, Y., “Research on robot positioning and navigation algorithm based on SLAM,” Wirel. Commun. Mob. Comput. 2022, 1–10 (2022). doi: 10.1155/2022/3340529.
Kim, P., Chen, J. and Cho, Y. K., “SLAM-driven robotic mapping and registration of 3D point clouds,” Autom. Constr. 89, 38–48 (2018). doi: 10.1016/j.autcon.2018.01.009.
Chen, Y., Tang, J., Jiang, C., Zhu, L., Lehtomäki, M., Kaartinen, H., Kaijaluoto, R., Wang, Y., Hyyppä, J., Hyyppä, H. and Zhou, H., “The accuracy comparison of three simultaneous localization and mapping (SLAM)-based indoor mapping technologies,” Sensors (Switzerland) 18(10), 3228 (2018). doi: 10.3390/s18103228.
An, Z., Hao, L., Liu, Y. and Dai, L., “Development of mobile robot SLAM based on ROS,” Int. J. Mech. Eng. Robot. Res. 5(1), 47–51 (2016). doi: 10.18178/ijmerr.5.1.47-51.
Liu, R., Zhang, J., Chen, S., Yang, T. and Arth, C., “Accurate real-time visual SLAM combining building models and GPS for mobile robot,” J. Real-Time Image Process. 18(2), 419–429 (2021). doi: 10.1007/s11554-020-00989-6.
Singh, N., Choe, S. and Punmiya, R., “Machine learning based indoor localization using Wi-Fi RSSI fingerprints: An overview,” IEEE Access 9, 127150–127174 (2021). doi: 10.1109/ACCESS.2021.3111083.
Upadhyay, J., Rawat, A., Deb, D., Muresan, V. and Unguresan, M. L., “An RSSI-based localization, path planning and computer vision-based decision making robotic system,” Electronics 9(8), 1–15 (2020). doi: 10.3390/electronics9081326.
Shu, M., Chen, G. and Zhang, Z., “3D point cloud-based indoor mobile robot in 6-DoF pose localization using a Wi-Fi-aided localization system,” IEEE Access 9, 38636–38648 (2021). doi: 10.1109/ACCESS.2021.3060760.
Ayyalasomayajula, R., Arun, A., Wu, C., Sharma, S., Sethi, A. R., Vasisht, D. and Bharadia, D., “Deep learning based wireless localization for indoor navigation,” In: Proc. Annu. Int. Conf. Mob. Comput. Netw. (MobiCom) (2020) pp. 214–227. doi: 10.1145/3372224.3380894.
Chan, P. Y., Chao, J. C. and Wu, R. B., “A Wi-Fi-based passive indoor positioning system via entropy-enhanced deployment of Wi-Fi sniffers,” Sensors 23(3), 1376 (2023). doi: 10.3390/s23031376.
Lin, X., Gan, J., Jiang, C., Xue, S. and Liang, Y., “Wi-Fi-based indoor localization and navigation: A robot-aided hybrid deep learning approach,” Sensors 23(14), 1–15 (2023). doi: 10.3390/s23146320.
Kharmeh, S. A., Natsheh, E., Sulaiman, B., Abuabiah, M. and Tarapiah, S., “Indoor WiFi-beacon dataset construction using autonomous low-cost robot for 3D location estimation,” Appl. Sci. 13(11), 6768 (2023). doi: 10.3390/app13116768.
Khanh, T. T., Nguyen, V. D., Pham, X. Q. and Huh, E. N., “Wi-Fi indoor positioning and navigation: A cloudlet-based cloud computing approach,” Hum.-Centric Comput. Inf. Sci. 10(1) (2020). doi: 10.1186/s13673-020-00236-8.
Retscher, G., Gikas, V., Hofer, H., Perakis, H. and Kealy, A., “Range validation of UWB and Wi-Fi for integrated indoor positioning,” Appl. Geomat. 11(2), 187–195 (2019). doi: 10.1007/s12518-018-00252-5.
Hashemifar, Z. S., Adhivarahan, C., Balakrishnan, A. and Dantu, K., “Augmenting visual SLAM with Wi-Fi sensing for indoor applications,” Auton. Robots 43(8), 2245–2260 (2019). doi: 10.1007/s10514-019-09874-z.
Yu, C., Lan, H., Gu, F., Yu, F. and El-Sheimy, N., “A map/INS/Wi-Fi integrated system for indoor location-based service applications,” Sensors (Switzerland) 17(6), 1272 (2017). doi: 10.3390/s17061272.
Tan, J., Fan, X., Wang, S. and Ren, Y., “Optimization-based Wi-Fi radio map construction for indoor positioning using only smart phones,” Sensors (Switzerland) 18(9), 3095 (2018). doi: 10.3390/s18093095.
Cui, W., Liu, Q., Zhang, L., Wang, H., Lu, X. and Li, J., “A robust mobile robot indoor positioning system based on Wi-Fi,” Int. J. Adv. Robot. Syst. 17(1), 1–10 (2020). doi: 10.1177/1729881419896660.
Haxhibeqiri, J., Jarchlo, E. A., Moerman, I. and Hoebeke, J., “Flexible Wi-Fi communication among mobile robots in indoor industrial environments,” Mob. Inf. Syst. 2018, 1–19 (2018). doi: 10.1155/2018/3918302.
Amanatiadis, A., “A multisensor indoor localization system for biped robots operating in industrial environments,” IEEE Trans. Ind. Electron. 63(12), 7597–7606 (2016). doi: 10.1109/TIE.2016.2590380.
de Blasio, G., Quesada-Arencibia, A., García, C. R., Molina-Gil, J. M. and Caballero-Gil, C., “Study on an indoor positioning system for harsh environments based on Wi-Fi and Bluetooth low energy,” Sensors (Switzerland) 17(6), 1299 (2017). doi: 10.3390/s17061299.
Sarcevic, P., Csik, D. and Odry, A., “Indoor 2D positioning method for mobile robots based on the fusion of RSSI and magnetometer fingerprints,” Sensors 23(4), 1855 (2023). doi: 10.3390/s23041855.
Motroni, A., Buffi, A. and Nepa, P., “A survey on indoor vehicle localization through RFID technology,” IEEE Access 9, 17921–17942 (2021). doi: 10.1109/ACCESS.2021.3052316.
Oguntala, G., Abd-Alhameed, R., Jones, S., Noras, J., Patwary, M. and Rodriguez, J., “Indoor location identification technologies for real-time IoT-based applications: An inclusive survey,” Comput. Sci. Rev. 30, 55–79 (2018). doi: 10.1016/j.cosrev.2018.09.001.
Álvarez-Aparicio, C., Guerrero-Higueras, Á.M., Rodríguez-Lera, F. J., Clavero, J. G., Rico, F. M. and Matellán, V., “People detection and tracking using LIDAR sensors,” Robotics 8(3), 1–12 (2019). doi: 10.3390/robotics8030075.
Wu, H., Tao, B., Gong, Z., Yin, Z. and Ding, H., “A standalone RFID-based mobile robot navigation method using single passive tag,” IEEE Trans. Autom. Sci. Eng. 18(4), 1529–1537 (2021). doi: 10.1109/TASE.2020.3008187.
Kammel, C., Kogel, T., Gareis, M. and Vossiek, M., “A cost-efficient hybrid UHF RFID and odometry-based mobile robot self-localization technique with centimeter precision,” IEEE J. Radio Freq. Ident. 6, 467–480 (2022). doi: 10.1109/JRFID.2022.3186852.
Shangguan, L. and Jamieson, K., “The design and implementation of a mobile RFID tag sorting robot,” In: Proc. 14th Annu. Int. Conf. Mob. Syst. Appl. Serv. (MobiSys) (2016) pp. 31–42. doi: 10.1145/2906388.2906417.
Digiampaolo, E. and Martinelli, F., “A robotic system for localization of passive UHF-RFID tagged objects on shelves,” IEEE Sens. J. 18(20), 8558–8568 (2018). doi: 10.1109/JSEN.2018.2865339.
Seco, F. and Jiménez, A. R., “Smartphone-based cooperative indoor localization with RFID technology,” Sensors (Switzerland) 18(1), 266 (2018). doi: 10.3390/s18010266.
Mi, J. and Takahashi, Y., “An design of HF-band RFID system with multiple readers and passive tags for indoor mobile robot self-localization,” Sensors (Switzerland) 16(8), 1–20 (2016). doi: 10.3390/s16081200.
Ye, H. and Peng, J., “Robot indoor positioning and navigation based on improved Wi-Fi location fingerprint positioning algorithm,” Wirel. Commun. Mob. Comput. 2022, 1–13 (2022). doi: 10.1155/2022/8274455.
Da Mota, F. A. X., Rocha, M. X., Rodrigues, J. J. P. C., De Albuquerque, V. H. C. and De Alexandria, A. R., “Localization and navigation for autonomous mobile robots using petri nets in indoor environments,” IEEE Access 6, 31665–31676 (2018). doi: 10.1109/ACCESS.2018.2846554.
A., M., Yasuno, T., Suzuki, H., H., I. and M., S., “Indoor navigation system based on passive RFID transponder with digital compass for visually impaired people,” Int. J. Adv. Comput. Sci. Appl. 7(2), 604–611 (2016). doi: 10.14569/ijacsa.2016.070276.
Demiral, E., Karas, A. R., Karakaya, Y. and Kozlenko, M., “Design of indoor robot prototype guided by RFID based positioning and navigation system,” Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 46(4/W5-2021), 175–180 (2021). doi: 10.5194/isprs-Archives-XLVI-4-W5-2021-175-2021.
Alarifi, A., Al-Salman, A., Alsaleh, M., Alnafessah, A., Al-Hadhrami, S., Al-Ammar, M. A. and Al-Khalifa, H. S., “Ultra wideband indoor positioning technologies: Analysis and recent advances,” Sensors (Switzerland) 16(5), 1–36 (2016). doi: 10.3390/s16050707.
Gu, Y. and Ren, F., “Energy-efficient indoor localization of smart hand-held devices using Bluetooth,” IEEE Access 3, 1450–1461 (2015). doi: 10.1109/ACCESS.2015.2441694.
Kok, M., Hol, J. D. and Schon, T. B., “Indoor positioning using ultrawideband and inertial measurements,” IEEE Trans. Veh. Technol. 64(4), 1293–1303 (2015). doi: 10.1109/TVT.2015.2396640.
Yao, L., Wu, Y. W. A., Yao, L. and Liao, Z. Z., “An integrated IMU and UWB sensor based indoor positioning system,” In: 2017 International Conference on Indoor Positioning and Indoor Navigation (IPIN) (2017) pp. 1–8. doi: 10.1109/IPIN.2017.8115911.
Weinmann, K. and Simske, S., “Design of Bluetooth 5.1 angle of arrival homing controller for autonomous mobile robot,” Robotics 12(4), 115 (2023). doi: 10.3390/robotics12040115.
Naheem, K., Elsharkawy, A., Koo, D., Lee, Y. and Kim, M., “A UWB-based lighter-than-air indoor robot for user-centered interactive applications,” Sensors 22(6), 2093 (2022). doi: 10.3390/s22062093.
Yadav, P. and Sharma, S. C., “A systematic review of localization in WSN: Machine learning and optimization-based approaches,” Int. J. Commun. Syst. 36(4), e5397 (2023). doi: 10.1002/dac.5397.
Khodarahmi, M. and Maihami, V., “A review on Kalman filter models,” Arch. Comput. Methods Eng. 30(1), 727–747 (2023). doi: 10.1007/s11831-022-09815-7.
Tai, J., Liu, X., Wang, X., Shan, Y. and He, T., “An adaptive localization method of simultaneous two acoustic emission sources based on energy filtering algorithm for coupled array signal,” Mech. Syst. Signal Process. 154, 107557 (2021). doi: 10.1016/j.ymssp.2020.107557.
CISA, SAFECOM, and NCSWIC, “Radio Frequency Interference Best Practices Guidebook,” Public Safety Commun. – RF Interf., 1–12 (February 2020).
Zhang, Z., Lee, M. and Choi, S., “Deep-learning-based Wi-Fi indoor positioning system using continuous CSI of trajectories,” Sensors 21(17), 5776 (2021). doi: 10.3390/s21175776.
Chen, K. M. and Chang, R. Y., “A Comparative Study of Deep-Learning-Based Semi-Supervised Device-Free Indoor Localization,” In: 2021 IEEE Global Communications Conference (GLOBECOM) (2021) pp. 1–6. doi: 10.1109/GLOBECOM46510.2021.9685548.
Lutakamale, A. S., Myburgh, H. C. and de Freitas, A., “A hybrid convolutional neural network-transformer method for received signal strength indicator fingerprinting localization in long range wide area network,” Eng. Appl. Artif. Intell. 133, 108349 (2024). doi: 10.1016/j.engappai.2024.108349.
Santoro, L., Nardello, M., Brunelli, D. and Fontanelli, D., “UWB-based indoor positioning system with infinite scalability,” IEEE Trans. Instrum. Meas. 72, 1–11 (2023). doi: 10.1109/TIM.2023.3282299.
Asaad, S. M. and Maghdid, H. S., “A comprehensive review of indoor/outdoor localization solutions in IoT era: Research challenges and future perspectives,” Comput. Netw. 212, 109041 (2022). doi: 10.1016/j.comnet.2022.109041.
Hayward, S. J., van Lopik, K., Hinde, C. and West, A. A., “A survey of indoor location technologies, techniques and applications in industry,” Internet of Things 20, 100608 (2022). doi: 10.1016/j.iot.2022.100608.
Brunello, A., Montanari, A. and Saccomanno, N., “A framework for indoor positioning including building topology,” IEEE Access 10, 114959–114974 (2022). doi: 10.1109/ACCESS.2022.3218301.
Merveille, F. F. R., Jia, B., Xu, Z. and Fred, B., “Advancements in sensor fusion for underwater SLAM: A review on enhanced navigation and environmental perception,” Sensors 24(23), 7490 (2024). doi: 10.3390/s24237490.
Ullah, I., Adhikari, D., Khan, H., Ahmad, S., Esposito, C. and Choi, C., “Optimizing Mobile Robot Localization: Drones-Enhanced Sensor Fusion with Innovative Wireless Communication,” In: IEEE INFOCOM 2024 - IEEE Conference on Computer Communications Workshops (INFOCOM WKSHPS) (2024) pp. 1–6. doi: 10.1109/INFOCOMWKSHPS61880.2024.10620739.
Granig, W., Faller, L. M., Hammerschmidt, D. and Zangl, H., “Dependability considerations of redundant sensor systems,” Reliab. Eng. Syst. Saf. 190, 106522 (2019). doi: 10.1016/j.ress.2019.106522.
Zhou, Z., Dai, W., Eggert, J., Giger, J. T., Keller, J., Rantz, M. and He, Z., “A real-time system for in-home activity monitoring of elders,” In: Annu. Int. Conf. IEEE Eng. Med. Biol. Soc. (2009) pp. 6115–6118. doi: 10.1109/IEMBS.2009.5334915.
Woolrych, R., “Challenges and opportunities of using surveillance technologies in residential care,” Innov. Aging 1(suppl_1), 725 (2017). doi: 10.1093/geroni/igx004.2607.
Conway, M., “Online extremism and terrorism research ethics: Researcher safety, informed consent, and the need for tailored guidelines,” Terror. Polit. Violence 33(2), 367–380 (2021). doi: 10.1080/09546553.2021.1880235.
Setty, E., Hunt, J. and Ringrose, J., “Surveillance or support? Policing harmful sexual behaviour among young people in schools,” Child. Soc. doi: 10.1111/chso.12960.
Zakerabasali, S. and Ayyoubzadeh, S. M., “Internet of things and healthcare system: A systematic review of ethical issues,” Health Sci. Rep. 5(6), e863 (2022). doi: 10.1002/hsr2.863.
Biondi, G., Cagnoni, S., Capobianco, R., Franzoni, V., Lisi, F. A., Milani, A. and Vallverdú, J., “Editorial: Ethical design of artificial intelligence-based systems for decision making,” Front. Artif. Intell. 6, 1250209 (2023). doi: 10.3389/frai.2023.1250209.
Arikumar, K. S., Prathiba, S. B., Alazab, M., Gadekallu, T. R., Pandya, S., Khan, J. M. and Moorthy, R. S., “FL-PMI: federated learning-based person movement identification through wearable devices in smart healthcare systems,” Sensors 22(4), 1377 (2022). doi: 10.3390/s22041377.
Gokulakrishnan, S., Jarwar, M. A., Ali, M. H., Kamruzzaman, M. M., Meenakshisundaram, I., Jaber, M. M. and Kumar, R. L., “Maliciously roaming person’s detection around hospital surface using intelligent cloud-edge based federated learning,” J. Comb. Optim. 45(1), 13 (2022). doi: 10.1007/s10878-022-00939-x.
Mohammed, M. A., “Federated learning-driven IoT and edge cloud networks for smart wheelchair systems in assistive robotics,” Iraqi J. Comput. Sci. Math. 6(1), 9 (2025). doi: 10.52866/2788-7421.1241.
Bharati, S., Mondal, M. R. H., Podder, P. and Prasath, V. B. S., “Federated learning: Applications, challenges and future directions,” Int. J. Hybrid Intell. Syst. 18(1-2), 19–35 (2022). doi: 10.3233/HIS-220006.
Zhu, C., Zhu, X., Ren, J. and Qin, T., “Blockchain-enabled federated learning for UAV edge computing network: Issues and solutions,” IEEE Access 10, 56591–56610 (2022). doi: 10.1109/ACCESS.2022.3174865.
Truong, N., Sun, K., Wang, S., Guitton, F. and Guo, Y. K., “Privacy preservation in federated learning: An insightful survey from the GDPR perspective,” Comput. Secur. 110, 102402 (2021). doi: 10.1016/j.cose.2021.102402.
Ali, B. S., Ullah, I., Al Shloul, T., Khan, I. A., Khan, I., Ghadi, Y. Y., Abdusalomov, A., Nasimov, R., Ouahada, K. and Hamam, H., “ICS-IDS: Application of big data analysis in AI-based intrusion detection systems to identify cyberattacks in ICS networks,” J. Supercomput. 80(6), 7876–7905 (2024). doi: 10.1007/s11227-023-05764-5.
Ullah, I., Adhikari, D., Su, X., Palmieri, F., Wu, C. and Choi, C., “Integration of data science with the intelligent IoT (IIoT): Current challenges and future perspectives,” Digit. Commun. Netw. 11(2), 280–298 (2024). doi: 10.1016/j.dcan.2024.02.007.
Ullah, I., Noor, A., Nazir, S., Ali, F., Ghadi, Y. Y. and Aslam, N., “Protecting IoT devices from security attacks using effective decision-making strategy of appropriate features,” J. Supercomput. 80(5), 5870–5899 (2024). doi: 10.1007/s11227-023-05685-3.
Stavroulakis, P. and Stamp, M., Handbook of Information and Communication Security (Springer, 2010). doi: 10.1007/978-3-642-04117-4.
Table I. Summary of standard performance metrics for IPS [44–47].

Figure 1. (A) The development of infrastructure-less navigation for healthcare logistics, taken from ref. [52], with the permission of IEEE. (B) Binocular vision and IMU-based system for GPS-denied environments, taken from ref. [53], Copyright Sage Publications. (C) Indoor mobile robots for navigation positioning, replicated from ref. [54], Copyright Sage Publications. (D) IMU system-based indoor robots for infrastructure-independent localization, taken from ref. [55], Copyright Elsevier. (E) Low- and medium-cost IMUs for automated guided vehicles for cost-effective navigation in industrial applications, taken from ref. [56], Copyright Elsevier. (F) IMU-based system for trajectories in GPS-denied environments, taken from ref. [57], Copyright MDPI.

Table II. Comparative analysis of IMU-based localization and application case studies.

Figure 2. (A) VLP system for mobile robots in dynamic indoor environments, taken from ref. [65], Copyright Hindawi. (B) VLC-based localization system for indoor navigation, taken from ref. [66], Copyright arXiv. (C) Two-layer fusion network spanning industrial automation and smart buildings, taken from ref. [68], Copyright IEEE. (D) VLC-based autonomous delivery robot to improve hospital safety and navigation, taken from ref. [69], Copyright IEEE. (E) VLC-based positioning system for mobile robots in nuclear power plants, taken from ref. [70], Copyright arXiv.

Table III. Comprehensive analysis of VLC-based indoor robotics systems.

Figure 3. (A) Indoor localization system using flickering infrared LEDs and bio-inspired sensors suitable for GPS-denied environments like indoor robotic applications, taken from ref. [77], Copyright MDPI. (B) Swarm of autonomous robots for complex indoor settings, taken from ref. [78], Copyright MDPI. (C) Mobile robotics based on ultrasonic and UWB technologies for indoor localization, taken from ref. [79], Copyright MDPI. (D) High-accuracy ultrasonic indoor positioning system (UIPS) based on wireless sensor networks, taken from ref. [80], Copyright MDPI.

Table IV. Comparative study of indoor localization systems based on IR.

Figure 4. (A) LiDAR-based SLAM system for autonomous robots, taken from ref. [91], Copyright Frontiers. (B) LiDAR-based robust pose estimation in clean and perturbed environments, taken from ref. [92], Copyright MDPI. (C) Self-adaptive Monte Carlo localization algorithm tailored for smart automated guided vehicle position tracking and kidnapping scenarios, taken from ref. [93], Copyright Elsevier. (D) LiDAR localization method leveraging multi-sensing data from IMU, odometry, and 3D LiDAR for complex indoor spaces, taken from ref. [94], Copyright MDPI. (E) LiDAR and IMU integration for UAV indoor navigation, taken from ref. [95], Copyright MDPI.

Table V. LiDAR research overview.

Table VI. Comparative analysis of LiDAR, VLC, and IR systems [28, 64–90].

Figure 5. (A) SLAM framework that relies exclusively on LiDAR sensors for indoor mobile robot navigation, taken from ref. [100], Copyright MDPI. (B) Indoor environmental monitoring, taken from ref. [105], Copyright Elsevier. (C) STCM-SLAM for precise pose estimation, taken from ref. [106], Copyright IEEE. (D) SLAM-based navigation systems for environments populated with humans, taken from ref. [107], Copyright IEEE. (E) SLAM-based 3D OctoMap navigation system for complex 3D environments, taken from ref. [104], Copyright MDPI.

Table VII. Summary of key details including focus, strengths, limitations, key techniques, applications, and overarching trends.

Table VIII. Comparative performance analysis of various positioning technologies [28, 39, 40, 48].

Figure 6. (A) A multimodal approach combining 3D point clouds and Wi-Fi signals to achieve pose estimation for mobile robots, taken from ref. [123], Copyright IEEE. (B) Deep learning-based system that pairs neural networks with MapFind, an autonomous mapping platform, taken from ref. [124]. (C) Wi-Fi-based indoor positioning system, taken from ref. [125], Copyright MDPI. (D) Wi-Fi RSSI-based indoor robots for obstacle-rich environments, taken from ref. [126], Copyright MDPI. (E) 3D Wi-Fi localization using low-cost robots for large-scale deployments, taken from ref. [127], Copyright MDPI.

Table IX. Comparative analysis of Wi-Fi-based indoor localization techniques.

Figure 7. (A) RFID-guided robot prototype for structured environments, taken from ref. [140]. (B) RFID-based standalone navigation method, taken from ref. [141], Copyright IEEE. (C) RFID and odometry for centimeter-level localization robustness in warehouse environments, taken from ref. [142], Copyright IEEE. (D) RFID-tagged items in dense environments, taken from ref. [143]. (E) RFID-based indoor robot for detecting items localized on shelves, taken from ref. [144], Copyright IEEE.

Table X. Comparison of RFID-related research papers.

Table XI. Comparative analysis of RF-based IPS methods [27, 29, 42, 155].

Table XII. Deep learning models for signal correction and multipath mitigation [161, 162].

Figure 8. Challenges and future directions.