Article

A Performance Comparison between Different Industrial Real-Time Indoor Localization Systems for Mobile Platforms

1. Institute for Systems and Computer Engineering, Technology and Science (INESC TEC), 4200-465 Porto, Portugal
2. School of Sciences and Technology-Engineering Department (UTAD), 5000-801 Vila Real, Portugal
3. CeDRI, SusTEC, Instituto Politécnico de Bragança, Campus Sta Apolónia, 5300-253 Bragança, Portugal
4. Institute of Electronics and Informatics Engineering of Aveiro (IEETA), University of Aveiro, 3810-193 Aveiro, Portugal
5. Intelligent Systems Associate Laboratory (LASI), University of Minho, 4800-058 Guimarães, Portugal
6. Faculty of Engineering, University of Porto (FEUP), 4200-465 Porto, Portugal
* Author to whom correspondence should be addressed.
Sensors 2024, 24(7), 2095; https://doi.org/10.3390/s24072095
Submission received: 15 January 2024 / Revised: 11 March 2024 / Accepted: 15 March 2024 / Published: 25 March 2024
(This article belongs to the Collection Sensors and Systems for Indoor Positioning)

Abstract:
The flexibility and versatility associated with autonomous mobile robots (AMR) have facilitated their integration into different types of industries and tasks. However, as the main objective of their implementation on the factory floor is to optimize processes and, consequently, the time associated with them, it is necessary to take into account the environment and congestion to which they are subjected. Localization, on the shop floor and in real time, is an important requirement for optimizing the AMRs' trajectory management, thus avoiding livelocks and deadlocks during their movements alongside manual forklift operators and logistic trains. Three of the most commonly used localization techniques in indoor environments (time of flight, angle of arrival, and time difference of arrival), as well as two of the most commonly used indoor localization methods in industry (ultra-wideband and ultrasound), are presented and compared in this paper. Furthermore, the paper identifies and compares three industrial indoor localization solutions, Qorvo, Eliko Kio, and Marvelmind, implemented on an industrial mobile platform, which is its main contribution. These solutions can be applied both to AMRs and to other mobile platforms, such as forklifts and logistic trains. In terms of results, the Marvelmind system, which uses an ultrasound method, was the best solution.

1. Introduction

To be more competitive, flexible, and productive, companies are nowadays remodeling and investing in their factory floors. The industry is experiencing a new era of rapid technological development, so the integration of mobile platforms into industrial processes and tasks is increasingly common. Currently, many companies are focused on developing industrial systems that are fully automated and more flexible [1]. This flexibility makes such systems suitable for use in different industrial stages or environments.
Autonomous guided vehicles (AGV) are mobile platforms normally used to transport materials between workstations or warehouses without physical guides such as magnetic lines. Their increasing use on shop floors is related to their robustness and flexibility [2], contributing to more efficient and effective production processes.
In modern industries, AMR systems are a very attractive solution to increase the level of automation in factory logistics [3], so the use of them has become widespread in the last few decades.
The latest and most modern industries have integrated a hybrid mix of mobile platform types on their factory floors. Manual platforms, such as logistic trains and forklift trucks, continue to perform their tasks, but with the help of autonomous mobile platforms, the well-known mobile robots. For good interaction between the two, especially in path management, all these platforms must know their position and orientation so that movement is safe and smooth [4]; providing this knowledge is the main goal of localization systems.
Indoor localization is a key technology for mobile platforms (MP) [5], such as AMR, as it enables the platform to determine its position and orientation within an indoor environment. There are several approaches to indoor localization, each with its own set of advantages and disadvantages.
Another important point is that the characteristics of each industrial environment must be taken into consideration to obtain an efficient robot localization system. A common solution is to use more than one localization technology and apply a sensor fusion algorithm.
The exchange of information between localization systems in real time, and consequently the exchange of maps and trajectories, allows the robot to achieve greater stability in estimating its pose, as well as smoother movement on the factory floor. These AMR pose data are also very important for the robot fleet management algorithm.
This study is a module of a project aiming to integrate autonomous mobile robots (AMR) within logistical trains. To effectively plan the routes for the AMR, it is imperative to have accurate information about the positions of the logistic trains.
Considering the previous assumptions, this paper discusses the integration, in an industrial environment, and the comparison of different indoor localization systems for mobile platforms. The main goal is the selection of a real-time location system to be implemented in all mobile platforms of an industrial production line. Detecting the real location of each vehicle will allow the AMR fleet management software to plan the AMRs' paths more efficiently and accurately, reducing conflicts between different mobile platforms on the shop floor (AMRs, forklifts, logistic trains, and others).
In Section 2, the state-of-the-art indoor localization systems, technologies, and techniques are presented. Section 3 describes the most popular localization technologies and techniques usually implemented in indoor environments for locating objects and/or persons. Section 4 presents the systematic framework employed to address the research objectives. Section 5 presents the comparison results achieved with different indoor localization systems, in an industrial scenario and with an AMR. In this specific section, it is possible to compare and analyze the results obtained from the different industrial localization systems and their comparison with ground truth. Finally, some conclusions and the contribution of this research are presented in Section 6.
This paper stands out from the vast majority of papers in the literature on indoor localization, as it compares three industrial systems on the market in a quantitative way, with tests carried out in a real environment, making it easier to choose for future integration.

2. Related Work

Indoor localization refers to the process of determining the location of a device or a person inside a building or an enclosed space [6]. It is an important technology that has numerous applications in various industries and sectors, including retail, healthcare [7], transportation [8], industrial automation [9], public safety [10], and entertainment [11].
The importance of indoor localization lies in the fact that it enables businesses and organizations to better understand and optimize the movement and behavior of people and assets within their premises [12]. For example, indoor localization can help a retailer track customer movement and engagement in its store, a healthcare facility monitor the location and status of its medical equipment and staff, a transportation company optimize the delivery of packages and goods [13], and a factory automate [14] and monitor the production process [15].
There are different types of indoor localization systems, each using various technologies and methods to determine the location of a device or a person. Some common technologies used in indoor localization include radio frequency (RF) signals, wireless technologies (e.g., WiFi, Bluetooth), ultrasonic (US) signals, and infrared (IR) signals. Geometric methods [16] such as trilateration and triangulation are often used to calculate the position of a device based on the distance to multiple reference points [17]. Probabilistic methods such as Kalman filters and particle filters are also used to estimate the location based on statistical models and sensor data [14,18].
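To illustrate how such probabilistic methods operate, the following minimal Python sketch (with hypothetical values) shows a one-dimensional Kalman filter measurement update fusing noisy position readings into a running estimate:

```python
# Minimal sketch of a probabilistic position update (1D Kalman filter).
# All numeric values here are hypothetical.

def kalman_update(x_est, p_est, z, r):
    """Fuse a noisy position measurement z (variance r) into the
    current estimate x_est (variance p_est)."""
    k = p_est / (p_est + r)          # Kalman gain: trust placed in z
    x_new = x_est + k * (z - x_est)  # corrected position estimate
    p_new = (1.0 - k) * p_est        # reduced uncertainty after fusion
    return x_new, p_new

x, p = 0.0, 1.0                      # initial position estimate and variance
for z in (0.9, 1.1, 1.0):            # noisy range-derived positions (meters)
    x, p = kalman_update(x, p, z, r=0.25)
print(f"fused position: {x:.3f} m, variance: {p:.3f}")
```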
Regarding AMRs, several technologies are currently applied in industry for the localization of autonomous mobile platforms, such as the indoor/outdoor Global Positioning System (GPS) [19,20], 2D/3D sensors, vision systems, and wireless technologies like radio frequency identification (RFID) tags [21,22] or barcodes [23,24,25]. These technologies have different characteristics, namely in accuracy, so the purpose of each robot must be taken into account.
In recent years, machine learning (ML) techniques [26,27,28,29] have also been applied to indoor localization to improve the accuracy and adaptability of the systems.
Concerning the transportation and logistics sectors, indoor localization systems can be used in warehouses, distribution centers, and other transportation and logistics environments to track the movement and status of packages [30], vehicles, and personnel. This can improve the efficiency and accuracy of package delivery [31] and inventory management [32], as well as reducing the risk of accidents and errors. For example, an indoor localization system can help a warehouse worker locate a specific package or pallet more quickly, or alert a driver when they are approaching a restricted area.
In industrial automation and smart factories, this type of localization system can be used in factories and other industrial environments to automate and monitor the production process. This can improve the efficiency, quality, and safety of manufacturing operations [33], as well as enable the use of advanced technologies such as robotics [34] and the Internet of Things (IoT) [35]. For example, an indoor localization system can be used to track the location and status of manufacturing equipment and materials, or to guide autonomous vehicles and robots through the factory [18].
One common approach is to use a fixed infrastructure, such as a network of stationary beacons or sensors [36,37], to determine the platform’s position. The platform can use these beacons or sensors to triangulate its position based on the strength of the signals it receives from each beacon [17]. This approach is relatively simple and accurate, but it requires the installation and maintenance of a fixed infrastructure, which may not be practical in all situations [36].
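As a minimal sketch of this idea, the snippet below (with hypothetical beacon positions and measured ranges) linearizes the range equations and solves them by least squares, the usual trilateration formulation:

```python
import numpy as np

# Hypothetical beacon positions (x, y) in meters and measured ranges to them.
beacons = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 8.0]])
ranges = np.array([5.00, 6.71, 6.40])  # consistent with a platform near (4, 3)

# Subtracting the first circle equation from the others linearizes the system:
# 2(x_i - x_0)x + 2(y_i - y_0)y = r_0^2 - r_i^2 + (x_i^2 + y_i^2) - (x_0^2 + y_0^2)
A = 2.0 * (beacons[1:] - beacons[0])
b = (ranges[0] ** 2 - ranges[1:] ** 2
     + np.sum(beacons[1:] ** 2, axis=1) - np.sum(beacons[0] ** 2))
position, *_ = np.linalg.lstsq(A, b, rcond=None)
print(position)  # least-squares (x, y) of the platform, ~[4.0, 3.0]
```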
Another approach is to use computer vision techniques [38] to localize the robot or other platform. This can involve using visual features such as roofs [39], corners [40], edges, textures, or fiducial markers [41] in the environment to determine the robot’s position and orientation. This approach is generally more flexible and can work in a variety of environments, but it may be less accurate than other methods, particularly in cluttered or poorly lit environments.
Another option is to use Inertial Measurement Units (IMUs) to determine the platform’s position and orientation [42]. IMUs are sensors that measure acceleration and angular velocity and can be used in mobile robotics to track platform movement over time when integrated with other robot localization systems. This approach is relatively simple and can work in a variety of environments, but it may be prone to drift over time when used standalone, leading to errors in the platform’s position estimates [43].
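The sketch below (with hypothetical bias and noise values) illustrates why standalone IMU dead reckoning drifts: a small constant accelerometer error is integrated twice, so the position error grows without bound even when the platform stands still:

```python
import numpy as np

rng = np.random.default_rng(0)
dt, steps = 0.01, 1000            # 10 s of samples at 100 Hz
true_accel = np.zeros(steps)      # the platform is actually stationary
bias, noise = 0.02, 0.05          # hypothetical accelerometer bias/noise (m/s^2)

v, x = 0.0, 0.0
for a in true_accel + bias + noise * rng.standard_normal(steps):
    v += a * dt                   # integrate acceleration -> velocity
    x += v * dt                   # integrate velocity -> position
print(f"position drift after 10 s: {x:.2f} m")  # grows ~quadratically with time
```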
Indoor Global Positioning System (GPS) systems are specialized versions of the GPS and are designed to work in indoor environments, where traditional GPS signals may be weak or unavailable [44]. These systems use a combination of technologies, such as Wi-Fi, Bluetooth, or ultra-wideband (UWB), to determine the location of a device or platform within an indoor space [45]. These systems are generally more accurate than a traditional GPS when used in indoor environments [46], but their accuracy can vary depending on the specific technologies and infrastructure used.
Finally, some indoor localization systems use a combination of these approaches, combining the strengths of different methods to achieve the best possible accuracy and flexibility [45]. For example, a robot may use a fusion algorithm to improve its accuracy and robustness in different environments [47]. However, all these types of localization systems, techniques, and methods are always subject to propagation problems and reflections of the signal itself, culminating in delays in its detection and subsequent errors in locating the object or person. In the chapter on data analysis, Section 5, it will be possible to validate these phenomena in the different indoor localization systems tested.
To the authors’ knowledge, so far, there is no article with the practical and implemented comparison of localization systems as highlighted in this article.

3. Localization Techniques and Methods

This section is divided into two subsections. The first discusses some of the most commonly used methodologies for obtaining the position of a given object or person, in different environments and in real time. The second subsection lists, describes, and characterizes some of the technologies that can be found in localization systems.

3.1. Localization Techniques

Several different methods can be used for indoor localization, each with its strengths and limitations. These methods can be broadly classified into three categories: Trilateration, Fingerprinting, and Dead Reckoning. This subsection will present three methods—time of flight (ToF), angle of arrival (AoA), and time difference of arrival (TDoA)—of the trilateration category.

3.1.1. Time of Flight (ToF)

ToF, or time of arrival (ToA), is a method for measuring the distance between two radio transceivers [48]. It uses the signal propagation time ($\Delta t$) between the transmitter ($T_x$) and the receiver ($R_x$) to determine the distance between them. The ToF value multiplied by the signal velocity ($v$) gives the physical distance ($D_{ij}$) between $T_x$ and $R_x$ (see Equation (1)).

$$D_{ij} = (t_2 - t_1) \times v = \Delta t \times v \tag{1}$$

where $t_1$ is the time at which $T_x$, in pose $i$, sends a message to $R_x$, in pose $j$. The latter receives the signal at $t_2 = t_1 + \Delta t$ ($\Delta t$ being the time taken by the signal between $T_x$ and $R_x$). The distance between $i$ and $j$, $D_{ij}$, can thus be calculated by Equation (1), where $v$ represents the speed of the signal.
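A minimal Python sketch of Equation (1), assuming an RF signal (such as UWB) traveling at the speed of light:

```python
# Minimal sketch of Equation (1): distance from signal propagation time.
SPEED_OF_LIGHT = 299_792_458.0     # m/s, for an RF (e.g., UWB) signal

def tof_distance(t1: float, t2: float, v: float = SPEED_OF_LIGHT) -> float:
    """D_ij = (t2 - t1) * v, with t1 and t2 read from synchronized clocks."""
    return (t2 - t1) * v

# A propagation time of 16.7 ns corresponds to roughly 5 m.
print(tof_distance(0.0, 16.7e-9))  # ~5.007 m
```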
The principal requirement of the ToF method is synchronization between transmitters and receivers. The signal bandwidth and the sampling rate affect the system accuracy, where a low sampling rate (in time) reduces the ToF resolution. In industrial indoor environments, this type of method may suffer significant localization errors caused by obstacles that deflect the signals emitted from the transmitter to the receiver.

3.1.2. Angle of Arrival (AoA)

Using multiple receiver antennas [49], more commonly known as antenna arrays, it is possible to estimate the angle at which the transmitted signal impinges on the receivers; see Figure 1.
The AoA approach uses this angle $\alpha_i$ and the antenna positions $(x_i, y_i)$, which are known in advance, to estimate and determine the two-dimensional (2D) $(x, y)$ or three-dimensional (3D) $(x, y, z)$ position of a transmitter. These data can be used for tracking or navigation purposes. Equation (2) represents the generic principle for obtaining the object position with the AoA method:

$$x = d_i \times \cos(\alpha_i) + x_i, \qquad y = d_i \times \sin(\alpha_i) + y_i \tag{2}$$

where $d_i$ is the distance from the transmitter to antenna $i$, and $i$ is the antenna number (1, 2, or 3).
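A minimal sketch of Equation (2), with a hypothetical antenna pose, measured angle, and range:

```python
import math

# Minimal sketch of Equation (2): transmitter position from one antenna's
# known position (x_i, y_i), measured angle alpha_i, and range d_i.
def aoa_position(x_i, y_i, alpha_i, d_i):
    x = d_i * math.cos(alpha_i) + x_i
    y = d_i * math.sin(alpha_i) + y_i
    return x, y

# Hypothetical antenna at (1, 2) seeing the tag at 30 degrees, 4 m away.
print(aoa_position(1.0, 2.0, math.radians(30.0), 4.0))  # ~(4.46, 4.00)
```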
Two well-known drawbacks of AoA are that the accuracy of the transmitter's estimated position deteriorates as the distance between the transmitter and receiver increases, and that the hardware is much more expensive and complex than in other techniques.

3.1.3. Time Difference of Arrival (TDoA)

This method measures the difference in ToA at two or more different sensors; in other words, it exploits the relative position of a mobile transmitter based on the different signal propagation times between the transmitter and the multiple receivers. Calculating the exact location of a transmitter requires at least three receivers and strict synchronization between them [50]. Unlike ToF techniques, where synchronization is needed between the transmitter and the receiver, TDoA only requires synchronization between the receivers. The signal bandwidth, the sampling rate, and a nondirect line of sight between the transmitter and the receivers will affect the accuracy of the system.
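As an illustrative sketch (assuming SciPy is available), TDoA positioning can be posed as a nonlinear least-squares problem over the hyperbolic constraints; receiver positions and measurements below are hypothetical:

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical synchronized receivers (x, y) in meters.
receivers = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 8.0]])
true_pos = np.array([4.0, 3.0])

# Simulated TDoA measurements, expressed as range differences:
# (arrival time at receiver i - arrival time at receiver 0) * signal speed.
d = np.linalg.norm(receivers - true_pos, axis=1)
range_diffs = d[1:] - d[0]

def residuals(p):
    # Each TDoA fixes a difference of distances (a hyperbola).
    r = np.linalg.norm(receivers - p, axis=1)
    return (r[1:] - r[0]) - range_diffs

sol = least_squares(residuals, x0=np.array([1.0, 1.0]))
print(sol.x)  # ~[4.0, 3.0]
```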

3.2. Localization Methods

Indoor localization refers to the use of methods to determine the location of a device or person inside a building or other enclosed structure. Several different technologies can be used for indoor localization, so this subsection presents two high-tech, industrial-grade options: ultra-wideband (UWB) and ultrasound (US).

3.2.1. Ultra-Wideband (UWB)

UWB radio signals can penetrate a variety of materials, although metals and liquids can interfere with them. This immunity to interference from other signals makes ultra-wideband very attractive for indoor localization [51]. This radio technology enables very accurate measurement of the ToF, leading to centimeter-level accuracy in distance/location measurement. The system comes in two variants: passive and active. The passive variant does not use a UWB tag and takes advantage only of signal reflection to obtain the object's or person's position. In this case, it is necessary to know in advance where the system's transmitters and receivers are located, in order to calculate where the object or person is through the intersection of the signals sent and received between transmitters and receivers. An active UWB-based positioning system, on the other hand, makes use of a battery-powered UWB tag. Here, the system locates and tracks the tag, in indoor environments, by transmitting ultra-short UWB pulses from the tag to the fixed UWB sensors. The sensors send the collected data, via a wireless network, to the software platform, which then analyses, computes, and displays the position of the UWB tag in real time. The application of UWB in indoor environments has the advantages of long tag battery life, robust flexibility, high data rates, high penetrating power, low power consumption and transmission, good positioning accuracy and performance, and little or no interference and multipath effects. On the other hand, UWB is expensive to scale, because more UWB sensors must be deployed to maintain performance over a wide coverage area.

3.2.2. Ultrasound (US)

Mostly supported by the ToF technique, US localization technology [52] calculates the distance between tags and nodes using the velocity of sound and ultrasound signals. The sound velocity varies with atmospheric and weather conditions: factors such as humidity and temperature affect its propagation. However, the implementation of specific filter algorithms, based on complex signal processing, can reduce environmental noise and consequently increase localization accuracy. To provide system synchronization, the ultrasound signal is usually supplemented by radio frequency (RF) pulses.
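To illustrate this temperature sensitivity, the sketch below applies the standard linear approximation for the speed of sound in air, v ≈ 331.4 + 0.6 T (m/s, T in °C), to a hypothetical time of flight:

```python
# Minimal sketch: temperature dependence of ultrasound ranging.
def sound_speed(temp_c: float) -> float:
    """Standard linear approximation of the speed of sound in air (m/s)."""
    return 331.4 + 0.6 * temp_c

tof = 0.0146  # measured time of flight (s), hypothetical
for t in (0.0, 20.0, 35.0):
    print(f"{t:4.1f} C -> {tof * sound_speed(t):.3f} m")
# Over ~5 m, the same ToF spreads by ~30 cm between 0 and 35 degrees C,
# which is why US systems compensate for temperature.
```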

4. Methodology

This section serves as a comprehensive guide to the research design, data collection methods, and analytical techniques utilized to ensure the validity, reliability, and robustness of the paper's findings. By transparently outlining the steps taken to gather and analyze data, this section provides a clear understanding of the research process, allowing for a critical assessment of the study's methodology and its implications for the interpretation of results. Section 4.1, Testing Scenario and Indoor Localization Systems, presents the industrial indoor scenario where all the tests were conducted and the three indoor localization systems used. Section 4.2, Data Acquisition, presents the original data acquired from each localization system, and Section 4.3, Data Transformation, addresses the conversion of the points obtained by the various indoor systems to the robot's referential.

4.1. Testing Scenario and Indoor Localization Systems

Nowadays, in mobile robotics, the map is usually given by natural markers/contours of the environment; however, for these tests, beacons with a high reflection rate were used, represented by the brown circles in Figure 2. They always occupy the same position on the factory floor, which allows the mobile platform localization system to compare, in real time, the previously saved 2D beacon locations with the live positions given by the reflections of the safety laser's beams.
In this case, the robot localization system relies only on the beacon positions and odometry to estimate its position and orientation. It needs to see at least two beacons, represented by red circles inside the beacon circles, to determine the exact robot location; otherwise, it relies on odometry alone.
The trajectory is composed of waypoints/vertices (the blue circles) and edges (the orange splines), the connecting paths between the vertices through which the autonomous mobile robot moves. All edges are bidirectional, so the AMR can move in both directions. Both vertices and edges are associated with a specific ID number, randomly assigned by the trajectory editor module; the only requirement is that each vertex and each edge has a unique identification number.
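A minimal sketch of such a trajectory structure (the class, field names, and edge ID are illustrative only; the vertex coordinates are taken from the Map poses in Table 2):

```python
# Minimal sketch of a vertex/edge trajectory graph with unique IDs.
from dataclasses import dataclass, field

@dataclass
class TrajectoryGraph:
    vertices: dict = field(default_factory=dict)  # id -> (x, y)
    edges: dict = field(default_factory=dict)     # id -> (vertex_a, vertex_b)

    def add_vertex(self, vid: int, x: float, y: float):
        self.vertices[vid] = (x, y)

    def add_edge(self, eid: int, a: int, b: int):
        # Edges are bidirectional: the AMR may traverse them either way.
        self.edges[eid] = (a, b)

g = TrajectoryGraph()
g.add_vertex(5, 1.973, 6.334)    # Map pose of vertex 5 (Table 2)
g.add_vertex(6, -0.017, 6.347)   # Map pose of vertex 6 (Table 2)
g.add_edge(101, 5, 6)            # hypothetical edge ID
```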
Indoor localization systems can be used for a variety of purposes, such as improving navigation, providing location-based services, and tracking the movements of people or objects within a building.
Several factors can affect the accuracy and reliability of indoor localization systems, including the type of technology used, the layout and environment of the building, and the accuracy of the underlying maps or reference points. To achieve reliable and accurate indoor localization, it is often necessary to use a combination of different technologies and techniques and to carefully calibrate and maintain the system.
Overall, indoor localization systems are an important tool for improving the efficiency, safety, and experience of people inside buildings, and have many potential applications in a wide range of industries.
The next subsections introduce the three industrial localization systems (Qorvo, Eliko Kio, and Marvelmind) used for the comparison announced in this paper. One last indoor localization system will also be presented, the extended Kalman filter (EKF) beacons system, which is considered the tests' ground truth.

4.1.1. Qorvo

Qorvo’s ultra-wideband technology [53], supported by Decawave’s Impulse Radio, allows for the location of tags in indoor environments (Figure 3), with high precision and at a very low cost compared to other solutions on the market, such as Pozyx [54]. Other main features of this system are secure low-power and low-latency data communication.

4.1.2. Eliko Kio

The KIO system, developed by Eliko [55], is intended for 2D/3D indoor positioning of mobile UWB tags (Figure 4) in relation to fixed-position UWB anchors. Based on time of flight measurements of radio pulses traveling between tags and anchors, 2D localization requires at least three anchors and one mobile tag; for 3D localization, the KIO system needs one more anchor. Due to the low intensity of the emitted radio signals, KIO devices can be used for human tracking, but the positioning frequency decreases as the number of active tags increases.

4.1.3. Marvelmind

The indoor positioning system by Marvelmind Robotics [56], Figure 5, uses ultrasound ranging to find the position of one or more mobile sensor modules, also known as hedgehogs. Ultrasound ranging is also used by the beacons, the transmitters, to determine their relative positions; therefore, the Marvelmind system is self-calibrating, and the sensor modules have built-in rechargeable batteries. Through the application programming interface (API), it is possible to choose whether a module acts as a beacon or a hedgehog, which allows for greater system flexibility. The maximum update rate for tracking a single hedgehog is 16 Hz. In addition to ultrasound, Marvelmind may also incorporate other communication technologies for data transmission between beacons and tracked objects; Bluetooth and radio frequency communication are commonly used in conjunction with ultrasonics to enhance the capabilities of indoor positioning systems.

4.1.4. EKF Beacons—Ground Truth

Table 1 shows a small comparison between the different systems. All the data were taken from their datasheets.
The high intensity/reflection of these beacons (Figure 6) and the large number of samples present on the factory floor gives the robot localization system excellent accuracy and repeatability, making it the ground truth of these tests. So, in Table 2, it is possible to see the comparison between the robot position, given by its localization system, in each vertex, and the vertex position values presented in the trajectory data file. All these values were taken based on the robot map referential, which typically refers to the coordinate system or frame of reference used by a robot to represent and navigate within its environment. This referential is crucial for the robot to understand its position, orientation, and movement relative to the surrounding space.
The integration of indoor localization systems in industry has the potential to improve productivity, efficiency, and safety, as well as to create new opportunities for innovation and value creation. Therefore, the next subsections describe the implementation of the different systems listed before, both on the AMR and in the industrial scenario.

4.1.5. Industrial Scenario—Systems Integration

To cover the whole robot map area with the different indoor localization systems, some preliminary tests were carried out that allowed us to conclude the data present in Table 3.
Figure 7, supported by Figure 2, highlights the distribution of the different localization systems across the plant floor (the colored rectangles distributed in the image), as well as the robot trajectories and the beacon map. A trajectory spanning the entire range of action of the different indoor localization systems was chosen to best evaluate them, because the greater the robot's distance from the tags, the greater the error associated with the robot's position.
In the previous figure, the Eliko Kio system is illustrated by the four red rectangles, two of them at the center of the image and the others on each side. Regarding the Marvelmind system, it was possible to cover the whole scene area with only four tags, illustrated by the yellow rectangles. The only system where it was essential to use an additional tag was the Qorvo system, exemplified by five green rectangles, four of which have positions similar to the Marvelmind system tags. All these tags, regardless of the localization system they belong to, share the same referential.

4.1.6. Autonomous Mobile Robot—Systems Integration

On the mobile platform, as it is possible to see in Figure 8, each tag has a specific position and all of them were powered by a portable power bank.
To be able to compare the positions obtained by the ground truth and the different indoor localization systems, it was necessary to relate the position of the SICK laser to each of the onboard tags. Accordingly, Table 4 presents the transformations of the beacons data at each vertex (the Robot rows of Table 2) to the three localization systems used in this case. These conversions were based on Equations (3) and (4).
$$X_{New} = \cos(\Theta_{Beacons}) \times \Delta X + X_{Beacons} \tag{3}$$

where $X_{New}$ corresponds to the X value of the new point, $X_{Beacons}$ represents the X value of the original point, and $\Delta X$ is the modulus of the difference between the last two values. The last parameter of the equation, $\Theta_{Beacons}$, is the angle, in radians, between the robot referential and the map referential at the original point.

$$Y_{New} = \sin(\Theta_{Beacons}) \times \Delta X + Y_{Beacons} \tag{4}$$

here, $Y_{New}$ corresponds to the Y value of the new point, and $Y_{Beacons}$ represents the Y value of the original point. The last two parameters, $\Theta_{Beacons}$ and $\Delta X$, are the same as in Equation (3), because the sensors were aligned along the X axis of the robot referential.
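A minimal sketch of Equations (3) and (4); the robot pose of vertex 5 and the Marvelmind Delta X are taken from Tables 2 and 4, and the output reproduces the converted value listed in Table 4:

```python
import math

# Minimal sketch of Equations (3) and (4): projecting the beacon-based robot
# pose onto a tag mounted delta_x meters along the robot's X axis.
def beacons_to_tag(x_b, y_b, theta_b, delta_x):
    x_new = math.cos(theta_b) * delta_x + x_b
    y_new = math.sin(theta_b) * delta_x + y_b
    return x_new, y_new

# Vertex 5 robot pose (Table 2) and the Marvelmind delta_x (Table 4).
print(beacons_to_tag(1.977, 6.324, -3.131, 0.215))  # ~(1.762, 6.322)
```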

4.2. Data Acquisition

This subsection exposes the robot pose received by each localization system in each vertex from the robot’s map and their correspondence to the beacon data.

4.2.1. Beacons Data

Table 5 shows the average, the standard deviation, and the maximum and minimum values of the AMR localization system in each vertex.

4.2.2. Marvelmind

Table 6 shows the average, the standard deviation, and the maximum and minimum values of the Marvelmind localization system in each vertex.
In Figure 9, it is possible to see the correspondence points between the converted beacons data to the Marvelmind robot tag position (blue points) and the received original Marvelmind data (red circles).

4.2.3. Eliko Kio

Table 7 shows the average, the standard deviation, and the maximum and minimum values of the Eliko Kio localization system in each vertex.
In Figure 10, it is possible to see the correspondence points between the converted beacons data to the Eliko Kio robot tag position (blue points) and the received original Eliko Kio data (red circles).

4.2.4. Qorvo

Table 8 shows the average, the standard deviation, and the maximum and minimum values of the Qorvo localization system in each vertex.
In Figure 11, it is possible to see the correspondence points between the converted beacons data to the Qorvo robot tag position (blue points) and the received original Qorvo data (red circles).

4.3. Data Transformation

This subsection will present the transformation matrix, which concerns the conversion of the points in each location system’s referential to their coordinates in the robot’s map referential. It will also be possible to observe, through figures, the approximation of the original points of each location system to the respective values acquired by ground truth.
After acquiring the various sets of points in the different referentials, the next step was to calculate the respective transform between them (Marvelmind to ground truth, Eliko Kio to ground truth, and Qorvo to ground truth). Based on a least squares (LS) approximation and with the help of MATLAB, the following transformation matrices and respective errors in each coordinate were obtained.
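As an illustrative sketch of this alignment step (using an SVD-based rigid-fit in Python rather than the MATLAB least-squares routine used in the study, and with hypothetical point sets):

```python
import numpy as np

# Minimal sketch: least-squares 2D rigid alignment (Kabsch/SVD),
# finding R, t that minimize ||R @ src + t - dst||^2.
def fit_rigid_2d(src, dst):
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)                      # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, np.sign(np.linalg.det(Vt.T @ U.T))])   # avoid reflection
    R = Vt.T @ D @ U.T
    t = dst_c - R @ src_c
    return R, t

# Hypothetical point sets: dst is src rotated by 30 degrees and shifted.
theta = np.radians(30.0)
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
src = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 2.0], [3.0, 1.0]])
dst = src @ R_true.T + np.array([1.5, -0.5])
R, t = fit_rigid_2d(src, dst)
print(np.degrees(np.arctan2(R[1, 0], R[0, 0])), t)  # ~30.0, [1.5, -0.5]
```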

4.3.1. Marvelmind

Equations (5)–(7) represent the transformation matrix from the Marvelmind referential to the Beacons referential, the ground truth referential.
$$\mathrm{2D\ Translation} = \begin{bmatrix} 1.6779 \\ 4.9422 \end{bmatrix} \tag{5}$$

$$\mathrm{2D\ Rotation} = \begin{bmatrix} 0.3206 & 0.9472 \\ 0.9472 & 0.3206 \end{bmatrix} \tag{6}$$

$$\mathrm{Angle} = 252.535^{\circ} \tag{7}$$
In Figure 12, it is possible to see the points aligned to the Marvelmind localization system. The blue points represent the robot position points, acquired from the robot localization system, converted to the Marvelmind robot tag position. The red circles are the original robot location points, acquired from the Marvelmind system, transformed to the robot localization system referential; see Table 9. Comparing these two types of points after the conversion yields the coordinate errors shown in Table 10.

4.3.2. Eliko Kio

Equations (8)–(10) represent the transformation matrix from the Eliko Kio referential to the Beacons referential, the ground truth referential.
$$\mathrm{2D\ Translation} = \begin{bmatrix} 1.3654 \\ 5.0303 \end{bmatrix} \tag{8}$$

$$\mathrm{2D\ Rotation} = \begin{bmatrix} 0.3060 & 0.9520 \\ 0.9520 & 0.3060 \end{bmatrix} \tag{9}$$

$$\mathrm{Angle} = 253.213^{\circ} \tag{10}$$
In Figure 13, it is possible to see the points aligned to the Eliko Kio localization system. The blue points represent the robot position points, acquired from the robot localization system, converted to the Eliko Kio robot tag position. The red circles are the original robot location points, acquired from the Eliko Kio system, transformed to the robot localization system referential; see Table 11. Comparing these two types of points after the conversion yields the coordinate errors shown in Table 12.

4.3.3. Qorvo

Equations (11)–(13) represent the transformation matrix from the Qorvo referential to the Beacons referential, the ground truth referential.
$$\mathrm{2D\ Translation} = \begin{bmatrix} 1.8242 \\ 4.6445 \end{bmatrix} \tag{11}$$

$$\mathrm{2D\ Rotation} = \begin{bmatrix} 0.3297 & 0.9441 \\ 0.9441 & 0.3297 \end{bmatrix} \tag{12}$$

$$\mathrm{Angle} = 252.609^{\circ} \tag{13}$$
In Figure 14, it is possible to see the points aligned to the Qorvo localization system. The blue points represent the robot position points, acquired from the robot localization system, converted to the Qorvo robot tag position. The red circles are the original robot location points, acquired from the Qorvo system, transformed to the robot localization system referential; see Table 13. Comparing these two types of points after the conversion yields the coordinate errors shown in Table 14.

5. Results

In the results section of this paper, the research outcomes are presented, providing a detailed account of the data obtained through the analysis. This section serves as the culmination of the study's investigative efforts, presenting a comprehensive depiction of the key findings with respect to the research questions and objectives outlined earlier.
After presenting the results obtained with the different indoor localization systems and comparing the respective error values at each map point with the ground truth used (see Table 15), it can be stated that the Marvelmind system exhibits less error than the Eliko Kio and Qorvo systems, being, in this environment and study scenario, the most accurate and precise module.
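A minimal sketch of this per-vertex comparison: the Euclidean error between each system's aligned points and the ground truth, with the smallest-error system selected per vertex (the coordinates below are hypothetical, loosely modeled on the Table 4 values):

```python
import numpy as np

# Hypothetical aligned positions per vertex; ground truth per vertex.
ground_truth = {5: (1.762, 6.322), 6: (-0.234, 6.359)}
systems = {
    "Marvelmind": {5: (1.78, 6.30), 6: (-0.25, 6.38)},
    "Eliko Kio":  {5: (1.70, 6.40), 6: (-0.30, 6.30)},
    "Qorvo":      {5: (1.85, 6.25), 6: (-0.15, 6.45)},
}

for vid, gt in ground_truth.items():
    errors = {name: float(np.hypot(pts[vid][0] - gt[0], pts[vid][1] - gt[1]))
              for name, pts in systems.items()}
    best = min(errors, key=errors.get)  # system with the smallest error
    print(vid, {k: round(v, 3) for k, v in errors.items()}, "->", best)
```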
The blue dots, in Figure 15, refer to the Marvelmind indoor localization system. They are the closest to the point of origin ( 0 , 0 ) , presented by the symbol ∗, alluding to ground truth values, thus assuming that it is the best-tested indoor localization system. The + symbols, in red, refer to the Eliko KIO localization system, and the x symbols refer to the Qorvo localization system.
A more detailed and illustrative analysis is presented in Figure 16, illustrating more intuitively the difference, in terms of distance allusive to ground truth values, of the three used systems at each of the vertices.
Table 16 presents the best indoor localization system, in each vertex, which is highlighted with the respective color, according to the image caption in Figure 16. The blue color, alluding to Marvelmind, is the most repeated throughout the table, confirming it as the best system of the three selected.
However, it must be taken into account that there are always areas of the map where the different systems have difficulty locating the AMR with precision, or even accuracy, leading to the so-called outliers. These critical points are related to the positioning and distribution of the modules of the various localization systems, as well as to the AMR's proximity to industrial machinery and the surrounding structures of the scenario itself, consisting mostly of iron, which influence signal propagation and radio wave reflections.
In summary, there are several approaches to indoor localization for autonomous mobile robots and other different mobile platforms, each with its own set of advantages and disadvantages. The best approach for a given application will depend on the specific requirements and constraints of the environment and the accuracy can vary depending on the specific technologies and infrastructure used. After this analysis, it is possible to conclude that the Marvelmind system is the most accurate and the one that can cover a larger working area with the least number of tags. However, it is the most expensive system of all presented. As far as our case study is concerned, that is, for real-time detection of mobile platforms, such as AMRs, forklifts, or even logistics trains in industrial environments, any of the three systems will work, because they all have satisfactory results with errors below half a meter, which will always allow for safe, accurate, and optimized path planning for all AMRs.

6. Conclusions

This article tests three industrial indoor localization solutions, Qorvo, Eliko Kio, and Marvelmind, supported by two indoor localization methods: ultra-wideband (Qorvo and Eliko Kio) and ultrasound (Marvelmind). A multicomparison between these three different indoor localization systems and a robot localization system (the ground truth) was proposed. To align the data obtained by each system, the data acquired by the AMR localization system were first transformed to the position of each of the tags integrated on top of the AMR. Finally, the points obtained by each localization system were fitted, through MATLAB and the method of least squares, to the respective ground truth points. It was possible to conclude that the Marvelmind system is the most accurate but that, for our proposal, any of the systems could be used, because they all have errors below half a meter, which always allows for safe, accurate, and optimized path planning for all AMRs, taking into account the position of the different mobile platforms on the factory floor that are not managed by the robot fleet manager (forklifts, logistic trains).
A seminal contribution of this work lies in its comprehensive examination of three industrial localization systems in real time, coupled with a sophisticated and in-depth analysis. By systematically comparing and contrasting these disparate systems, this study endeavors to unearth nuanced insights into their respective functionalities. The intricate examination of real-time industrial localization not only elucidates the dynamic landscape of these technologies but also underscores their practical implications and potential advancements. This analytical approach offers a multifaceted perspective, fostering a deeper understanding of the intricate interplay between diverse industrial localization systems and providing a foundation for informed decision making in the realm of contemporary technological applications. Various quantitative and qualitative results of the different systems were presented, which could help readers make a future choice when purchasing an indoor localization system on the market.
As future work, it will be interesting to validate one of these indoor localization systems integrated into different forklifts or logistic trains and interaction with the TEA* Algorithm, the AMRs path planning algorithm, in real time and in a real environment.

Author Contributions

The contributions of the authors of this work are pointed as follows: Conceptualization: P.M.R. and J.L.; methodology: P.M.R., J.L. and H.S.; software: P.M.R.; validation: P.M.R., J.L., H.S. and P.C.; writing—review and editing: P.M.R., J.L., S.P.S., P.M.O., H.S. and P.C.; supervision: P.M.R., J.L., S.P.S., P.M.O. and P.C. All authors have read and agreed to the published version of the manuscript.

Funding

This work is co-financed by Component 5-Capitalization and Business Innovation, integrated in the Resilience Dimension of the Recovery and Resilience Plan within the scope of the Recovery and Resilience Mechanism (MRR) of the European Union (EU), framed in the Next Generation EU, for the period 2021–2026, within project Produtech_R3, with reference 60.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

This study did not report any new data.

Acknowledgments

The authors of this work would like to thank the members of INESC TEC for all the support rendered to this project.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
AGV: Autonomous Guided Vehicle
AMR: Autonomous Mobile Robots
AoA: Angle of Arrival
API: Application Programming Interface
BLE: Bluetooth Low Energy
EKF: Extended Kalman Filter
EU: European Union
GPS: Global Positioning System
ID: Identification Number
IMUs: Inertial Measurement Units
IoT: Internet of Things
IR: Infrared
LS: Least Squares
MDPI: Multidisciplinary Digital Publishing Institute
ML: Machine Learning
MP: Mobile Platform
RF: Radio Frequency
RFID: Radio Frequency IDentification
RSS: Received Signal Strength
RSSI: Received Signal Strength Indicator
RVIZ: Robot Operating System Visualization
TDoA: Time Difference of Arrival
TEA*: Time Enhanced A*
3D: Three-Dimensional
ToA: Time of Arrival
ToF: Time of Flight
2D: Two-Dimensional
UHF: Ultra High Frequency
US: Ultrasound
UWB: Ultra-Wideband
WLAN: Wireless Local Area Network

References

  1. Moura, P.; Costa, P.; Lima, J.; Costa, P. A temporal optimization applied to time enhanced A*. In AIP Conference Proceedings; AIP Publishing LLC: College Park, MD, USA, 2019; Volume 2116, p. 220007. [Google Scholar]
  2. Santos, J.; Costa, P.; Rocha, L.F.; Moreira, A.P.; Veiga, G. Time enhanced A*: Towards the development of a new approach for Multi-Robot Coordination. In Proceedings of the 2015 IEEE International Conference on Industrial Technology (ICIT), Seville, Spain, 17–19 March 2015; pp. 3314–3319. [Google Scholar]
  3. Cardarelli, E.; Digani, V.; Sabattini, L.; Secchi, C.; Fantuzzi, C. Cooperative cloud robotics architecture for the coordination of multi-AGV systems in industrial warehouses. Mechatronics 2017, 45, 1–13. [Google Scholar] [CrossRef]
  4. Butdee, S.; Suebsomran, A. Localization based on matching location of AGV. In Proceedings of the 24th International Manufacturing Conference, IMC24. Waterford Institute of Technology, Waterford, Ireland, 20–30 August 2007; pp. 1121–1128. [Google Scholar]
  5. Roy, P.; Chowdhury, C. A survey of machine learning techniques for indoor localization and navigation systems. J. Intell. Robot. Syst. 2021, 101, 63. [Google Scholar] [CrossRef]
  6. Zafari, F.; Gkelias, A.; Leung, K.K. A survey of indoor localization systems and technologies. IEEE Commun. Surv. Tutor. 2019, 21, 2568–2599. [Google Scholar] [CrossRef]
  7. Bradley, C.; El-Tawab, S.; Heydari, M.H. Security analysis of an IoT system used for indoor localization in healthcare facilities. In Proceedings of the 2018 Systems and Information Engineering Design Symposium (SIEDS), Charlottesville, VA, USA, 27 April 2018; pp. 147–152. [Google Scholar]
  8. Shit, R.C.; Sharma, S.; Yelamarthi, K.; Puthal, D. AI-enabled fingerprinting and crowdsource-based vehicle localization for resilient and safe transportation systems. IEEE Trans. Intell. Transp. Syst. 2021, 22, 4660–4669. [Google Scholar] [CrossRef]
  9. Obeidat, H.; Shuaieb, W.; Obeidat, O.; Abd-Alhameed, R. A review of indoor localization techniques and wireless technologies. Wirel. Pers. Commun. 2021, 119, 289–327. [Google Scholar] [CrossRef]
  10. Pilati, F.; Sbaragli, A.; Nardello, M.; Santoro, L.; Fontanelli, D.; Brunelli, D. Indoor positioning systems to prevent the COVID19 transmission in manufacturing environments. Procedia Cirp 2022, 107, 1588–1593. [Google Scholar] [CrossRef]
  11. Xiong, R.; van Waasen, S.; Rheinländer, C.; Wehn, N. Development of a Novel Indoor Positioning System With mm-Range Precision Based on RF Sensors Network. IEEE Sens. Lett. 2017, 1, 5500504. [Google Scholar] [CrossRef]
  12. Li, N.; Becerik-Gerber, B. An infrastructure-free indoor localization framework to support building emergency response operations. In Proceedings of the 19th EG-ICE International Workshop on Intelligent Computing in Engineering, Munich, Germany, 4–6 July 2012. [Google Scholar]
  13. Wang, S.; Zhao, L. Optimization of Goods Location Numbering and Storage and Retrieval Sequence in Automated Warehouse. In Proceedings of the 2009 International Joint Conference on Computational Sciences and Optimization, Sanya, China, 24–26 April 2009; Volume 2, pp. 883–886. [Google Scholar] [CrossRef]
  14. Lipka, M.; Sippel, E.; Hehn, M.; Adametz, J.; Vossiek, M.; Dobrev, Y.; Gulden, P. Wireless 3D Localization Concept for Industrial Automation Based on a Bearings Only Extended Kalman Filter. In Proceedings of the 2018 Asia-Pacific Microwave Conference (APMC), Kyoto, Japan, 6–9 November 2018; pp. 821–823. [Google Scholar] [CrossRef]
  15. Hesslein, N.; Wesselhöft, M.; Hinckeldeyn, J.; Kreutzfeldt, J. Industrial indoor localization: Improvement of logistics processes using location based services. In Advances in Automotive Production Technology–Theory and Application: Stuttgart Conference on Automotive Production (SCAP2020); Springer: Berlin/Heidelberg, Germany, 2021; pp. 460–467. [Google Scholar]
  16. Xu, L.; Shen, X.; Han, T.X.; Du, R.; Shen, Y. An Efficient Relative Localization Method via Geometry-based Coordinate System Selection. In Proceedings of the ICC 2022-IEEE International Conference on Communications, Seoul, Republic of Korea, 16–20 May 2022; pp. 4522–4527. [Google Scholar] [CrossRef]
  17. Luo, Q.; Yang, K.; Yan, X.; Li, J.; Wang, C.; Zhou, Z. An Improved Trilateration Positioning Algorithm with Anchor Node Combination and K-Means Clustering. Sensors 2022, 22, 6085. [Google Scholar] [CrossRef]
  18. Thrun, S.; Burgard, W.; Fox, D. Probabilistic Robotics (Intelligent Robotics and Autonomous Agents); The MIT Press: Cambridge, MA, USA, 2005. [Google Scholar]
  19. Kim, S.H.; Roh, C.W.; Kang, S.C.; Park, M.Y. Outdoor navigation of a mobile robot using differential GPS and curb detection. In Proceedings of the 2007 IEEE International Conference on Robotics and Automation, Rome, Italy, 10–14 April 2007; pp. 3414–3419. [Google Scholar]
  20. Gonzalez, J.; Blanco, J.; Galindo, C.; Ortiz-de Galisteo, A.; Fernández-Madrigal, J.; Moreno, F.; Martinez, J. Combination of UWB and GPS for indoor-outdoor vehicle localization. In Proceedings of the 2007 IEEE International Symposium on Intelligent Signal Processing, Alcala de Henares, Spain, 3–5 October 2007; pp. 1–6. [Google Scholar]
  21. Hahnel, D.; Burgard, W.; Fox, D.; Fishkin, K.; Philipose, M. Mapping and localization with RFID technology. In Proceedings of the IEEE International Conference on Robotics and Automation, 2004. Proceedings. ICRA’04, New Orleans, LA, USA, 26 April–1 May 2004; Volume 1, pp. 1015–1020. [Google Scholar]
  22. Choi, B.S.; Lee, J.W.; Lee, J.J.; Park, K.T. A hierarchical algorithm for indoor mobile robot localization using RFID sensor fusion. IEEE Trans. Ind. Electron. 2011, 58, 2226–2235. [Google Scholar] [CrossRef]
  23. Huh, J.; Chung, W.S.; Nam, S.Y.; Chung, W.K. Mobile robot exploration in indoor environment using topological structure with invisible barcodes. ETRI J. 2007, 29, 189–200. [Google Scholar] [CrossRef]
  24. Lin, G.; Chen, X. A Robot Indoor Position and Orientation Method based on 2D Barcode Landmark. J. Comput. 2011, 6, 1191–1197. [Google Scholar] [CrossRef]
  25. Kobayashi, H. A new proposal for self-localization of mobile robot by self-contained 2d barcode landmark. In Proceedings of the 2012 Proceedings of SICE annual conference (SICE), Akita, Japan, 20–23 August 2012; pp. 2080–2083. [Google Scholar]
  26. Atanasyan, A.; Roßmann, J. Improving Self-Localization Using CNN-based Monocular Landmark Detection and Distance Estimation in Virtual Testbeds. In Tagungsband des 4. Kongresses Montage Handhabung Industrieroboter; Springer: Berlin/Heidelberg, Germany, 2019; pp. 249–258. [Google Scholar]
  27. Kendall, A.; Grimes, M.; Cipolla, R. Posenet: A convolutional network for real-time 6-dof camera relocalization. In Proceedings of the IEEE International Conference on Computer Vision, Santiago, Chile, 7–13 December 2015; pp. 2938–2946. [Google Scholar]
  28. Sadeghi Esfahlani, S.; Sanaei, A.; Ghorabian, M.; Shirvani, H. The Deep Convolutional Neural Network Role in the Autonomous Navigation of Mobile Robots (SROBO). Remote Sens. 2022, 14, 3324. [Google Scholar] [CrossRef]
  29. Szegedy, C.; Liu, W.; Jia, Y.; Sermanet, P.; Reed, S.; Anguelov, D.; Erhan, D.; Vanhoucke, V.; Rabinovich, A. Going deeper with convolutions. In Proceedings of the 2015 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Boston, MA, USA, 7–12 June 2015; pp. 1–9. [Google Scholar] [CrossRef]
  30. Lima, J.; Rocha, C.; Rocha, L.; Costa, P. Data Matrix Based Low Cost Autonomous Detection of Medicine Packages. Appl. Sci. 2022, 12, 9866. [Google Scholar] [CrossRef]
  31. Sharma, P.; Saucan, A.A.; Bucci, D.J.; Varshney, P.K. On Self-Localization and Tracking with an Unknown Number of Targets. In Proceedings of the 2018 52nd Asilomar Conference on Signals, Systems, and Computers, Pacific Grove, CA, USA, 28–31 October 2018; pp. 1735–1739. [Google Scholar] [CrossRef]
  32. Ahmad, U.; Poon, K.; Altayyari, A.M.; Almazrouei, M.R. A Low-cost Localization System for Warehouse Inventory Management. In Proceedings of the 2019 International Conference on Electrical and Computing Technologies and Applications (ICECTA), Ras Al Khaimah, United Arab Emirates, 19–21 November 2019; pp. 1–5. [Google Scholar] [CrossRef]
  33. Halawa, F.; Dauod, H.; Lee, I.G.; Li, Y.; Yoon, S.W.; Chung, S. Introduction of a real time location system to enhance the warehouse safety and operational efficiency. Int. J. Prod. Econ. 2020, 224, 107541. [Google Scholar] [CrossRef]
  34. Coronado, E.; Kiyokawa, T.; Ricardez, G.A.G.; Ramirez-Alpizar, I.G.; Venture, G.; Yamanobe, N. Evaluating quality in human-robot interaction: A systematic search and classification of performance and human-centered factors, measures and metrics towards an industry 5.0. J. Manuf. Syst. 2022, 63, 392–410. [Google Scholar] [CrossRef]
  35. Martinho, R.; Lopes, J.; Jorge, D.; de Oliveira, L.C.; Henriques, C.; Peças, P. IoT Based Automatic Diagnosis for Continuous Improvement. Sustainability 2022, 14, 9687. [Google Scholar] [CrossRef]
  36. Le, D.V.; Havinga, P.J. SoLoc: Self-organizing indoor localization for unstructured and dynamic environments. In Proceedings of the 2017 International Conference on Indoor Positioning and Indoor Navigation (IPIN), Sapporo, Japan, 18–21 September 2017; pp. 1–8. [Google Scholar] [CrossRef]
  37. Flögel, D.; Bhatt, N.P.; Hashemi, E. Infrastructure-Aided Localization and State Estimation for Autonomous Mobile Robots. Robotics 2022, 11, 82. [Google Scholar] [CrossRef]
  38. Alkendi, Y.; Seneviratne, L.; Zweiri, Y. State of the Art in Vision-Based Localization Techniques for Autonomous Navigation Systems. IEEE Access 2021, 9, 76847–76874. [Google Scholar] [CrossRef]
  39. Dias, F.; Schafer, H.; Natal, L.; Cardeira, C. Mobile Robot Localisation for Indoor Environments Based on Ceiling Pattern Recognition. In Proceedings of the 2015 IEEE International Conference on Autonomous Robot Systems and Competitions, Vila Real, Portugal, 8–10 April 2015; pp. 65–70. [Google Scholar] [CrossRef]
  40. Sudin, M.; Abdullah, S.; Nasudin, M. Humanoid Localization on Robocup Field using Corner Intersection and Geometric Distance Estimation. IJIMAI 2019, 5, 50–56. [Google Scholar] [CrossRef]
  41. Kalaitzakis, M.; Cain, B.; Carroll, S.; Ambrosi, A.; Whitehead, C.; Vitzilaios, N. Fiducial markers for pose estimation. J. Intell. Robot. Syst. 2021, 101, 71. [Google Scholar] [CrossRef]
  42. Grilo, A.; Costa, R.; Figueiras, P.; Gonçalves, R.J. Analysis of AGV indoor tracking supported by IMU sensors in intra-logistics process in automotive industry. In Proceedings of the 2021 IEEE International Conference on Engineering, Technology and Innovation (ICE/ITMC), Cardiff, UK, 21–23 June 2021; pp. 1–7. [Google Scholar] [CrossRef]
  43. Malyavej, V.; Kumkeaw, W.; Aorpimai, M. Indoor robot localization by RSSI/IMU sensor fusion. In Proceedings of the 2013 10th International Conference on Electrical Engineering/Electronics, Computer, Telecommunications and Information Technology, Krabi, Thailand, 15–17 May 2013; pp. 1–6. [Google Scholar] [CrossRef]
  44. Xia, Z.; Chen, C. A Localization Scheme with Mobile Beacon for Wireless Sensor Networks. In Proceedings of the 2006 6th International Conference on ITS Telecommunications, Chengdu, China, 21–23 June 2006; pp. 1017–1020. [Google Scholar] [CrossRef]
  45. Zhao, C.; Wang, B. A UWB/Bluetooth Fusion Algorithm for Indoor Localization. In Proceedings of the 2019 Chinese Control Conference (CCC), Guangzhou, China, 27–30 July 2019; pp. 4142–4146. [Google Scholar] [CrossRef]
  46. Álvarez Merino, C.S.; Luo-Chen, H.Q.; Khatib, E.J.; Barco, R. WiFi FTM, UWB and Cellular-Based Radio Fusion for Indoor Positioning. Sensors 2021, 21, 7020. [Google Scholar] [CrossRef] [PubMed]
  47. Zhang, L.; Wu, X.; Gao, R.; Pan, L.; Zhang, Q. A multi-sensor fusion positioning approach for indoor mobile robot using factor graph. Measurement 2023, 216, 112926. [Google Scholar] [CrossRef]
  48. Dargie, W.; Poellabauer, C. Fundamentals of Wireless Sensor Networks: Theory and Practice; John Wiley & Sons: Hoboken, NJ, USA, 2010. [Google Scholar]
  49. Xiong, J.; Jamieson, K. ArrayTrack: A Fine-Grained indoor location system. In Proceedings of the 10th USENIX Symposium on Networked Systems Design and Implementation (NSDI 13), Lombard, IL, USA, 2–5 April 2013; pp. 71–84. [Google Scholar]
  50. Liu, H.; Darabi, H.; Banerjee, P.; Liu, J. Survey of wireless indoor positioning techniques and systems. IEEE Trans. Syst. Man Cybern. Part C 2007, 37, 1067–1080. [Google Scholar] [CrossRef]
  51. Oppermann, I.; Hämäläinen, M.; Iinatti, J. UWB: Theory and Applications; John Wiley & Sons: Hoboken, NJ, USA, 2004. [Google Scholar]
  52. Ijaz, F.; Yang, H.K.; Ahmad, A.W.; Lee, C. Indoor positioning: A review of indoor ultrasonic positioning systems. In Proceedings of the 2013 15th International Conference on Advanced Communications Technology (ICACT), Pyeong Chang, Republic of Korea, 27–30 January 2013; pp. 1146–1150. [Google Scholar]
  53. Qorvo. Qorvo All Around You. 2024. Available online: https://www.qorvo.com/ (accessed on 14 November 2022).
  54. Pozyx. Pozyx. 2024. Available online: https://www.pozyx.io/ (accessed on 8 February 2024).
  55. Eliko. Next-Generation Location Tracking. 2024. Available online: https://eliko.tech/ (accessed on 10 December 2022).
  56. Marvelmind. Marvelmind Robotics. 2023. Available online: https://marvelmind.com/ (accessed on 23 November 2022).
Figure 1. Angle of arrival method, adapted.
Figure 2. Test scene. Image exported from Robot Operating System Visualization (RVIZ).
Figure 3. Qorvo tag.
Figure 4. Eliko KIO tag.
Figure 5. Marvelmind tag.
Figure 6. Beacon example.
Figure 7. Sensors distribution in the industrial environment.
Figure 8. Mobile platform sensors integration—system's architecture.
Figure 9. Marvelmind 2D points correspondence.
Figure 10. Eliko Kio 2D points correspondence.
Figure 11. Qorvo 2D points correspondence.
Figure 12. Marvelmind 2D points aligned.
Figure 13. Eliko Kio 2D points aligned.
Figure 14. Qorvo 2D points aligned.
Figure 15. Two-dimensional error points comparison.
Figure 16. Two-dimensional converted points comparison.
Table 1. Device comparison—some features.

Localization System | Precision | Ease of Deployment | Power Consumption | Scalability | Environmental Considerations
Qorvo | ±10 cm | yes | low-power sleep mode: 15 μA | easy | −40 °C to +85 °C
Eliko Kio | ±15 cm | yes | 155 mA in Rx mode, 95 mA in Tx mode | easy | −20 °C to +55 °C (USB-powered device)
Marvelmind | ±2 cm | yes | 900–1000 mAh, 3.6 V | easy | −40 °C to +50 °C
Table 2. Ground truth selection process.

Vertex ID | 5 | 6 | 10
Pose | X | Y | Theta | X | Y | Theta | X | Y | Theta
Map | 1.973 | 6.334 | −3.140 | −0.017 | 6.347 | 3.131 | 0.783 | −4.156 | −0.025
Robot | 1.977 | 6.324 | −3.131 | −0.019 | 6.357 | 3.132 | 0.786 | −4.159 | −0.023
Diff. | 0.004 | 0.010 | 0.008 | 0.002 | 0.009 | 0.001 | 0.003 | 0.003 | 0.002

Vertex ID | 12 | 11 | 15
Pose | X | Y | Theta | X | Y | Theta | X | Y | Theta
Map | −3.548 | −4.050 | 3.127 | −1.341 | −4.100 | 3.120 | −0.746 | −2.282 | −1.566
Robot | −3.525 | −4.043 | 3.127 | −1.346 | −4.095 | 3.122 | −0.763 | −2.281 | −1.553
Diff. | 0.024 | 0.007 | 0.001 | 0.005 | 0.005 | 0.002 | 0.017 | 0.001 | 0.014

Vertex ID | 26 | 16 | 9
Pose | X | Y | Theta | X | Y | Theta | X | Y | Theta
Map | −0.672 | 0.048 | −1.566 | −0.445 | 2.076 | −1.566 | −1.885 | −1.155 | −1.545
Robot | −0.706 | 0.047 | −1.565 | −0.414 | 2.074 | −1.542 | −1.903 | −1.153 | −1.534
Diff. | 0.033 | 0.001 | 0.001 | 0.031 | 0.002 | 0.024 | 0.019 | 0.001 | 0.011

Vertex ID | 33 | 18 | 19
Pose | X | Y | Theta | X | Y | Theta | X | Y | Theta
Map | −2.504 | 7.572 | −1.566 | −2.605 | 4.198 | −1.566 | −2.717 | 1.734 | −1.566
Robot | −2.482 | 7.573 | −1.556 | −2.586 | 4.204 | −1.564 | −2.702 | 1.732 | −1.551
Diff. | 0.021 | 0.001 | 0.011 | 0.019 | 0.005 | 0.002 | 0.014 | 0.002 | 0.015

Vertex ID | 14 | 13 | 17
Pose | X | Y | Theta | X | Y | Theta | X | Y | Theta
Map | −3.011 | −0.226 | −1.566 | −3.597 | −2.359 | −1.566 | −0.294 | 3.765 | 3.131
Robot | −2.974 | −0.227 | −1.522 | −3.635 | −2.363 | −1.561 | −0.295 | 3.721 | 3.136
Diff. | 0.036 | 0.002 | 0.044 | 0.038 | 0.004 | 0.006 | 0.001 | 0.044 | 0.005

Vertex ID | 8 | 4 | 3
Pose | X | Y | Theta | X | Y | Theta | X | Y | Theta
Map | −1.879 | 2.897 | −1.566 | −3.561 | 9.092 | −0.023 | −1.524 | 9.073 | 3.120
Robot | −1.900 | 2.893 | −1.553 | −3.560 | 9.087 | 0.001 | −1.524 | 9.076 | 3.123
Diff. | 0.021 | 0.004 | 0.014 | 0.001 | 0.005 | 0.024 | 0.000 | 0.004 | 0.004

Vertex ID | 2 | 1 | 31
Pose | X | Y | Theta | X | Y | Theta | X | Y | Theta
Map | 0.674 | 9.022 | 3.120 | 3.257 | 8.948 | 3.114 | 0.438 | 8.106 | 3.131
Robot | 0.662 | 9.033 | 3.120 | 3.256 | 8.927 | 3.121 | 0.438 | 8.076 | −3.132
Diff. | 0.012 | 0.012 | 0.001 | 0.001 | 0.021 | 0.007 | 0.000 | 0.030 | 6.263

Vertex ID | 7
Pose | X | Y | Theta
Map | −1.879 | 5.051 | −1.578
Robot | −1.865 | 5.053 | −1.549
Diff. | 0.013 | 0.002 | 0.030
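The Diff. rows above are consistent with a plain per-component absolute difference between the map vertex pose and the robot's estimated pose. A minimal sketch of that computation follows (ours, not the authors' code); note that the Theta entry of 6.263 rad at vertex 31 indicates the angles were not wrapped to [−π, π] before differencing.

```python
# Minimal sketch (not the authors' code): the Diff. rows in Table 2 are
# consistent with per-component absolute differences |map - robot|.
def pose_diff(map_pose, robot_pose):
    """Absolute X/Y/Theta differences between a map vertex and a robot pose."""
    return tuple(abs(m - r) for m, r in zip(map_pose, robot_pose))

# Vertex 5 from Table 2 -> (0.004, 0.010, 0.009), matching the printed row
# up to rounding. Vertex 31's Theta entry (|3.131 - (-3.132)| = 6.263 rad)
# shows that angles are not wrapped to [-pi, pi] before differencing.
print(pose_diff((1.973, 6.334, -3.140), (1.977, 6.324, -3.131)))
```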
Table 3. Minimum number of tags per localization system.

Localization System | Minimum Tags/Beacons Number | Detection Type
Qorvo | 5 | Tags
Eliko Kio | 4 | Tags
Marvelmind | 4 | Tags
EKF Beacons (AMR) | 2 | Beacons
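These minimums reflect the geometry of range-based positioning: in 2D, at least three non-collinear references are required for an unambiguous fix, and extra references add redundancy against occlusion. As a rough illustration only (the commercial systems ship their own proprietary solvers), a linearized least-squares multilateration can be sketched as below; the anchor layout and ranges are hypothetical.

```python
import numpy as np

# Illustrative 2D multilateration sketch; the commercial systems use their
# own (proprietary) solvers, so this only shows the underlying geometry.
def multilaterate(anchors, ranges):
    """Least-squares position from anchor coordinates (N x 2) and ranges (N,)."""
    anchors = np.asarray(anchors, dtype=float)
    ranges = np.asarray(ranges, dtype=float)
    x1, y1, r1 = anchors[0, 0], anchors[0, 1], ranges[0]
    # Linearize by subtracting the first circle equation from the others:
    # 2(xi - x1)x + 2(yi - y1)y = r1^2 - ri^2 + xi^2 - x1^2 + yi^2 - y1^2
    A = 2.0 * (anchors[1:] - anchors[0])
    b = (r1**2 - ranges[1:]**2
         + anchors[1:, 0]**2 - x1**2
         + anchors[1:, 1]**2 - y1**2)
    position, *_ = np.linalg.lstsq(A, b, rcond=None)
    return position

# Hypothetical anchor layout and exact ranges to a tag at (6, 4).
anchors = [(0.0, 0.0), (15.0, 0.0), (0.0, 10.0), (15.0, 10.0)]
tag = np.array([6.0, 4.0])
ranges = [float(np.hypot(*(tag - np.asarray(a)))) for a in anchors]
print(multilaterate(anchors, ranges))  # ~ [6. 4.]
```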
Table 4. Beacons data conversion.

Vertex ID | 5 | 6 | 10
Localization Systems | Delta X | X | Y | X | Y | X | Y
Marvelmind | 0.215 | 1.762 | 6.322 | −0.234 | 6.359 | 1.001 | −4.164
Qorvo | 0.16 | 1.817 | 6.322 | −0.179 | 6.358 | 0.946 | −4.163
Eliko Kio | 0.455 | 1.522 | 6.319 | −0.474 | 6.361 | 1.241 | −4.170

Vertex ID | 12 | 11 | 15
Localization Systems | Delta X | X | Y | X | Y | X | Y
Marvelmind | 0.215 | −3.740 | −4.040 | −1.561 | −4.091 | −0.759 | −2.496
Qorvo | 0.16 | −3.685 | −4.040 | −1.506 | −4.092 | −0.760 | −2.441
Eliko Kio | 0.455 | −3.980 | −4.036 | −1.801 | −4.086 | −0.754 | −2.736

Vertex ID | 26 | 16 | 9
Localization Systems | Delta X | X | Y | X | Y | X | Y
Marvelmind | 0.215 | −0.704 | −0.168 | −0.408 | 1.859 | −1.895 | −1.368
Qorvo | 0.16 | −0.705 | −0.113 | −0.409 | 1.914 | −1.897 | −1.313
Eliko Kio | 0.455 | −0.703 | −0.408 | −0.401 | 1.619 | −1.887 | −1.608

Vertex ID | 33 | 18 | 19
Localization Systems | Delta X | X | Y | X | Y | X | Y
Marvelmind | 0.215 | −2.479 | 7.358 | −2.584 | 3.989 | −2.698 | 1.517
Qorvo | 0.16 | −2.480 | 7.413 | −2.585 | 4.044 | −2.699 | 1.572
Eliko Kio | 0.455 | −2.476 | 7.118 | −2.583 | 3.749 | −2.693 | 1.277

Vertex ID | 14 | 13 | 17
Localization Systems | Delta X | X | Y | X | Y | X | Y
Marvelmind | 0.215 | −2.964 | −0.442 | −3.633 | −2.578 | −0.510 | 3.722
Qorvo | 0.16 | −2.967 | −0.387 | −3.634 | −2.523 | −0.455 | 3.722
Eliko Kio | 0.455 | −2.952 | −0.682 | −3.631 | −2.818 | −0.750 | 3.724

Vertex ID | 8 | 4 | 3
Localization Systems | Delta X | X | Y | X | Y | X | Y
Marvelmind | 0.215 | −1.896 | 2.678 | −3.345 | 9.087 | −1.739 | 9.080
Qorvo | 0.16 | −1.897 | 2.733 | −3.400 | 9.087 | −1.684 | 9.079
Eliko Kio | 0.455 | −1.892 | 2.438 | −3.105 | 9.087 | −1.979 | 9.085

Vertex ID | 2 | 1 | 31
Localization Systems | Delta X | X | Y | X | Y | X | Y
Marvelmind | 0.215 | 0.447 | 9.038 | 3.041 | 8.931 | 0.223 | 8.074
Qorvo | 0.16 | 0.502 | 9.037 | 3.096 | 8.930 | 0.278 | 8.075
Eliko Kio | 0.455 | 0.207 | 9.043 | 2.801 | 8.936 | −0.017 | 8.072

Vertex ID | 7
Localization Systems | Delta X | X | Y
Marvelmind | 0.215 | −1.861 | 4.838
Qorvo | 0.16 | −1.862 | 4.893
Eliko Kio | 0.455 | −1.855 | 4.598
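The converted coordinates in Table 4 are consistent with projecting each tag's longitudinal mounting offset (Delta X) along the robot's heading, i.e., moving the ground-truth pose from the robot's base frame to the tag's expected position. A minimal sketch of that reading of the numbers (not code from the paper):

```python
import math

# Sketch of the conversion Table 4's values are consistent with: project the
# tag's longitudinal mounting offset (Delta X) along the robot heading Theta.
def robot_to_tag(x, y, theta, delta_x):
    """Ground-truth pose of the robot base -> expected position of the tag."""
    return (x + delta_x * math.cos(theta),
            y + delta_x * math.sin(theta))

# Vertex 5, Marvelmind (Delta X = 0.215 m), robot pose (1.977, 6.324, -3.131):
# returns ~(1.762, 6.322), reproducing the corresponding Table 4 entry.
print(robot_to_tag(1.977, 6.324, -3.131, 0.215))
```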
Table 5. AMR localization system—beacons data.

Vertex ID | 5 | 6 | 10
Data | X | Y | Theta | X | Y | Theta | X | Y | Theta
AVG | 1.977 | 6.324 | −3.131 | −0.019 | 6.357 | 3.132 | 0.786 | −4.159 | −0.023
Std. Deviation | 0.001 | 0.001 | 0.000 | 0.002 | 0.001 | 0.000 | 0.002 | 0.001 | 0.000
Max | 1.979 | 6.327 | −3.131 | −0.014 | 6.358 | 3.132 | 0.788 | −4.157 | −0.023
Min | 1.975 | 6.323 | −3.131 | −0.021 | 6.355 | 3.131 | 0.782 | −4.160 | −0.024
Diff. | 0.003 | 0.004 | 0.000 | 0.007 | 0.003 | 0.001 | 0.006 | 0.004 | 0.001

Vertex ID | 12 | 11 | 15
Data | X | Y | Theta | X | Y | Theta | X | Y | Theta
AVG | −3.525 | −4.043 | 3.127 | −1.346 | −4.095 | 3.122 | −0.763 | −2.281 | −1.553
Std. Deviation | 0.001 | 0.001 | 0.000 | 0.001 | 0.000 | 0.000 | 0.001 | 0.002 | 0.000
Max | −3.522 | −4.042 | 3.127 | −1.345 | −4.094 | 3.122 | −0.758 | −2.274 | −1.553
Min | −3.526 | −4.044 | 3.126 | −1.349 | −4.095 | 3.122 | −0.764 | −2.283 | −1.553
Diff. | 0.004 | 0.002 | 0.002 | 0.004 | 0.001 | 0.000 | 0.006 | 0.009 | 0.001

Vertex ID | 26 | 16 | 9
Data | X | Y | Theta | X | Y | Theta | X | Y | Theta
AVG | −0.706 | 0.047 | −1.565 | −0.414 | 2.074 | −1.542 | −1.903 | −1.153 | −1.534
Std. Deviation | 0.001 | 0.001 | 0.000 | 0.004 | 0.001 | 0.000 | 0.002 | 0.001 | 0.000
Max | −0.703 | 0.050 | −1.565 | −0.409 | 2.074 | −1.541 | −1.899 | −1.148 | −1.534
Min | −0.706 | 0.045 | −1.565 | −0.425 | 2.072 | −1.544 | −1.905 | −1.155 | −1.535
Diff. | 0.003 | 0.005 | 0.001 | 0.017 | 0.003 | 0.002 | 0.006 | 0.007 | 0.001

Vertex ID | 33 | 18 | 19
Data | X | Y | Theta | X | Y | Theta | X | Y | Theta
AVG | −2.482 | 7.573 | −1.556 | −2.586 | 4.204 | −1.564 | −2.702 | 1.732 | −1.551
Std. Deviation | 0.003 | 0.002 | 0.000 | 0.003 | 0.002 | 0.000 | 0.001 | 0.002 | 0.000
Max | −2.479 | 7.575 | −1.555 | −2.582 | 4.206 | −1.563 | −2.700 | 1.734 | −1.551
Min | −2.490 | 7.565 | −1.557 | −2.594 | 4.196 | −1.564 | −2.706 | 1.725 | −1.552
Diff. | 0.011 | 0.010 | 0.001 | 0.012 | 0.010 | 0.001 | 0.005 | 0.009 | 0.001

Vertex ID | 14 | 13 | 17
Data | X | Y | Theta | X | Y | Theta | X | Y | Theta
AVG | −2.974 | −0.227 | −1.522 | −3.635 | −2.363 | −1.561 | −0.295 | 3.721 | 3.136
Std. Deviation | 0.002 | 0.001 | 0.000 | 0.002 | 0.003 | 0.001 | 0.002 | 0.000 | 0.000
Max | −2.973 | −0.226 | −1.522 | −3.630 | −2.352 | −1.560 | −0.293 | 3.721 | 3.136
Min | −2.979 | −0.231 | −1.523 | −3.638 | −2.366 | −1.564 | −0.301 | 3.720 | 3.136
Diff. | 0.007 | 0.005 | 0.000 | 0.008 | 0.014 | 0.004 | 0.008 | 0.002 | 0.000

Vertex ID | 8 | 4 | 3
Data | X | Y | Theta | X | Y | Theta | X | Y | Theta
AVG | −1.900 | 2.893 | −1.553 | −3.560 | 9.087 | 0.001 | −1.524 | 9.076 | 3.123
Std. Deviation | 0.001 | 0.002 | 0.000 | 0.001 | 0.004 | 0.000 | 0.001 | 0.001 | 0.000
Max | −1.897 | 2.898 | −1.552 | −3.557 | 9.091 | 0.001 | −1.520 | 9.078 | 3.124
Min | −1.902 | 2.891 | −1.553 | −3.561 | 9.075 | 0.001 | −1.525 | 9.075 | 3.123
Diff. | 0.005 | 0.007 | 0.000 | 0.003 | 0.016 | 0.001 | 0.005 | 0.002 | 0.000

Vertex ID | 2 | 1 | 31
Data | X | Y | Theta | X | Y | Theta | X | Y | Theta
AVG | 0.662 | 9.033 | 3.120 | 3.256 | 8.927 | 3.121 | 0.438 | 8.076 | −3.132
Std. Deviation | 0.003 | 0.001 | 0.000 | 0.002 | 0.005 | 0.001 | 0.002 | 0.001 | 0.000
Max | 0.672 | 9.035 | 3.121 | 3.258 | 8.939 | 3.122 | 0.440 | 8.081 | −3.131
Min | 0.659 | 9.030 | 3.120 | 3.251 | 8.921 | 3.120 | 0.432 | 8.075 | −3.132
Diff. | 0.013 | 0.004 | 0.001 | 0.008 | 0.017 | 0.002 | 0.008 | 0.006 | 0.001

Vertex ID | 7
Data | X | Y | Theta
AVG | −1.865 | 5.053 | −1.549
Std. Deviation | 0.001 | 0.001 | 0.000
Max | −1.864 | 5.056 | −1.548
Min | −1.866 | 5.052 | −1.549
Diff. | 0.003 | 0.004 | 0.000
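Tables 5–8 summarize, for every vertex, the repeated readings logged while the platform was stopped: mean, standard deviation, extrema, and the max–min spread reported in the Diff. rows (which match max − min up to rounding). A minimal sketch of these statistics, assuming the samples are available as an N × D array:

```python
import numpy as np

# Minimal sketch of the per-vertex statistics reported in Tables 5-8,
# assuming `samples` holds the N readings (X/Y/Theta or X/Y/Z) logged
# while the platform sat at one vertex.
def vertex_stats(samples):
    s = np.asarray(samples, dtype=float)
    return {
        "AVG": s.mean(axis=0),
        "Std. Deviation": s.std(axis=0),
        "Max": s.max(axis=0),
        "Min": s.min(axis=0),
        "Diff.": s.max(axis=0) - s.min(axis=0),  # max-min spread
    }
```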
Table 6. Indoor localization system—Marvelmind data.

Vertex ID | 5 | 6 | 10
Data | X | Y | Z | X | Y | Z | X | Y | Z
AVG | 10.698 | −3.511 | 2.540 | 11.381 | −1.710 | 2.572 | 1.125 | 0.213 | 2.609
Std. Deviation | 0.004 | 0.005 | 0.014 | 0.016 | 0.006 | 0.018 | 0.023 | 0.068 | 0.020
Max | 10.712 | −3.506 | 2.548 | 11.471 | −1.702 | 2.667 | 1.160 | 0.310 | 2.690
Min | 10.697 | −3.520 | 2.538 | 11.360 | −1.744 | 2.547 | 1.090 | 0.140 | 2.580
Diff. | 0.015 | 0.014 | 0.01 | 0.111 | 0.042 | 0.120 | 0.070 | 0.170 | 0.110

Vertex ID | 12 | 11 | 15
Data | X | Y | Z | X | Y | Z | X | Y | Z
AVG | 2.955 | 4.993 | 2.659 | 2.157 | 2.857 | 2.643 | 3.097 | 1.486 | 2.543
Std. Deviation | 0.004 | 0.010 | 0.002 | 0.001 | 0.002 | 0.001 | 0.002 | 0.005 | 0.004
Max | 2.968 | 5.019 | 2.663 | 2.160 | 2.863 | 2.646 | 3.103 | 1.516 | 2.548
Min | 2.948 | 4.980 | 2.654 | 2.155 | 2.852 | 2.641 | 3.093 | 1.481 | 2.525
Diff. | 0.020 | 0.039 | 0.009 | 0.005 | 0.011 | 0.005 | 0.01 | 0.035 | 0.023

Vertex ID | 26 | 16 | 9
Data | X | Y | Z | X | Y | Z | X | Y | Z
AVG | 5.120 | 0.704 | 2.605 | 7.255 | −0.098 | 2.472 | 4.492 | 2.170 | 2.515
Std. Deviation | 0.003 | 0.001 | 0.005 | 0.028 | 0.062 | 0.017 | 0.013 | 0.006 | 0.014
Max | 5.120 | 0.706 | 2.612 | 7.268 | −0.086 | 2.475 | 4.543 | 2.192 | 2.581
Min | 5.117 | 0.703 | 2.602 | 7.202 | −0.108 | 2.469 | 4.446 | 2.159 | 2.493
Diff. | 0.003 | 0.003 | 0.01 | 0.066 | 0.022 | 0.006 | 0.097 | 0.033 | 0.088

Vertex ID | 33 | 18 | 19
Data | X | Y | Z | X | Y | Z | X | Y | Z
AVG | 12.744 | −0.035 | 2.580 | 9.652 | 1.055 | 2.450 | 7.408 | 1.970 | 2.421
Std. Deviation | 0.004 | 0.003 | 0.004 | 0.001 | 0.018 | 0.010 | 0.005 | 0.017 | 0.011
Max | 12.757 | −0.031 | 2.591 | 9.654 | 1.132 | 2.466 | 7.444 | 2.066 | 2.454
Min | 12.739 | −0.042 | 2.575 | 9.649 | 1.035 | 2.405 | 7.405 | 1.960 | 2.353
Diff. | 0.018 | 0.011 | 0.016 | 0.005 | 0.097 | 0.061 | 0.039 | 0.106 | 0.101

Vertex ID | 14 | 13 | 17
Data | X | Y | Z | X | Y | Z | X | Y | Z
AVG | 5.702 | 2.912 | 2.524 | 3.927 | 4.211 | 2.618 | 9.010 | −0.568 | 2.412
Std. Deviation | 0.005 | 0.013 | 0.003 | 0.003 | 0.004 | 0.002 | 0.015 | 0.007 | 0.035
Max | 5.718 | 2.951 | 2.530 | 3.938 | 4.224 | 2.625 | 9.057 | −0.561 | 2.518
Min | 5.688 | 2.877 | 2.515 | 3.919 | 4.204 | 2.611 | 9.001 | −0.601 | 2.382
Diff. | 0.03 | 0.074 | 0.015 | 0.019 | 0.02 | 0.014 | 0.056 | 0.04 | 0.136

Vertex ID | 8 | 4 | 3
Data | X | Y | Z | X | Y | Z | X | Y | Z
AVG | 8.223 | 0.827 | 2.402 | 14.735 | −0.132 | 2.777 | 14.394 | −1.192 | 2.767
Std. Deviation | 0.001 | 0.002 | 0.004 | 0.001 | 0.004 | 0.001 | 0.020 | 0.048 | 0.011
Max | 8.226 | 0.832 | 2.425 | 14.735 | −0.132 | 2.777 | 14.504 | −1.164 | 2.848
Min | 8.216 | 0.821 | 2.393 | 14.733 | −0.143 | 2.775 | 14.364 | −1.528 | 2.760
Diff. | 0.01 | 0.011 | 0.032 | 0.002 | 0.011 | 0.002 | 0.14 | 0.364 | 0.088

Vertex ID | 2 | 1 | 31
Data | X | Y | Z | X | Y | Z | X | Y | Z
AVG | 13.660 | −3.194 | 2.739 | 12.736 | −5.578 | 2.805 | 12.801 | −2.727 | 2.685
Std. Deviation | 0.017 | 0.016 | 0.003 | 0.045 | 0.017 | 0.004 | 0.028 | 0.011 | 0.017
Max | 13.676 | −3.183 | 2.752 | 12.873 | −5.563 | 2.810 | 12.826 | −2.683 | 2.698
Min | 13.565 | −3.284 | 2.734 | 12.696 | −5.631 | 2.792 | 12.678 | −2.740 | 2.615
Diff. | 0.111 | 0.101 | 0.018 | 0.177 | 0.068 | 0.018 | 0.148 | 0.057 | 0.083

Vertex ID | 7
Data | X | Y | Z
AVG | 10.212 | 0.222 | 2.437
Std. Deviation | 0.001 | 0.002 | 0.002
Max | 10.213 | 0.228 | 2.442
Min | 10.210 | 0.217 | 2.433
Diff. | 0.003 | 0.011 | 0.009
Table 7. Indoor localization system—Eliko Kio data.

Vertex ID | 5 | 6 | 10
Data | X | Y | X | Y | X | Y
AVG | 10.743 | −3.547 | 11.569 | −1.664 | 0.805 | −0.273
Std. Deviation | 0.036 | 0.025 | 0.028 | 0.042 | 0.049 | 0.014
Max | 10.820 | −3.500 | 11.620 | −1.590 | 0.870 | −0.240
Min | 10.630 | −3.630 | 11.530 | −1.740 | 0.630 | −0.300
Diff. | 0.19 | 0.13 | 0.09 | 0.15 | 0.24 | 0.06

Vertex ID | 12 | 11 | 15
Data | X | Y | X | Y | X | Y
AVG | 2.859 | 5.335 | 1.925 | 2.776 | 2.557 | 1.239
Std. Deviation | 0.016 | 0.010 | 0.013 | 0.028 | 0.015 | 0.021
Max | 2.880 | 5.360 | 1.960 | 2.860 | 2.590 | 1.270
Min | 2.820 | 5.310 | 1.890 | 2.740 | 2.520 | 1.170
Diff. | 0.06 | 0.05 | 0.07 | 0.12 | 0.07 | 0.1

Vertex ID | 26 | 16 | 9
Data | X | Y | X | Y | X | Y
AVG | 4.842 | 0.431 | 6.591 | −0.601 | 3.975 | 1.965
Std. Deviation | 0.008 | 0.010 | 0.009 | 0.007 | 0.008 | 0.014
Max | 4.860 | 0.460 | 6.620 | −0.590 | 3.990 | 1.990
Min | 4.820 | 0.410 | 6.570 | −0.610 | 3.960 | 1.930
Diff. | 0.04 | 0.05 | 0.05 | 0.02 | 0.03 | 0.06

Vertex ID | 33 | 18 | 19
Data | X | Y | X | Y | X | Y
AVG | 12.696 | −0.295 | 9.486 | 1.001 | 7.161 | 2.003
Std. Deviation | 0.011 | 0.013 | 0.012 | 0.013 | 0.011 | 0.054
Max | 12.720 | −0.250 | 9.510 | 1.020 | 7.180 | 2.120
Min | 12.660 | −0.320 | 9.460 | 0.970 | 7.150 | 1.950
Diff. | 0.06 | 0.07 | 0.05 | 0.05 | 0.03 | 0.17

Vertex ID | 14 | 13 | 17
Data | X | Y | X | Y | X | Y
AVG | 5.344 | 2.682 | 3.693 | 4.382 | 9.054 | −0.480
Std. Deviation | 0.006 | 0.030 | 0.013 | 0.047 | 0.010 | 0.014
Max | 5.350 | 2.730 | 3.720 | 4.450 | 9.070 | −0.460
Min | 5.330 | 2.620 | 3.670 | 4.300 | 9.030 | −0.520
Diff. | 0.02 | 0.11 | 0.05 | 0.15 | 0.04 | 0.06

Vertex ID | 8 | 4 | 3
Data | X | Y | X | Y | X | Y
AVG | 7.957 | 0.766 | 15.037 | −0.423 | 14.773 | −1.055
Std. Deviation | 0.012 | 0.019 | 0.092 | 0.039 | 0.033 | 0.055
Max | 7.990 | 0.840 | 15.070 | −0.210 | 14.840 | −0.890
Min | 7.940 | 0.740 | 14.500 | −0.470 | 14.730 | −1.210
Diff. | 0.05 | 0.1 | 0.57 | 0.26 | 0.11 | 0.32

Vertex ID | 2 | 1 | 31
Data | X | Y | X | Y | X | Y
AVG | 14.165 | −2.815 | 12.957 | −5.610 | 13.078 | −2.688
Std. Deviation | 0.005 | 0.009 | 0.025 | 0.013 | 0.035 | 0.041
Max | 14.170 | −2.800 | 13.000 | −5.590 | 13.150 | −2.610
Min | 14.160 | −2.830 | 12.910 | −5.640 | 13.010 | −2.770
Diff. | 0.01 | 0.03 | 0.09 | 0.05 | 0.14 | 0.16

Vertex ID | 7
Data | X | Y
AVG | 9.863 | −0.221
Std. Deviation | 0.013 | 0.011
Max | 9.890 | −0.190
Min | 9.820 | −0.240
Diff. | 0.07 | 0.05
Table 8. Indoor localization system—Qorvo data.

Vertex ID | 5 | 6 | 10
Data | X | Y | X | Y | X | Y
AVG | 10.269 | −2.818 | 11.070 | −1.371 | 0.567 | −0.165
Std. Deviation | 0.235 | 0.117 | 0.153 | 0.124 | 0.113 | 0.144
Max | 10.897 | −2.383 | 11.741 | −1.492 | 0.742 | 0.326
Min | 10.025 | −3.253 | 10.934 | −1.254 | 0.341 | −0.474
Diff. | 0.872 | 0.87 | 0.807 | 0.238 | 0.401 | 0.8

Vertex ID | 12 | 11 | 15
Data | X | Y | X | Y | X | Y
AVG | 2.824 | 4.662 | 2.214 | 2.924 | 3.285 | 1.605
Std. Deviation | 0.036 | 0.093 | 0.014 | 0.025 | 0.092 | 0.050
Max | 2.978 | 5.232 | 2.252 | 2.992 | 3.503 | 1.705
Min | 2.753 | 4.764 | 2.179 | 2.857 | 3.140 | 1.284
Diff. | 0.225 | 0.468 | 0.073 | 0.135 | 0.363 | 0.421

Vertex ID | 26 | 16 | 9
Data | X | Y | X | Y | X | Y
AVG | 5.453 | 0.881 | 7.195 | −0.020 | 4.676 | 2.359
Std. Deviation | 0.016 | 0.034 | 0.018 | 0.032 | 0.081 | 0.048
Max | 5.519 | 1.083 | 7.306 | 0.182 | 4.782 | 2.42
Min | 5.385 | 0.787 | 7.148 | −0.266 | 4.553 | 2.224
Diff. | 0.134 | 0.296 | 0.158 | 0.448 | 0.229 | 0.196

Vertex ID | 33 | 18 | 19
Data | X | Y | X | Y | X | Y
AVG | 12.197 | −0.002 | 9.607 | 1.294 | 6.762 | 2.596
Std. Deviation | 0.121 | 0.148 | 0.091 | 0.059 | 0.052 | 0.088
Max | 12.395 | 0.285 | 9.852 | 1.385 | 6.854 | 2.734
Min | 11.941 | −0.196 | 9.341 | 1.213 | 6.551 | 2.346
Diff. | 0.454 | 0.481 | 0.511 | 0.172 | 0.303 | 0.388

Vertex ID | 14 | 13 | 17
Data | X | Y | X | Y | X | Y
AVG | 5.544 | 3.475 | 4.194 | 4.275 | 8.555 | −0.187
Std. Deviation | 0.103 | 0.095 | 0.096 | 0.082 | 0.045 | 0.023
Max | 5.648 | 3.546 | 4.341 | 4.451 | 8.594 | −0.146
Min | 5.384 | 3.354 | 4.023 | 4.123 | 8.503 | −0.321
Diff. | 0.264 | 0.192 | 0.318 | 0.328 | 0.091 | 0.175

Vertex ID | 8 | 4 | 3
Data | X | Y | X | Y | X | Y
AVG | 7.858 | 1.060 | 14.537 | −0.129 | 13.874 | −1.662
Std. Deviation | 0.012 | 0.019 | 0.042 | 0.078 | 0.025 | 0.031
Max | 7.921 | 1.086 | 14.795 | 0.235 | 14.234 | −1.587
Min | 7.536 | 1.042 | 14.203 | −0.421 | 13.678 | −1.753
Diff. | 0.385 | 0.044 | 0.592 | 0.656 | 0.556 | 0.166

Vertex ID | 2 | 1 | 31
Data | X | Y | X | Y | X | Y
AVG | 12.966 | −3.821 | 12.457 | −4.917 | 12.579 | −2.395
Std. Deviation | 0.034 | 0.017 | 0.045 | 0.061 | 0.036 | 0.054
Max | 13.029 | −3.754 | 12.789 | −4.863 | 12.754 | −2.152
Min | 12.753 | −4.24 | 12.124 | −5.512 | 12.452 | −2.421
Diff. | 0.276 | 0.486 | 0.665 | 0.649 | 0.302 | 0.269

Vertex ID | 7
Data | X | Y
AVG | 10.164 | 0.122
Std. Deviation | 0.028 | 0.036
Max | 10.251 | 0.156
Min | 10.031 | 0.063
Diff. | 0.22 | 0.093
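Because each beacon system reports positions in its own anchor frame (Tables 6–8), the measured point sets must be aligned to the map frame before they can be compared with the converted ground truth (Figures 12–14 and the "new points" in Tables 9, 11 and 13). One standard way to compute such an alignment is a least-squares rigid 2D transform in the Kabsch/Umeyama style; the sketch below is only an assumption about how this step could be implemented, not the authors' procedure.

```python
import numpy as np

# Least-squares rigid 2D alignment (Kabsch/Umeyama style) of a beacon
# system's point set to the map frame. A sketch under the assumption that
# a rigid fit was used; the paper's exact alignment procedure may differ.
def rigid_align(src, dst):
    """Return R (2x2), t (2,) minimizing sum ||R @ src_i + t - dst_i||^2."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:  # enforce a proper rotation (no reflection)
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

# Usage: R, t = rigid_align(measured_points, converted_ground_truth)
#        aligned = measured_points @ R.T + t
```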
Table 9. Indoor localization system—Marvelmind new points.

Vertex ID | 5 | 6 | 10
Data | X | Y | X | Y | X | Y
New Point | 1.573 | 6.317 | −0.352 | 6.386 | 1.115 | −3.945

Vertex ID | 12 | 11 | 15
Data | X | Y | X | Y | X | Y
New Point | −3.999 | −3.744 | −1.720 | −3.815 | −0.723 | −2.485

Vertex ID | 26 | 16 | 9
Data | X | Y | X | Y | X | Y
New Point | −0.631 | −0.318 | −0.556 | 1.961 | −1.818 | −1.383

Vertex ID | 33 | 18 | 19
Data | X | Y | X | Y | X | Y
New Point | −2.375 | 7.140 | −2.416 | 3.862 | −2.563 | 1.443

Vertex ID | 14 | 13 | 17
Data | X | Y | X | Y | X | Y
New Point | −2.909 | −0.475 | −3.570 | −2.573 | −0.673 | 3.774

Vertex ID | 8 | 4 | 3
Data | X | Y | X | Y | X | Y
New Point | −1.742 | 2.582 | −2.922 | 9.057 | −1.808 | 9.074

Vertex ID | 2 | 1 | 31
Data | X | Y | X | Y | X | Y
New Point | 0.323 | 9.021 | 2.878 | 8.910 | 0.156 | 8.057

Vertex ID | 7
Data | X | Y
New Point | −1.807 | 4.660
Table 10. Marvelmind localization system—errors.

Vertex ID | 5 | 6 | 10
Data | X | Y | X | Y | X | Y
Diff. | −0.189 | −0.005 | −0.118 | 0.027 | 0.114 | 0.219

Vertex ID | 12 | 11 | 15
Data | X | Y | X | Y | X | Y
Diff. | −0.259 | 0.296 | −0.159 | 0.276 | 0.036 | 0.011

Vertex ID | 26 | 16 | 9
Data | X | Y | X | Y | X | Y
Diff. | 0.073 | −0.150 | −0.148 | 0.102 | 0.077 | −0.015

Vertex ID | 33 | 18 | 19
Data | X | Y | X | Y | X | Y
Diff. | 0.104 | −0.218 | 0.168 | −0.127 | 0.135 | −0.074

Vertex ID | 14 | 13 | 17
Data | X | Y | X | Y | X | Y
Diff. | 0.055 | −0.033 | 0.063 | 0.005 | −0.163 | 0.052

Vertex ID | 8 | 4 | 3
Data | X | Y | X | Y | X | Y
Diff. | 0.154 | −0.097 | 0.423 | −0.030 | −0.069 | −0.006

Vertex ID | 2 | 1 | 31
Data | X | Y | X | Y | X | Y
Diff. | −0.124 | −0.017 | −0.163 | −0.021 | −0.067 | −0.017

Vertex ID | 7
Data | X | Y
Diff. | 0.054 | −0.179
Table 11. Indoor localization system—Eliko Kio new points.

Vertex ID | 5 | 6 | 10
Data | X | Y | X | Y | X | Y
New Point | 1.455 | 6.283 | −0.591 | 6.493 | 1.379 | −4.180

Vertex ID | 12 | 11 | 15
Data | X | Y | X | Y | X | Y
New Point | −4.589 | −3.941 | −1.867 | −4.047 | −0.597 | −2.975

Vertex ID | 26 | 16 | 9
Data | X | Y | X | Y | X | Y
New Point | −0.527 | −0.553 | −0.080 | 1.428 | −1.722 | −1.847

Vertex ID | 33 | 18 | 19
Data | X | Y | X | Y | X | Y
New Point | −2.239 | 7.147 | −2.491 | 3.694 | −2.733 | 1.174

Vertex ID | 14 | 13 | 17
Data | X | Y | X | Y | X | Y
New Point | −2.823 | −0.764 | −3.937 | −2.856 | −0.949 | 3.736

Vertex ID | 8 | 4 | 3
Data | X | Y | X | Y | X | Y
New Point | −1.799 | 2.311 | −2.834 | 9.415 | −2.151 | 9.357

Vertex ID | 2 | 1 | 31
Data | X | Y | X | Y | X | Y
New Point | −0.290 | 9.317 | 2.741 | 9.022 | −0.078 | 8.243

Vertex ID | 7
Data | X | Y
New Point | −1.443 | 4.427
Table 12. Eliko Kio localization system—errors.

Vertex ID | 5 | 6 | 10
Data | X | Y | X | Y | X | Y
Diff. | −0.068 | −0.036 | −0.117 | 0.132 | 0.138 | −0.010

Vertex ID | 12 | 11 | 15
Data | X | Y | X | Y | X | Y
Diff. | −0.609 | 0.095 | −0.066 | 0.039 | 0.157 | −0.239

Vertex ID | 26 | 16 | 9
Data | X | Y | X | Y | X | Y
Diff. | 0.176 | −0.145 | 0.322 | −0.191 | 0.165 | −0.239

Vertex ID | 33 | 18 | 19
Data | X | Y | X | Y | X | Y
Diff. | 0.237 | 0.029 | 0.092 | −0.055 | −0.040 | −0.103

Vertex ID | 14 | 13 | 17
Data | X | Y | X | Y | X | Y
Diff. | 0.129 | −0.082 | −0.306 | −0.038 | −0.199 | 0.012

Vertex ID | 8 | 4 | 3
Data | X | Y | X | Y | X | Y
Diff. | 0.093 | −0.128 | 0.271 | 0.328 | −0.172 | 0.272

Vertex ID | 2 | 1 | 31
Data | X | Y | X | Y | X | Y
Diff. | −0.497 | 0.274 | −0.060 | 0.086 | −0.061 | 0.171

Vertex ID | 7
Data | X | Y
Diff. | 0.412 | −0.171
Table 13. Indoor localization system—Qorvo new points.

Vertex ID | 5 | 6 | 10
Data | X | Y | X | Y | X | Y
New Point | 1.099 | 5.979 | −0.531 | 6.259 | 1.793 | −4.055

Vertex ID | 12 | 11 | 15
Data | X | Y | X | Y | X | Y
New Point | −3.508 | −3.516 | −1.666 | −3.518 | −0.774 | −2.072

Vertex ID | 26 | 16 | 9
Data | X | Y | X | Y | X | Y
New Point | −0.805 | 0.213 | −0.529 | 2.155 | −1.945 | −1.008

Vertex ID | 33 | 18 | 19
Data | X | Y | X | Y | X | Y
New Point | −2.195 | 6.871 | −2.565 | 3.999 | −2.856 | 0.884

Vertex ID | 14 | 13 | 17
Data | X | Y | X | Y | X | Y
New Point | −3.284 | −0.556 | −3.595 | −2.095 | −0.820 | 3.494

Vertex ID | 8 | 4 | 3
Data | X | Y | X | Y | X | Y
New Point | −1.767 | 2.425 | −2.847 | 9.122 | −1.181 | 9.002

Vertex ID | 2 | 1 | 31
Data | X | Y | X | Y | X | Y
New Point | 1.157 | 8.856 | 2.359 | 8.737 | −0.062 | 8.021

Vertex ID | 7
Data | X | Y
New Point | −1.642 | 4.911
Table 14. Qorvo localization system—errors.

Vertex ID | 5 | 6 | 10
Data | X | Y | X | Y | X | Y
Diff. | −0.718 | −0.343 | −0.352 | −0.100 | 0.847 | 0.108

Vertex ID | 12 | 11 | 15
Data | X | Y | X | Y | X | Y
Diff. | 0.177 | 0.525 | −0.160 | 0.574 | −0.014 | 0.369

Vertex ID | 26 | 16 | 9
Data | X | Y | X | Y | X | Y
Diff. | −0.100 | 0.326 | −0.120 | 0.241 | −0.048 | 0.305

Vertex ID | 33 | 18 | 19
Data | X | Y | X | Y | X | Y
Diff. | 0.285 | −0.542 | 0.020 | −0.045 | −0.157 | −0.689

Vertex ID | 14 | 13 | 17
Data | X | Y | X | Y | X | Y
Diff. | −0.317 | −0.169 | 0.040 | 0.429 | −0.365 | −0.228

Vertex ID | 8 | 4 | 3
Data | X | Y | X | Y | X | Y
Diff. | 0.130 | −0.308 | 0.553 | 0.035 | 0.503 | −0.077

Vertex ID | 2 | 1 | 31
Data | X | Y | X | Y | X | Y
Diff. | 0.655 | −0.181 | −0.737 | −0.193 | −0.340 | −0.054

Vertex ID | 7
Data | X | Y
Diff. | 0.220 | 0.018
Table 15. Indoor localization systems—error points comparison.

Vertex ID | 5 | 6 | 10
Localization Systems | X | Y | X | Y | X | Y
Marvelmind | −0.189 | −0.005 | −0.118 | 0.027 | 0.114 | 0.219
Qorvo | −0.718 | −0.343 | −0.352 | −0.100 | 0.847 | 0.108
Eliko Kio | −0.068 | −0.036 | −0.117 | 0.132 | 0.138 | −0.010

Vertex ID | 12 | 11 | 15
Localization Systems | X | Y | X | Y | X | Y
Marvelmind | −0.259 | 0.296 | −0.159 | 0.276 | 0.036 | 0.011
Qorvo | 0.177 | 0.525 | −0.160 | 0.574 | −0.014 | 0.369
Eliko Kio | −0.609 | 0.095 | −0.066 | 0.039 | 0.157 | −0.239

Vertex ID | 26 | 16 | 9
Localization Systems | X | Y | X | Y | X | Y
Marvelmind | 0.073 | −0.150 | −0.148 | 0.102 | 0.077 | −0.015
Qorvo | −0.100 | 0.326 | −0.120 | 0.241 | −0.048 | 0.305
Eliko Kio | 0.176 | −0.145 | 0.322 | −0.191 | 0.165 | −0.239

Vertex ID | 33 | 18 | 19
Localization Systems | X | Y | X | Y | X | Y
Marvelmind | 0.104 | −0.218 | 0.168 | −0.127 | 0.135 | −0.074
Qorvo | 0.285 | −0.542 | 0.020 | −0.045 | −0.157 | −0.689
Eliko Kio | 0.237 | 0.029 | 0.092 | −0.055 | −0.040 | −0.103

Vertex ID | 14 | 13 | 17
Localization Systems | X | Y | X | Y | X | Y
Marvelmind | 0.055 | −0.033 | 0.063 | 0.005 | −0.163 | 0.052
Qorvo | −0.317 | −0.169 | 0.040 | 0.429 | −0.365 | −0.228
Eliko Kio | 0.129 | −0.082 | −0.306 | −0.038 | −0.199 | 0.012

Vertex ID | 8 | 4 | 3
Localization Systems | X | Y | X | Y | X | Y
Marvelmind | 0.154 | −0.097 | 0.423 | −0.030 | −0.069 | −0.006
Qorvo | 0.130 | −0.308 | 0.553 | 0.035 | 0.503 | −0.077
Eliko Kio | 0.093 | −0.128 | 0.271 | 0.328 | −0.172 | 0.272

Vertex ID | 2 | 1 | 31
Localization Systems | X | Y | X | Y | X | Y
Marvelmind | −0.124 | −0.017 | −0.163 | −0.021 | −0.067 | −0.017
Qorvo | 0.655 | −0.181 | −0.737 | −0.193 | −0.340 | −0.054
Eliko Kio | −0.497 | 0.274 | −0.060 | 0.086 | −0.061 | 0.171

Vertex ID | 7
Localization Systems | X | Y
Marvelmind | 0.054 | −0.179
Qorvo | 0.220 | 0.018
Eliko Kio | 0.412 | −0.171
Table 16. Euclidean distances to the ground truth system.

Vertex ID | 5 | 6 | 10
Marvelmind | 0.189 | 0.121 | 0.247
Qorvo | 0.796 | 0.366 | 0.854
Eliko Kio | 0.077 | 0.176 | 0.138

Vertex ID | 12 | 11 | 15
Marvelmind | 0.393 | 0.318 | 0.038
Qorvo | 0.554 | 0.596 | 0.369
Eliko Kio | 0.616 | 0.077 | 0.286

Vertex ID | 26 | 16 | 9
Marvelmind | 0.167 | 0.180 | 0.078
Qorvo | 0.341 | 0.269 | 0.309
Eliko Kio | 0.228 | 0.374 | 0.290

Vertex ID | 33 | 18 | 19
Marvelmind | 0.242 | 0.211 | 0.154
Qorvo | 0.612 | 0.049 | 0.707
Eliko Kio | 0.239 | 0.107 | 0.110

Vertex ID | 14 | 13 | 17
Marvelmind | 0.064 | 0.063 | 0.171
Qorvo | 0.359 | 0.431 | 0.430
Eliko Kio | 0.153 | 0.308 | 0.199

Vertex ID | 8 | 4 | 3
Marvelmind | 0.182 | 0.424 | 0.069
Qorvo | 0.334 | 0.554 | 0.509
Eliko Kio | 0.158 | 0.425 | 0.322

Vertex ID | 2 | 1 | 31
Marvelmind | 0.125 | 0.164 | 0.069
Qorvo | 0.679 | 0.762 | 0.344
Eliko Kio | 0.567 | 0.105 | 0.182

Vertex ID | 7
Marvelmind | 0.187
Qorvo | 0.221
Eliko Kio | 0.446
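The distances in Table 16 are the Euclidean norms of the per-axis errors in Table 15, d = √(Δx² + Δy²); for example, for Qorvo at vertex 5, √((−0.718)² + (−0.343)²) ≈ 0.796. A one-line check:

```python
import math

# Table 16 entries are the Euclidean norms of Table 15's per-axis errors.
def euclidean_error(dx, dy):
    return math.hypot(dx, dy)

# Marvelmind and Qorvo at vertex 5 (Table 15) -> 0.189 and 0.796,
# matching the corresponding Table 16 entries.
print(euclidean_error(-0.189, -0.005), euclidean_error(-0.718, -0.343))
```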