Research update from DTU - When sensors work together
In January, researchers from DTU submitted their latest results in the ARLI project to the European Space Agency (ESA).
The work focuses on sensor fusion – combining different types of sensors to achieve more accurate and robust navigation in urban environments.
LiDAR scan of Rosenkrantzgade in Aarhus. The original dataset is monochrome.
January 2026
The results clearly show that the robot cannot navigate reliably using GNSS (satellite positioning) alone. In dense urban streets such as Fiskergade, Rosenkrantzgade, and Selmersvej, satellite signals are blocked and reflected by the surrounding buildings. This causes the reported position to “jump” and temporarily deviate from the robot’s actual location.
Data from Rosenkrantzgade: This figure shows variations in GNSS positioning in a dense urban street with tall buildings. In reality, the robot drove steadily along the pavement, but the satellite positions “jump” due to signal disturbances.
When GPS is not enough
Test data illustrates how GNSS positions drift away from the pavement, even though the robot was driving steadily. When GNSS is combined with an IMU (Inertial Measurement Unit) – a motion sensor measuring acceleration and rotation – the position becomes significantly more stable.
The IMU forms part of an INS (Inertial Navigation System), which continuously calculates the robot’s position and direction based on motion data. This allows the robot to maintain its course during periods of weak or disturbed satellite signals.
Together, GNSS and INS provide much more robust navigation.
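The idea can be sketched as a minimal one-dimensional Kalman filter: the IMU acceleration is dead-reckoned forward (the INS step), and each GNSS fix then corrects the drift. The noise levels and the simulated drive below are illustrative stand-ins, not the project's actual parameters or data:

```python
import random

def fuse_gnss_ins(gnss_positions, imu_accels, dt=0.1,
                  gnss_var=4.0, process_var=0.05):
    """Minimal 1-D Kalman filter: predict position by integrating the
    IMU acceleration (INS step), then correct with each GNSS fix."""
    x, v = gnss_positions[0], 0.0   # state: position [m], velocity [m/s]
    p = gnss_var                    # position uncertainty
    fused = []
    for z, a in zip(gnss_positions, imu_accels):
        # Predict: integrate acceleration into velocity and position
        x += v * dt + 0.5 * a * dt * dt
        v += a * dt
        p += process_var
        # Correct: blend in the (possibly jumpy) GNSS measurement
        k = p / (p + gnss_var)      # Kalman gain
        x += k * (z - x)
        p *= 1.0 - k
        fused.append(x)
    return fused

# Simulated drive: the robot accelerates, then cruises steadily
random.seed(1)
dt, n = 0.1, 300
true_accel = [0.5 if i < 50 else 0.0 for i in range(n)]
true_pos, x, v = [], 0.0, 0.0
for a in true_accel:
    x += v * dt + 0.5 * a * dt * dt
    v += a * dt
    true_pos.append(x)

# GNSS: 2 m noise plus an occasional multipath "jump"; IMU: small noise
gnss = [p + random.gauss(0, 2.0) + (10.0 if i % 60 == 30 else 0.0)
        for i, p in enumerate(true_pos)]
imu = [a + random.gauss(0, 0.1) for a in true_accel]

fused = fuse_gnss_ins(gnss, imu, dt)
err_raw = sum(abs(g - t) for g, t in zip(gnss, true_pos)) / n
err_fused = sum(abs(f - t) for f, t in zip(fused, true_pos)) / n
```

In this toy setup the fused track stays markedly closer to the true trajectory than the raw GNSS fixes, which is exactly the behaviour seen in the Selmersvej data.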
Data from Selmersvej: The yellow dots show the position based on GNSS alone. The red line shows the position when GNSS is combined with an IMU/INS – a motion-based navigation system that measures the robot’s acceleration and rotation. The figure illustrates a solution in which multiple sensors work together.
Camera and LiDAR improve precision
In Rosenkrantzgade, researchers have also demonstrated how an upward-facing camera can help identify and filter out unreliable GNSS measurements, further improving positioning accuracy.
Data from Rosenkrantzgade: Using an upward-facing camera, the system can identify GNSS signals affected by reflections from buildings (so-called NLOS – Non Line Of Sight). These measurements can be filtered out, resulting in more accurate positioning.
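The camera-based rejection can be sketched as a sky-mask test: a satellite whose direction falls below the building edge seen by the upward-facing camera is flagged as NLOS and dropped. The mask format and the satellite list below are hypothetical stand-ins for what the real system extracts from the image:

```python
def is_line_of_sight(az_deg, el_deg, sky_mask):
    """A satellite counts as line-of-sight if its elevation is above the
    building edge the camera sees in that 10-degree azimuth sector."""
    sector = int(az_deg // 10) % 36
    return el_deg > sky_mask[sector]

def filter_nlos(satellites, sky_mask):
    """Keep measurements from satellites with a clear view of the sky;
    the rest likely arrive via reflections (NLOS) and are dropped."""
    return [s for s in satellites
            if is_line_of_sight(s["az"], s["el"], sky_mask)]

# Hypothetical mask for a narrow street: buildings block up to 55 degrees
# of elevation toward the north, open sky to the south
sky_mask = [55 if (s <= 8 or s >= 27) else 15 for s in range(36)]
satellites = [
    {"id": "G01", "az": 40.0,  "el": 70.0},   # high above the rooftops
    {"id": "G07", "az": 30.0,  "el": 35.0},   # behind a building: NLOS
    {"id": "G12", "az": 180.0, "el": 25.0},   # open sky to the south
]
usable = filter_nlos(satellites, sky_mask)
```

Only the two line-of-sight satellites survive the filter; the position solution is then computed from cleaner measurements.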
Another key development is LiDAR odometry, where the robot estimates its movement directly from its own 3D scans. In principle, this allows navigation without GNSS. Test results show that the robot follows its route convincingly, although minor deviations from reference measurements remain.
Data from Rosenkrantzgade: This figure demonstrates LiDAR odometry – a method where the robot estimates its movement by comparing successive 3D scans of the surroundings. The KISS algorithm (Keep It Simple and Straightforward) provides an efficient computational approach for this purpose. The method enables route tracking without GNSS. Minor deviations from the reference measurements are visible, but the route is reproduced convincingly.
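The core idea of LiDAR odometry, recovering motion by comparing successive scans, can be illustrated with a toy translation-only point-to-point ICP step. This is a deliberately stripped-down sketch, not the KISS pipeline itself, and the landmark coordinates are made up:

```python
def icp_translation(prev_scan, curr_scan, iters=10):
    """Estimate the 2-D translation aligning curr_scan onto prev_scan
    by repeated nearest-neighbour matching (point-to-point ICP,
    translation only, brute-force matching)."""
    tx, ty = 0.0, 0.0
    for _ in range(iters):
        dx_sum = dy_sum = 0.0
        for x, y in curr_scan:
            px, py = x + tx, y + ty
            # nearest point in the previous scan
            qx, qy = min(prev_scan,
                         key=lambda q: (q[0] - px) ** 2 + (q[1] - py) ** 2)
            dx_sum += qx - px
            dy_sum += qy - py
        tx += dx_sum / len(curr_scan)
        ty += dy_sum / len(curr_scan)
    return tx, ty

# Well-separated landmarks (e.g. lamp posts, corners) seen in two scans;
# the robot moved (+0.30, -0.10) m between them, so in its frame the
# static landmarks appear shifted by (-0.30, +0.10)
prev_scan = [(0.0, 0.0), (5.0, 1.0), (2.0, 7.0), (8.0, 4.0), (3.0, 3.0)]
curr_scan = [(x - 0.30, y + 0.10) for x, y in prev_scan]

dx, dy = icp_translation(prev_scan, curr_scan)  # recovered robot motion
```

Chaining such scan-to-scan estimates gives a trajectory without any satellite input, which is why the small per-step errors visible in the figure slowly accumulate as deviations from the reference.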
Multiple sensors – reliable navigation
In the final phase of the project, DTU is working to combine GNSS, INS, and LiDAR in a more advanced sensor fusion framework. The goal is a system capable of stable navigation in dense urban areas, under bridges, and in locations with intermittent GPS coverage.
If autonomous robots are to operate safely in public urban spaces, precise and reliable positioning is essential. The preliminary results demonstrate that combining multiple sensors is key to achieving this.
Key sensors in ARLI
GNSS (Global Navigation Satellite System)
Satellite-based positioning (e.g. GPS). Provides global coordinates but can be inaccurate in dense urban areas or under bridges.
IMU (Inertial Measurement Unit)
A motion sensor that measures acceleration and rotation.
INS (Inertial Navigation System)
A navigation system using IMU data to continuously calculate position and direction – even when satellite signals are unstable.
LiDAR (Light Detection and Ranging)
Laser-based 3D scanning that maps surroundings as a point cloud. Used for both modelling and navigation.
Sensor fusion
The combination of data from multiple sensors to achieve more accurate and robust positioning than any single sensor alone.
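As a minimal numerical illustration of the principle (not the project's actual fusion framework), two independent position estimates can be combined by inverse-variance weighting; the fused estimate is always more certain than either input alone:

```python
def fuse_estimates(x1, var1, x2, var2):
    """Inverse-variance weighted fusion of two independent estimates
    of the same quantity; returns the fused value and its variance."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    x = (w1 * x1 + w2 * x2) / (w1 + w2)
    return x, 1.0 / (w1 + w2)

# Illustrative numbers: a noisy GNSS fix versus a tighter
# LiDAR-odometry estimate of the same along-street position
x, var = fuse_estimates(12.0, 4.0, 10.5, 0.25)
```

The fused position lands close to the more trustworthy LiDAR estimate, and the fused variance drops below that of either sensor on its own.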