
Edge-Based Sensor Fusion


Edge-Based Sensor Fusion focuses on perception technology for cognitive understanding, localisation, and mapping of a real environment. Multiple sensory inputs, such as visual, Light Detection and Ranging (LIDAR), and positioning data, are processed on the edge device to determine a mobile robot's position relative to a mapped environment.

Edge-Based Sensor Fusion provides a sensor fusion platform utilising multiple sensory inputs such as 3D LIDAR, depth cameras, an Inertial Measurement Unit (IMU), and a Global Navigation Satellite System (GNSS) receiver. It provides simultaneous localisation and mapping of a real environment using LIDAR, visual cues for positioning, and target/obstacle detection for navigation by a mobile robot. A small-form-factor edge device is integrated for data processing.
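As an illustration of this kind of multi-sensor fusion, the sketch below blends IMU dead reckoning with absolute GNSS fixes using a simple complementary filter. This is a minimal sketch of the general technique only; the function name, parameters, and the choice of a complementary filter are assumptions, not details of the platform itself.

```python
import numpy as np

def fuse_gnss_imu(gnss_pos, imu_vel, dt, alpha=0.98):
    """Complementary filter sketch (illustrative, not the platform's API):
    dead-reckon with IMU-derived velocity, then pull the estimate back
    towards the absolute GNSS fix at each step to bound drift."""
    est = gnss_pos[0].astype(float)          # start from the first GNSS fix
    track = [est.copy()]
    for k in range(1, len(gnss_pos)):
        predicted = est + imu_vel[k - 1] * dt              # IMU dead reckoning
        est = alpha * predicted + (1 - alpha) * gnss_pos[k]  # GNSS correction
        track.append(est.copy())
    return np.array(track)

# Usage: a robot moving east at 1 m/s, sampled at 1 Hz.
gnss = np.array([[float(i), 0.0] for i in range(5)])  # noiseless fixes here
vel = np.tile([1.0, 0.0], (4, 1))                     # IMU velocity estimates
track = fuse_gnss_imu(gnss, vel, dt=1.0)
```

A higher `alpha` trusts the smooth IMU prediction more between fixes; a lower one snaps harder to GNSS. Production systems typically replace this with a Kalman filter, which weights each source by its estimated uncertainty instead of a fixed blend.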


Indoor and Outdoor Environment Site Mapping: Manufacturing Plant, Plantation, Campus, Construction Sites


Edge-Based Sensor Fusion provides the following features:

  • Sensor Fusion for Multi-Sensory Inputs
    Multiple sensory inputs are acquired and fused on a platform that supports modern sensors such as mono and stereo cameras, LIDAR, IMU, and GNSS.

  • Data Fusion Utilising LIDAR and Camera for Mapping
    Positioning is enabled through LIDAR-based simultaneous localisation and mapping and visual-based simultaneous localisation and mapping.

  • Target/Obstacle Detection
    Post-processing of the generated map is offered for target or obstacle detection in the navigation system.
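The obstacle-detection step above can be sketched as a simple filter over a LIDAR point cloud: keep returns that sit above an assumed ground plane and within the robot's planning range. The function name and thresholds below are illustrative assumptions, not the platform's actual pipeline.

```python
import numpy as np

def detect_obstacles(points, ground_z=0.05, max_range=10.0):
    """Flag LIDAR returns as obstacle candidates (illustrative sketch).

    points     : (N, 3) array of x, y, z returns in the robot frame
    ground_z   : assumed ground-plane height threshold in metres
    max_range  : horizontal range of interest for navigation, in metres
    """
    pts = np.asarray(points, dtype=float)
    horiz = np.linalg.norm(pts[:, :2], axis=1)        # horizontal distance
    mask = (pts[:, 2] > ground_z) & (horiz < max_range)
    return pts[mask]

# Usage: one ground return, one nearby box, one return beyond range.
cloud = [[1.0, 0.0, 0.0],    # ground-level return -> ignored
         [2.0, 0.0, 0.5],    # raised return within range -> obstacle
         [20.0, 0.0, 1.0]]   # beyond max_range -> ignored
obstacles = detect_obstacles(cloud)
```

Real pipelines fit the ground plane (e.g. with RANSAC) rather than assuming a fixed height, and cluster the surviving points into discrete obstacles for the planner.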

Edge-Based Sensor Fusion for Reliable Environment Perception

Technology Benefits

The main impacts of Edge-Based Sensor Fusion are:

  • Accurate Positioning Indoors and Outdoors
    The platform can be used indoors and outdoors for accurate positioning and visual cognition based on multi-sensory inputs.

  • Flexible Support of Sensors
    Various sensors are supported, such as 3D LIDAR models ranging from low to high channel resolution.