Survey of Datafusion Techniques for Laser and Vision Based Sensor Integration for Autonomous Navigation

dc.contributor.author: Kolar, Prasanna
dc.contributor.author: Benavidez, Patrick
dc.contributor.author: Jamshidi, Mo
dc.date.accessioned: 2021-04-19T15:19:55Z
dc.date.available: 2021-04-19T15:19:55Z
dc.date.issued: 2020-04-12
dc.date.updated: 2021-04-19T15:19:55Z
dc.description.abstract: This paper focuses on data fusion, which is fundamental to perception, one of the most important modules in any autonomous system. Over the past decade there has been a surge in the use of smart/autonomous mobility systems. Such systems serve many areas of life, such as safe mobility for the disabled and for senior citizens, and they depend on accurate sensor information to function optimally. This information may come from a single sensor or from a suite of sensors of the same or different modalities. We review various types of sensors and their data, and the need to fuse those data to produce the best input for the task at hand, which in this case is autonomous navigation. Obtaining such accurate data requires sound techniques to read the sensor data, process them, eliminate or at least reduce the noise, and then use them for the required tasks. We present a survey of current data processing techniques that implement data fusion across different sensors: LiDAR, which uses light-scan technology, and stereo/depth, monocular RGB, and time-of-flight (ToF) cameras, which use optical technology. We also review the efficiency of using fused data from multiple sensors, rather than a single sensor, in autonomous navigation tasks such as mapping, obstacle detection and avoidance, and localization. This survey provides sensor information to researchers who intend to accomplish motion control of a robot and details the use of LiDAR and cameras for robot navigation.
dc.description.department: Electrical and Computer Engineering
dc.identifier: doi: 10.3390/s20082180
dc.identifier.citation: Sensors 20 (8): 2180 (2020)
dc.identifier.uri: https://hdl.handle.net/20.500.12588/489
dc.rights: Attribution 4.0 International (CC BY 4.0)
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/
dc.subject: datafusion
dc.subject: data fusion
dc.subject: multimodal
dc.subject: fusion
dc.subject: information fusion
dc.subject: survey
dc.subject: review
dc.subject: RGB
dc.subject: SLAM
dc.subject: localization
dc.subject: obstacle detection
dc.subject: obstacle avoidance
dc.subject: navigation
dc.subject: deep learning
dc.subject: neural networks
dc.subject: mapping
dc.subject: LiDAR
dc.subject: optical
dc.subject: vision
dc.subject: stereo vision
dc.subject: autonomous systems
dc.subject: data integration
dc.subject: data alignment
dc.subject: robot
dc.subject: mobile robot
dc.title: Survey of Datafusion Techniques for Laser and Vision Based Sensor Integration for Autonomous Navigation
dc.type: Article
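
As a concrete illustration of the fusion idea described in the abstract above (this sketch is not from the paper itself), the following Python snippet fuses a LiDAR range reading with a noisier stereo-camera depth estimate by inverse-variance weighting, the static special case of a Kalman filter update. The sensor names and noise figures are assumptions chosen for the example.

    def fuse(z1, var1, z2, var2):
        """Minimum-variance fusion of two noisy measurements of one quantity.

        Returns the fused estimate and its variance. Weights are the
        inverse variances, so the more precise sensor dominates.
        """
        w1 = 1.0 / var1
        w2 = 1.0 / var2
        fused = (w1 * z1 + w2 * z2) / (w1 + w2)
        return fused, 1.0 / (w1 + w2)

    if __name__ == "__main__":
        # Hypothetical readings: LiDAR is precise, stereo depth is noisier.
        lidar_range, lidar_var = 4.02, 0.01    # metres, variance in m^2
        stereo_depth, stereo_var = 4.30, 0.25
        d, v = fuse(lidar_range, lidar_var, stereo_depth, stereo_var)
        print(f"fused depth = {d:.3f} m (variance {v:.4f} m^2)")

The fused variance here (about 0.0096 m^2) is lower than that of either input, which is the basic argument the survey makes for fusing multiple sensors rather than relying on one.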

Files

Original bundle

Name: sensors-20-02180-v2.pdf
Size: 4.56 MB
Format: Adobe Portable Document Format

License bundle

Name: license.txt
Size: 0 B
Format: Item-specific license agreed upon to submission