
Finding Timestamp Offsets for a Multi-Sensor
System Using Sensor Observations*
Raphael Voges, Christian S. Wieghardt, and Bernardo Wagner
Abstract
Multi-sensor systems are widely used for robotics applications. While additional sensors can increase the accuracy and robustness of the solution, they must be synchronized for the results to be reliable. For our multi-sensor system, which consists of an actuated laser scanner, its motor, and a camera, we assume that the timestamps are only delayed by a constant offset. We propose two different approaches to calculate the timestamp offset between the laser scanner and the motor, one of which is additionally capable of determining the timestamp offset between the laser scanner and the camera. Both approaches use parts of a SLAM algorithm but apply different criteria to find an appropriate solution. Our experiments show that we are able to determine timestamp offsets with reasonable accuracy. Furthermore, they demonstrate the significance of proper synchronization for a multi-sensor system.
Introduction
For many applications, a sensor system consisting of a laser scanner and a camera is used to solve the SLAM (Simultaneous Localization And Mapping) problem (Droeschel et al., 2014; J. Zhang and Singh, 2015). While in the past 2D laser scanners were sufficient for the navigation of mobile robots in planar environments, recent SLAM approaches deal with 3D data to avoid obstacles at all heights and simultaneously acquire a dense 3D point cloud of the environment (Nuechter et al., 2007; Bosse and Zlot, 2009; Bosse, Zlot, and Flick, 2012; J. Zhang and Singh, 2014). However, 3D laser scanners that provide high resolution and long ranges are expensive. Therefore, cheaper 2D laser scanners, which are usually only capable of acquiring scan points in a plane, are actuated by a servo drive to gather 3D data (Wulf and Wagner, 2003).
To transform the measurement points into three-dimensional space, the appropriate encoder values of the servo drive must be known for every set of scan points. There are two different approaches to determine these encoder values. The first and simplest possibility is to use a motor that stops at discrete steps to let the laser scanner capture measurement points (Mandow et al., 2010). This solution, however, leads to a lower data rate, since the laser scanner and the motor have to wait for each other before performing a measurement or a rotation. Another possibility is to continuously monitor and control the motion of the motor while acquiring measurement points at a high scan frequency (Wulf and Wagner, 2003; Yoshida et al., 2010).
However, this can lead to a constant offset between the timestamps of the laser scanner and the motor due to the latency and transmission lags of sensors and computers. Therefore, it is essential to achieve proper synchronization between the timestamps of the laser scanner and its rotating motor, as already noted by Hebert and Krotkov (1992). Without synchronization, the offset in the encoder values assigned to each set of scan points can lead to large distortion in the point cloud constructed by a SLAM approach (Wulf and Wagner, 2003).
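
To make the role of the encoder values concrete, the following minimal sketch (ours, not the authors' implementation) lifts a 2D scan point into 3D using a motor angle interpolated at the laser timestamp; the rotation about the x-axis and all names are illustrative assumptions. A wrong timestamp offset selects the wrong motor angle and thereby distorts the resulting point cloud.

import numpy as np

def motor_angle_at(t, motor_times, motor_angles):
    # Linearly interpolate the encoder angle at laser timestamp t.
    return np.interp(t, motor_times, motor_angles)

def scan_point_to_3d(r, beta, t_laser, motor_times, motor_angles, offset=0.0):
    # 2D scan point (range r, bearing beta) in the scan plane.
    p = np.array([r * np.cos(beta), r * np.sin(beta), 0.0])
    # Motor angle at the (offset-corrected) measurement time; 'offset'
    # is the unknown laser-to-motor timestamp offset to be estimated.
    alpha = motor_angle_at(t_laser + offset, motor_times, motor_angles)
    # Rotate the scan plane about the actuation axis (assumed x-axis here).
    rot = np.array([[1.0, 0.0, 0.0],
                    [0.0, np.cos(alpha), -np.sin(alpha)],
                    [0.0, np.sin(alpha), np.cos(alpha)]])
    return rot @ p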
Furthermore, if the images taken by a camera are fused with the measurements of the laser scanner, proper synchronization between these two sensors is indispensable as well. Otherwise, point clouds and images that have been assigned the same timestamp do not correspond to the same actual moment in time, and thus cannot be fused to generate a consistent representation of the environment.
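
As an illustration of this pairing step (a hypothetical sketch; the function and its parameters are not taken from the paper), scans and camera frames can only be matched by timestamp once the inter-sensor offset is known:

def match_frame(scan_time, frame_times, delta_lc=0.0):
    # Return the index of the camera frame closest to the scan time
    # after correcting for the laser-to-camera offset delta_lc.
    corrected = scan_time + delta_lc
    return min(range(len(frame_times)),
               key=lambda i: abs(frame_times[i] - corrected))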
Thus, our aim is to correct the distortion in point clouds acquired by fusing the measurements of a rotating laser scanner and a camera, which arises from erroneous timestamp offsets between the three devices, namely the actuated laser scanner, its corresponding motor, and the camera. For this purpose, we assume that the timestamp offsets are constant throughout our measurement periods.
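
Written out (with notation that is ours rather than the paper's), the constant-offset assumption relates the clocks of the three devices by

% Constant-offset timestamp model; t_l, t_m, t_c denote the laser,
% motor, and camera timestamps of the same physical event.
\begin{align}
  t_m &= t_l + \Delta t_{lm},\\
  t_c &= t_l + \Delta t_{lc},
\end{align}

where the offsets \Delta t_{lm} and \Delta t_{lc} are assumed constant over the measurement period and are the quantities to be estimated.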
We present two different approaches to determine the timestamp offsets. The first, stationary approach can be used to synchronize the laser scanner and the motor before the system is employed in an online algorithm that requires correctly transformed 3D data. For this, the system must not be moved for a short period of time while the calibration finishes. The second, motion-based approach determines the offsets between laser scanner, motor, and camera after the acquisition of a large dataset. This makes it possible to use the dataset for offline computations even though the initial synchronization of the multi-sensor system was not optimal. To verify the synchronization results between laser scanner and motor, we compare the offsets computed by both approaches.
Related work in the field of synchronizing actuated laser scanners is presented by Morales et al. (2011). The authors describe the design and development of a mechanical system that is used to rotate the laser scanner, along with a motion controller that is responsible for the synchronization between the mechanical system and the laser scanner. In contrast, our method focuses solely on time synchronization and can thus be applied to arbitrary motor and laser scanner combinations.
Similar to our approach, Sheehan et al. (2010) attempt to design a 3D laser scanner that is capable of automatic self-calibration. Their system consists of an arbitrary number of 2D laser scanners mounted on a rotating plate. In addition to other extrinsic parameters, they also estimate the clock skews between their devices. For this,
Leibniz University of Hannover, Institute of Systems Engineering, Real Time Systems Group.
*This paper is an extended version of work published in Voges et al. (2017). We extend our previous work by incorporating a camera as an additional sensor whose timestamp offset to the laser scanner needs to be determined.
Photogrammetric Engineering & Remote Sensing
Vol. 84, No. 6, June 2018, pp. 357–366.
0099-1112/18/357–366
© 2018 American Society for Photogrammetry and Remote Sensing
doi: 10.14358/PERS.84.6.357