Multi-UAV Surveillance over Forested Regions
Vengatesan Govindaraju, Gerard Leng, and Zhang Qian
Abstract
Small Unmanned Aerial Vehicles (S-UAVs) have emerged as low-cost alternatives for aerial surveillance over forests. However, they provide limited coverage owing to their low altitudes and short endurance. Quick and effective surveillance therefore requires optimal flight paths that maximize ground visibility. Although the occlusion of ground points by vegetation is significant in forests, it is generally neglected. This paper proposes a probabilistic sensing model that incorporates occlusions due to both terrain and vegetation in the visibility computations, and presents a two-step approach to determine near-optimal flight paths: (a) waypoints are strategically deployed to enhance visibility using centroidal Voronoi tessellation, and (b) flyable paths are designed using a clustered spiral-alternating algorithm. Simulation studies conducted on synthetic terrains and on a terrain reconstructed from satellite tree-cover data and a Digital Elevation Model (DEM) show the effectiveness of the proposed method in improving terrain visibility compared to commonly used grid-based waypoints.
Introduction
Recent advancements in artificial intelligence and mechatronics have boosted the application of autonomous Small Unmanned Aerial Vehicles (S-UAVs) to relieve humans from dull, dirty, and dangerous tasks. S-UAVs are a class of unmanned aerial vehicles that are small enough for easy field deployment/recovery and transport (Watts et al. 2012). They were used predominantly in the military for surveillance and reconnaissance purposes. There has been growing interest in the use of S-UAVs in civilian applications as well, due to their low operational costs and their inherent ability to reach areas that are inaccessible to humans or ground robots. Forests, for example, with their highly random and dense obstructions, are an important operating domain for S-UAVs in a wide variety of applications such as environmental monitoring, wildlife tracking, anti-poaching operations, forest-fire monitoring, and search-and-rescue (Nonami et al. 2010). Existing surveillance methods, such as remote sensing from satellites or manned aircraft, do not provide the desired spatial and temporal resolution and are also not cost-effective. Autonomous S-UAVs can provide real-time, high-resolution images of forested regions and do not need skilled operators (Koh and Wich, 2012). However, they are generally electric-powered with a short endurance of 0.5 to ~2 hours, low flying speeds, and low flying altitudes (<200 m) (Gundlach, 2012). These drawbacks imply that S-UAVs can cover only a limited area in a single flight. Therefore, effective surveillance over large forest areas requires maximizing coverage by optimizing the imaging path for the S-UAV.
To optimize the imaging path for effective surveillance, the factors affecting coverage must be studied. The term "coverage" here means the line-of-sight visibility of the ground points as seen from the S-UAV's sensor. Since S-UAVs cannot support heavy payloads, lightweight electro-optical sensors, such as visible or near-infrared cameras, are commonly used. These sensors have a limited field of view and rely on direct line-of-sight visibility with the target to record an observation. In a forested environment, the field of view of these on-board sensors can be occluded by two types of obstructions: "complete obstructions" due to terrain features and "partial obstructions" due to vegetation. Current methods for visibility computation consider only complete obstructions such as terrain undulations or buildings. They either completely omit or overestimate the partial obstruction due to vegetation, to avoid the difficulty of incorporating discrete and stochastic variables in the visibility computations (Bartie et al. 2011). However, in a forested environment, vegetation is the major obstruction and cannot be neglected in the visibility computations, especially for low-flying S-UAVs. This paper presents a probabilistic sensor model that incorporates both complete and partial obstructions in the visibility computations from the UAV to maximize the ground visibility.
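The distinction between complete and partial obstructions can be sketched as follows. This is an illustrative simplification, not the paper's exact sensor model: terrain above the sight line blocks visibility entirely, while foliage attenuates it Beer-Lambert style; the canopy height `canopy_h` and attenuation coefficient `k` are assumed placeholder values.

```python
import numpy as np

def los_visibility(dem, canopy, uav_xy, uav_alt, tgt_xy,
                   canopy_h=20.0, k=2.0, n=50):
    """Probability that the ground cell tgt_xy is visible from a UAV
    at grid position uav_xy and altitude uav_alt.

    - Terrain rising above the sight line blocks visibility completely.
    - Where the sight line passes through the foliage layer (between
      the ground and ground + canopy_h), visibility is attenuated by
      the local canopy density (a value in [0, 1]).
    canopy_h and k are illustrative assumptions, not calibrated values.
    """
    (x0, y0), (x1, y1) = uav_xy, tgt_xy
    z0, z1 = uav_alt, dem[y1, x1]
    p = 1.0
    for t in np.linspace(0.0, 1.0, n)[1:-1]:        # interior samples only
        x = int(round(x0 + t * (x1 - x0)))
        y = int(round(y0 + t * (y1 - y0)))
        z = z0 + t * (z1 - z0)                      # height of the sight line
        if dem[y, x] > z:
            return 0.0                              # complete obstruction
        if z < dem[y, x] + canopy_h:                # inside foliage layer
            p *= np.exp(-k * canopy[y, x] / n)      # partial obstruction
    return p
```

On terrain with no foliage the function returns 1.0; a single cell rising above the sight line drives it to 0.0; dense canopy along the final low-altitude segment of the ray yields an intermediate probability.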
To improve the effectiveness of the surveillance, a near-optimal imaging path is designed in two steps: first, the waypoints (which are also the observation points) are strategically deployed using the method of centroidal Voronoi tessellations; second, a flyable path along the waypoints, which obeys the kinematic constraints of the S-UAV, is designed based on the improved Clustered Spiral-Alternating algorithm. Simulations are run both on synthetically generated terrains and on a terrain reconstructed from actual satellite data, and compare the resulting visibility of the proposed approach with that of the commonly used grid-based distribution of waypoints. This paper is organized as follows: the relevant literature is reviewed first, followed by a description of the proposed approaches: the visibility decay, the probabilistic sensor model, waypoint determination using centroidal Voronoi tessellation, and path planning using the Clustered Spiral-Alternating algorithm. The next section presents the simulation results, compares the visibility values of the proposed approach with those of the grid-based approach, and is followed by the conclusions and suggested future work.
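The first of the two steps above, deploying waypoints via a centroidal Voronoi tessellation, is commonly computed with Lloyd's algorithm. The sketch below is a generic Monte-Carlo Lloyd iteration, not the paper's implementation; the `density` function is a user-supplied stand-in for whatever importance weighting (e.g., expected visibility gain) drives waypoint placement.

```python
import numpy as np

def lloyd_cvt(points, width, height, density, iters=30, samples=20000, seed=0):
    """Relax waypoint positions toward a centroidal Voronoi tessellation.

    Each iteration samples the rectangular region, assigns every sample
    to its nearest waypoint (its Voronoi cell), and moves each waypoint
    to the density-weighted centroid of its samples.
    """
    rng = np.random.default_rng(seed)
    pts = np.asarray(points, dtype=float).copy()
    for _ in range(iters):
        s = rng.uniform([0.0, 0.0], [width, height], size=(samples, 2))
        w = density(s)                                    # per-sample weight
        # Squared distance from every sample to every waypoint.
        d2 = ((s[:, None, :] - pts[None, :, :]) ** 2).sum(axis=2)
        owner = d2.argmin(axis=1)                         # nearest waypoint
        for i in range(len(pts)):
            mask = owner == i
            if w[mask].sum() > 0:
                pts[i] = (s[mask] * w[mask, None]).sum(axis=0) / w[mask].sum()
    return pts
```

With a uniform density, an initially clustered set of waypoints spreads out until each one sits at the centroid of its own Voronoi cell, which is what gives CVT-based waypoints their even, coverage-friendly distribution compared to an arbitrary grid.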
Relevant Literature
The problem of finding a shortest path that also satisfies some visibility requirements has been well studied in the field of computational geometry under the name of “Watchman Routing
Mechanical Engineering Department, National University of Singapore, Block E1, 1 Engineering Drive 2, #02-01, Dynamics Lab, Singapore 117576.
Photogrammetric Engineering & Remote Sensing
Vol. 80, No. 12, December 2014, pp. 1129–1137.
0099-1112/14/8012–1129
© 2014 American Society for Photogrammetry and Remote Sensing