gamma rays to visible light photons. A photomultiplier amplifies the flashes in the crystal into voltage pulses, which are supplied to an electronic device that converts the analog signal into spectral information, finally transferred to a ground station computer. Several processing techniques are also proposed for gathering measurements and mapping the radiation effects.
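The chain described above (scintillation flashes converted to voltage pulses, then digitized and binned into channels to form a spectrum) can be illustrated with a minimal sketch. All pulse amplitudes and the linear calibration constants below are hypothetical, not taken from the cited systems:

```python
# Sketch of the pulse-height-to-spectrum step of a scintillation detector chain.
# Pulse amplitudes (volts) are assumed proportional to deposited gamma energy;
# a simple linear energy calibration maps channel number to keV.

def build_spectrum(pulse_amplitudes, n_channels=1024, v_max=10.0):
    """Histogram analog pulse amplitudes into spectral channels."""
    spectrum = [0] * n_channels
    for v in pulse_amplitudes:
        channel = min(int(v / v_max * n_channels), n_channels - 1)
        spectrum[channel] += 1
    return spectrum

def channel_to_kev(channel, gain=2.0, offset=0.0):
    """Hypothetical linear energy calibration: E = gain * channel + offset."""
    return gain * channel + offset

pulses = [3.31, 3.31, 3.31, 3.31, 1.18]       # simulated pulse heights (V)
spectrum = build_spectrum(pulses)
peak = max(range(len(spectrum)), key=spectrum.__getitem__)
print(peak, channel_to_kev(peak))             # peak channel and its energy
```

In a real instrument the histogramming is done by a multichannel analyzer, and the calibration is obtained from sources with known emission lines.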
A Ranger fixed-wing unmanned aerial vehicle was designed in Kurvinen et al. (2005) and equipped with three different radiation detector types to locate plumes with different activities and also to avoid saturation in some detectors, namely: (a) a dose rate meter, GM; (b) a scintillation detector, NaI(Tl); and (c) a compound semiconductor detector, CdZnTe. These sensors, together with a visual camera, were conveniently encapsulated and installed on a UAV. Pöllänen et al. (2009) used a commercial CsI detector able to detect ¹³⁷Cs and ¹³¹I radioactive particles during flights on board UAVs.
MacFarlane et al. (2014) developed a new instrument to provide rapid and high spatial resolution assessment of radionuclide contamination. The full system consists of an unmanned hexa-copter equipped with a gamma ray spectrometer, a microcontroller, GPS, and lidar. The goal is to rapidly and remotely
detect ground-based radiation anomalies with a high spatial
resolution. Source samples used within this study were speci-
mens collected from the Cornubian batholith, Southwest UK.
Magnetic Sensors
A high-resolution 3-axis magnetic sensor has been mounted
on a helicopter to generate detailed magnetic maps and to
identify various ferrous objects in the soil in the work of Eck
and Imbach (2011). The helicopter has a main rotor diameter of 3.20 m and a payload weight of approximately 30 kg, including 10 l.
Multisensor Technologies
Generally speaking, UAVs are equipped with various sensors for both navigation and detection. Here we address multisensor technologies from the point of view of remote sensing applications. The MAVIS (Massive Airspace Volume Instrumentation System) project specifically addresses the design of multisensor technologies (Sobester, 2011 and 2014).
Vierling et al. (2006) designed a multisensor system on board a tethered balloon with the following equipment: a dual-channel spectro-radiometer covering wavelengths from 350 nm to 1050 nm, an RGB micro-video camera, a thermal infrared sensor sensitive to the spectral range of 7.6 μm to 18 μm, a GPS receiver, tilt sensors, an analog compass sensor, a wireless video transmission device, and meteorological sensors for measuring relative humidity, temperature, barometric pressure, and wind speed. Several areas of application have been identified, including canopy vegetation analysis, atmospheric data collection, trace gas flux measurements, and aquatic remote sensing, among others.
Martínez-de-Dios et al. (2007) used a fleet of three UAVs in cooperation for fire detection. One UAV (Helivision-GRVC) is equipped with infrared and visual cameras; Figure 4b displays details of this system. The second UAV (Marvin) is equipped with an ultraviolet flame detector, provided by Hamamatsu. This device is based on the photoelectric effect of metal and gas multiplication and operates in the wavelength range of 185 nm to 260 nm. Finally, the third UAV (Karma) carries a stereoscopic system with two visual cameras for 3D mapping.
Multisensor systems for disaster monitoring and management are designed in Choi et al. (2009) and Choi and Lee (2011). In the latter, a rotary-wing UAV is equipped with two digital cameras (470 g and 115 g), a laser scanner (7 kg), GPS (75 g), an IMU (3.4 kg), and a communication system based on an RF link with a ground control station. A computer, a gimbal, and a power supply are carried on board for data processing, compression, and transmission.
Zhou and Reichle (2010) proposed a mathematical model for multi-sensor data integration consisting of a video stream, GPS, a three-axis magneto-inductive magnetometer, and a high-performance two-axis tilt sensor (inclinometer), for both photogrammetry tasks and navigation.
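As an illustration of why a magnetometer is paired with a tilt sensor in such systems, the classic tilt-compensated heading computation can be sketched as follows. The sensor values are hypothetical, the axis/sign conventions are one common choice among several, and this is not a reproduction of the cited model:

```python
import math

def tilt_compensated_heading(mag, roll, pitch):
    """Heading (deg) from a 3-axis magnetometer corrected by roll/pitch (rad).

    mag: (mx, my, mz) magnetic field in the sensor frame.
    The inclinometer supplies roll and pitch, so the horizontal field
    components can be recovered even when the platform is not level.
    """
    mx, my, mz = mag
    # Rotate the measured field vector back into the horizontal plane.
    xh = mx * math.cos(pitch) + mz * math.sin(pitch)
    yh = (mx * math.sin(roll) * math.sin(pitch)
          + my * math.cos(roll)
          - mz * math.sin(roll) * math.cos(pitch))
    return math.degrees(math.atan2(yh, xh)) % 360.0

# Level platform with the horizontal field along +x: heading 0 degrees.
print(tilt_compensated_heading((0.2, 0.0, 0.4), roll=0.0, pitch=0.0))
```

Without the tilt correction, any roll or pitch of the airframe leaks the vertical field component into the horizontal reading and biases the heading.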
Smartphones can themselves be considered multisensor devices, and they have been proposed for photogrammetry applications on board UAVs (Yun et al., 2012; Kim et al., 2013). Smartphones can operate over broad areas under 3G telecommunication networks, and they are equipped with MEMS sensors, including an accelerometer, a magnetometer, a gyroscope, and GPS, which allows image acquisition with the information required to generate photogrammetric products.
With the aim of obtaining directly georeferenced images captured from an octo-copter (1.5 kg of payload and 4.8 kg total weight), Rehak et al. (2013) integrated a consumer-grade digital camera, a geodetic-grade RTK-GPS/Glonass/Galileo multi-frequency receiver with a 10 Hz sampling frequency, and four MEMS-IMU chips with a Field Programmable Gate Array (FPGA).
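Direct georeferencing of this kind can be illustrated with a much simplified sketch: given the camera position from the GNSS receiver and the attitude from the IMU, the ground point imaged at a given pixel is obtained by intersecting the image ray with the terrain. The version below assumes a nadir-looking, distortion-free camera over flat ground, with only the yaw angle applied; all numbers are hypothetical:

```python
import math

def georeference_pixel(u, v, cam_xyz, yaw_rad, focal_px, ground_z=0.0):
    """Project pixel (u, v) (offsets from the principal point, in pixels)
    onto a flat ground plane, for a nadir-looking camera rotated by yaw.

    cam_xyz: camera position (easting, northing, height) from the GNSS
    receiver. Roll and pitch are assumed zero here for brevity; a full
    implementation would apply the complete IMU rotation matrix.
    """
    e, n, h = cam_xyz
    scale = (h - ground_z) / focal_px          # metres per pixel at the ground
    dx, dy = u * scale, v * scale              # offsets in the camera frame
    # Rotate the offsets by yaw into the map frame and add the camera position.
    east = e + dx * math.cos(yaw_rad) - dy * math.sin(yaw_rad)
    north = n + dx * math.sin(yaw_rad) + dy * math.cos(yaw_rad)
    return east, north

# Camera 100 m above ground, no yaw: a pixel 500 px right of centre with
# a 1000 px focal length maps 50 m east of the camera position.
print(georeference_pixel(500, 0, (1000.0, 2000.0, 100.0), 0.0, 1000.0))
```

The appeal of direct georeferencing is that no ground control points are needed, at the cost of requiring a high-quality GNSS/IMU solution such as the one described above.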
Data fusion is required to combine the information provided by different sensors. To this end, Jutzi et al. (2014) proposed a method that weights the data captured with both a visual camera and a lightweight line laser scanner for 3D map production.
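A common way to weight observations from heterogeneous sensors, such as a camera-derived and a laser-derived estimate of the same quantity, is inverse-variance weighting. The following is a generic sketch of that idea, not the specific method of Jutzi et al. (2014), and the depth values and variances are hypothetical:

```python
def fuse_inverse_variance(estimates):
    """Fuse scalar estimates given as (value, variance) pairs.

    Under the assumption of independent, unbiased sensors, weighting each
    estimate by the inverse of its variance minimises the fused variance,
    which is also returned.
    """
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    value = sum(w * x for w, (x, _) in zip(weights, estimates)) / total
    return value, 1.0 / total

# Hypothetical depth of one 3D point: camera says 10.2 m (var 0.04 m^2),
# laser scanner says 10.0 m (var 0.01 m^2); the fusion leans to the laser.
depth, var = fuse_inverse_variance([(10.2, 0.04), (10.0, 0.01)])
print(round(depth, 3), round(var, 4))
```

The more precise sensor dominates the result, while the noisier one still contributes and reduces the overall uncertainty.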
A multisensor system has been integrated on board a quadrotor in Roldán et al. (2015) to measure temperature, humidity, luminosity, and CO2 concentration in a greenhouse. The integration was carried out on a Raspberry Pi device because of its performance.
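Conceptually, such an on-board integration amounts to polling each sensor and packaging the readings into one time-stamped record. A minimal sketch follows; the reader functions are placeholders returning fixed example values, since the actual sensor drivers are not described in the source:

```python
import json
import time

# Placeholder readers: on the real platform each would query a hardware
# driver (e.g. over I2C or GPIO); here they return fixed example values.
def read_temperature():  return 24.5   # degrees C
def read_humidity():     return 61.0   # percent RH
def read_luminosity():   return 820.0  # lux
def read_co2():          return 450.0  # ppm

def sample_record():
    """Collect one time-stamped multisensor measurement."""
    return {
        "t": time.time(),
        "temperature_c": read_temperature(),
        "humidity_pct": read_humidity(),
        "luminosity_lux": read_luminosity(),
        "co2_ppm": read_co2(),
    }

record = sample_record()
print(json.dumps(record))   # one record, ready to log or transmit
```

Records serialized this way can be logged locally or streamed to a ground station over the telemetry link.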
UAVs in Collaboration, Coordination, and Cooperation
Collaboration, coordination, and cooperation are relevant concepts when several UAVs are programmed to achieve a remote sensing goal. Some years ago, Ollero and Maza (2007) noted that a multiple-UAV approach increases spatial coverage, improves reliability through redundancy, allows simultaneous intervention in different places, and makes possible the teaming of specialized UAVs. Cooperation of UAVs for different tasks has recently received special attention. Each vehicle is assigned a portion of the goal, and all collaborate to achieve the global goal with the highest possible performance and accuracy. This is an added value for remote sensing, where technologies and research must be united for effective action (Chao and Chen, 2012). Obviously, the systems included in this section are designed for specific missions and applications. Because of their special characteristics, they are included in this section, although some of them also appear later in the Applications section, under specific applications. Table 2 displays different collaboration, coordination, and cooperation strategies applied to remote sensing missions, together with applications where cooperation proves efficient.
Distributed system architectures and control-based strategies are critical for effective performance in UAV formations, networking, and collaboration (Maza et al., 2010; Richert and Cortés, 2013). Maza et al. (2011) proposed a multi-UAV distributed architecture where each vehicle is in charge of a task or set of tasks. This architecture has been validated in several applications, including surveillance, wireless sensor distribution, and fire detection and extinguishing.
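The task-per-vehicle idea behind such architectures can be illustrated with a simple greedy allocation sketch, in which each task is assigned to the nearest still-available UAV. The positions and task names are hypothetical, and the actual distributed negotiation in Maza et al. (2011) is considerably more elaborate:

```python
import math

def greedy_assign(uav_positions, task_positions):
    """Assign each task to the closest unassigned UAV (one task per UAV)."""
    free = dict(uav_positions)                 # uav_id -> (x, y), still unassigned
    assignment = {}
    for task_id, (tx, ty) in task_positions.items():
        best = min(free, key=lambda u: math.hypot(free[u][0] - tx,
                                                  free[u][1] - ty))
        assignment[task_id] = best
        del free[best]                         # each UAV takes exactly one task
    return assignment

uavs = {"uav1": (0.0, 0.0), "uav2": (100.0, 0.0), "uav3": (0.0, 100.0)}
tasks = {"survey": (90.0, 10.0), "detect": (5.0, 95.0), "relay": (10.0, 5.0)}
print(greedy_assign(uavs, tasks))
```

Greedy assignment is cheap and decentralizable, but unlike an optimal assignment it depends on the order in which tasks are processed; market-based or auction protocols are common refinements.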
In most cases, early intervention is crucial, particularly in emergencies; cooperation and collaboration can be the solution. Cooperative control strategies for detection and tracking assign each vehicle an area, element, or specific task. This approach is described in Pack et al. (2009) for tracking ground mobile units that intermittently emit radio frequency signals. Tracking means that they must fly in coordination to
PHOTOGRAMMETRIC ENGINEERING & REMOTE SENSING
April 2015
291