
PHOTOGRAMMETRIC ENGINEERING & REMOTE SENSING
October 2016
757
SECTORINSIGHT:.com
Education and Professional Development in the Geospatial Information Science and Technology Community
Our firm bought another aerial camera the other day. It cost $1,500, weighs as much as an iPad, and uses a 12-megapixel CMOS sensor to produce 4K imagery. Its tiny plastic lens is similar to the one on your cellphone camera. It flies itself!
Eight years ago we purchased two large-format CCD cameras for $2.4M. Each weighs as much as a small woman; the glass compound lens alone weighs 50 pounds. These cameras don't fly themselves: they must be strapped to a $200,000 manned aircraft and flown by a certified pilot.
We will produce amazing map products with both systems.
In less than a decade, computing, sensing, and mapping technologies have produced these astounding mapping systems, which will enable virtually anyone to send sensors into the skies and make maps. The drone mapping revolution has opened flying to non-specialists. Anyone can perform remote sensing and mapping that previously was possible only with specialized knowledge and gear. Taken together, this explosion of remote sensing and mapping technology will ignite a wave of new and beneficial applications. Most have yet to be discovered. Millions of people will benefit.
But could there be any difference in the quality and accuracy
of map products produced from these two camera systems?
The answer to that question is: “It depends on the application.”
Where's the Ground?
By Mike Tully, MS, MBA, GISP

Common to all remote sensing systems and mapping applications is the age-old question: "Where's the ground?" Much of what follows the remote sensing of things is determining where the ground is. The sciences of photogrammetry and geodesy have been used to find the ground since the first cameras sprouted wings. Although drones are still in
their infancy, they already possess incredible power and sophistication to fly, take pictures, and map the earth. The bundled mapping software is equally sophisticated. It seems the skill and knowledge of photogrammetrists have been reduced to an algorithm and written to a chip. This enables non-specialists to acquire imagery and produce orthophotography and 3D models. But one of the chief values of the photogrammetrist was understanding the underlying principles of remote sensing, surveying, and mapping, and properly applying them in a given application with a specific tool to produce the desired product. Determining the exact position of the ground is chief among these skills.
But can these applications tell you where the ground is? How
confident are you of the position of things in your orthos and
3D models? How is the accuracy of ground position measured?
Finding the ground is a big problem for drone systems.
The typical software bundled with drones enables "single-button" orthorectification of imagery. This is achieved by first calculating a digital surface model (DSM) from the pixels in each set of overlapping images. That is, a position (X, Y) and elevation (Z) are calculated for each matched pixel and assigned to a point floating in space. A 3D "point cloud" is then assembled from every pixel. The aerial imagery is then mathematically "draped" over this DSM to create a 3D model of all features pictured. But where's the ground?
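The gridding step that software hides can be sketched in a few lines. This is a minimal, hypothetical NumPy illustration (not any vendor's actual algorithm), assuming dense matching has already produced one X, Y, Z point per pixel. Keeping the highest point in each grid cell yields a surface model; keeping the lowest is a crude first step toward a ground model, and only works where the ground is actually visible:

```python
import numpy as np

def rasterize(points, cell_size, keep="max"):
    """Grid a 3D point cloud into an elevation raster.

    points: (N, 3) array of X, Y, Z coordinates, one per matched pixel.
    keep="max" retains the highest Z per cell (rooftops, treetops): a DSM.
    keep="min" retains the lowest Z per cell: a crude ground estimate.
    """
    xy_min = points[:, :2].min(axis=0)
    cols, rows = ((points[:, :2] - xy_min) // cell_size).astype(int).T
    if keep == "max":
        grid = np.full((rows.max() + 1, cols.max() + 1), -np.inf)
        np.maximum.at(grid, (rows, cols), points[:, 2])
    else:
        grid = np.full((rows.max() + 1, cols.max() + 1), np.inf)
        np.minimum.at(grid, (rows, cols), points[:, 2])
    return grid

# Toy cloud: flat ground at 100 m, with one rooftop return at 110 m
# landing in the same cell as a ground return.
pts = np.array([
    [0.5, 0.5, 100.0],
    [1.5, 0.5, 100.0],
    [0.5, 1.5, 100.0],
    [1.5, 1.5, 110.0],  # rooftop
    [1.6, 1.6, 100.0],  # ground, same cell as the rooftop
])
dsm = rasterize(pts, cell_size=1.0, keep="max")
dem = rasterize(pts, cell_size=1.0, keep="min")
print(dsm[1, 1] - dem[1, 1])  # 10.0 -- the surface is not the ground
```

Where no ground return exists in a cell (dense canopy, a wide roof), the minimum is still an off-ground elevation, which is exactly the problem the column describes; real ground filtering is far more involved than this sketch.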
The ground may or may not be visible in the imagery. The picture is taken from above, so we can see only the tops (and maybe some sides) of things. If the ground is obscured by features like vegetation and buildings, it is not pictured. Hence, the DSM will model the "surface" of things, NOT the ground. That's why it is called a digital "surface" model as opposed to a digital "elevation" model (DEM). A
Photogrammetric Engineering & Remote Sensing
Vol. 82, No. 10, October 2016, pp. 757–758.
0099-1112/16/757–758
© 2016 American Society for Photogrammetry
and Remote Sensing
doi: 10.14358/PERS.82.10.757