<?xml version='1.0' encoding='utf-8'?>
<oai_dc:dc xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.openarchives.org/OAI/2.0/oai_dc/ http://www.openarchives.org/OAI/2.0/oai_dc.xsd">
  <dc:contributor>Jonathon Donager</dc:contributor>
  <dc:contributor>Jason L. McVay</dc:contributor>
  <dc:contributor>Joel B. Sankey</dc:contributor>
  <dc:creator>Temuulen T. Sankey</dc:creator>
  <dc:date>2017</dc:date>
  <dc:description>Forest vegetation classification and structure measurements are fundamental steps for planning, monitoring, and evaluating large-scale forest changes, including restoration treatments. High spatial and spectral resolution remote sensing data are critically needed to classify vegetation and measure its 3-dimensional (3D) canopy structure at the level of individual species. Here we test high-resolution lidar, hyperspectral, and multispectral data collected from unmanned aerial vehicles (UAV) and demonstrate a lidar-hyperspectral image fusion method in treated and control forests with varying tree density and canopy cover, as well as in an ecotone environment, to represent a gradient of vegetation and topography in northern Arizona, U.S.A. The fusion performs better (88% overall accuracy) than either data type alone, particularly for species with similar spectral signatures but different canopy sizes. The lidar data provide estimates of individual tree height (R² = 0.90; RMSE = 2.3 m) and crown diameter (R² = 0.72; RMSE = 0.71 m), as well as total tree canopy cover (R² = 0.87; RMSE = 9.5%) and tree density (R² = 0.77; RMSE = 0.69 trees/cell) in 10 m cells across thin-only, burn-only, thin-and-burn, and control treatments, where tree cover and density ranged between 22 and 50% and 1–3.5 trees/cell, respectively. The lidar data also produce a highly accurate digital elevation model (DEM) (R² = 0.92; RMSE = 0.75 m). In comparison, 3D data derived from the multispectral data via structure-from-motion produced lower correlations with field-measured variables, especially in dense and structurally complex forests. The lidar, hyperspectral, and multispectral sensors and the methods demonstrated here can be widely applied across a gradient of vegetation and topography for monitoring landscapes undergoing large-scale changes, such as the forests in the southwestern U.S.A.</dc:description>
  <dc:format>application/pdf</dc:format>
  <dc:identifier>10.1016/j.rse.2017.04.007</dc:identifier>
  <dc:language>en</dc:language>
  <dc:publisher>Elsevier</dc:publisher>
  <dc:title>UAV lidar and hyperspectral fusion for forest monitoring in the southwestern USA</dc:title>
  <dc:type>article</dc:type>
</oai_dc:dc>