Proceedings of the U.S. Geological Survey Sixth Biennial Geographic Information Science Workshop, April 24-28, 2006
Scientific Investigations Report 2006-5094
P. Lynn Scarlett, Acting Secretary
P. Patrick Leahy, Acting Director
Edited by John W. Brakebill, Jennifer B. Sieverling, and Peter G. Chirico
For more information about the USGS and its products:
Telephone 1-888-ASK-USGS
World Wide Web http://www.usgs.gov/
Copies of this report can be accessed at URL:
https://pubs.water.usgs.gov/sir2006-5094/
The U.S. Geological Survey (USGS) publishes the articles herein as a service to all interested parties, including the public, but expressly disclaims responsibility for article contents expressed by those not employed by the USGS. The USGS makes no warranty whatsoever with respect to those contents. Furthermore, the contents of these articles express the views and opinions of the authors and do not necessarily reflect the views and opinions of the USGS or its employees.
The use of trade, product, or firm names in this report is for descriptive purposes only and does not imply endorsement by the U.S. Government.
CONTENTS
Participating Organizations
USGS GIS 2006 Planning Committee Members
Table 1. USGS GIS 2006 Schedule, Monday
Table 2. USGS GIS 2006 Schedule, Tuesday
Table 3. USGS GIS 2006 Schedule, Wednesday
Table 4. USGS GIS 2006 Schedule, Thursday
Table 5. USGS GIS 2006 Schedule, Friday
Presentation Titles and Abstracts, Monday, April 24, 2006
Monday Plenary, 8:00 am – 10:15 am
Extraterrestrial GIS at the USGS, By Trent M. Hare
Web GIS Lingua Franca: Open Source Geospatial Visualization, By Patrick Hogan
USGS Hazards Program, By Bill Werkheiser
ERDAS IMAGINE: Fundamentals with ArcGIS Image Analysis, 10:30 am – 3:25 pm, By Joe Mostowy
3-D Volumetric Analysis, 10:30 am – 3:25 pm, By Skip Pack
Hazards 1, 10:30 am – 3:25 pm, moderated by Catherine Costello
DHS FEMA: GIS for Situational Awareness, By Drew Douglas
Programming, Scripting, and Tools, 10:30 am - 3:25 pm, moderated by Roland Viger
GIS Tools for Area-Weighted Transfer: The NAWQA Area-Characterization Toolbox, By Curtis Price
Water Availability Screening Tool, By Scott Hoffman
ArcMap Tool for NWISWeb, By Steven K. Predmore
Utilizing Mobile Computing to Inventory Ground-Water Sites, By Steven K. Predmore and Tyler Johnson
WebGIS and Data Visualization, 10:30 am – 3:25 pm, moderated by Luke Blair and Jacqueline Fahsholtz
Visualizing spatial data with Google Earth, By Amar Nayegandhi, John C. Brock, and C. Wayne Wright
Solutions to Post-Earthquake Information Response and Visualization, By David Wald
Delivering Scientific Information on the Web Using Google Maps, By Gregory L. Gunther
A Virtual Tour of the 1906 Earthquake, By Luke Blair
Mapping Data Warehouses with a Web Browser, By Nathaniel L. Booth and Eric Everman
Realtime Earthquakes in Google Earth, By Scott Haefner
Implementing a database IMS - The Good, the Bad, and the Ugly, By Susan Rhea
Elevation 1 (LIDAR), 10:30 am – 3:25 pm, moderated by Silvia Terziotti
LIDAR 101 Video, By Jason Stoker
Some Challenges in Using LIDAR-Derived Data for Hydrologic Applications, By Silvia Terziotti
Deriving vegetation metrics using LIDAR, By Amar Nayegandhi, John C. Brock, and C. Wayne Wright
Monday GIS Town Hall Meeting, 4:00 pm – 8:30 pm, moderated by Jennifer B. Sieverling
STS-99, Radar Mapping the Earth in 3-D, By Captain Dominic L. Pudwill Gorie
Establishing a Center of Excellence for GIScience, By Stephen C. Guptill
Presentation Titles and Abstracts for Tuesday, April 25, 2006
Tuesday Plenary, 8:00 am – 11:45 am
Geospatial Line of Business and Geospatial Modernization Blueprint, By Karen Siderelis
Keynote Presentation: What You Need to Know, By Patrick Leahy
Protecting America’s Health using GIS, By Brian Kaplan
Advanced Image Processing, 1:00 pm – 2:55 pm, By Joe Mostowy
Using XTools Pro 3.1, 1:00 pm – 2:55 pm, By Andrei Elobogoev and Viatcheslav Ananev
Introduction to ArcGIS, 1:00 pm – 5:10 pm, By Andres Abeyta
Metadata in the Real World, 1:00 pm – 2:55 pm, By Sharon Shin and Peter Schweitzer
IMAGINE Spatial Modeling and Volumetric Measure Using VirtualGIS, 3:15 pm – 5:10 pm, By Joe Mostowy
Making Maps with ArcGIS, 3:15 pm – 5:10 pm, By Heather Paskevic
ArcGIS Spatial Analyst, 3:15 pm – 5:10 pm, By Steve Kopp
In the News: GIS and Public Health, 1:00 pm – 5:10 pm, moderated by Yvonne Baevsky
Mapping Naturally Occurring Asbestos using Imaging Spectroscopy, By Gregg Swayze, Ph.D.
Overview of Activities Linking USGS-NAWQA Data to Public Health, By Patty Toccalino Ph.D.
Spatial and Temporal Autocorrelation of Emerging Diseases, By F. Lee De Cola
ArcGIS Road Ahead, 1:00 pm – 5:10 pm, By Bart Killpack
CLICK: The New USGS Center for LIDAR Information Coordination & Knowledge, By Jordan Menig
USGS Digital Imagery Quality Assurance Plan, By Gregory Stensaas
Physical Terrain Modeling in a Digital Age, By Lawrence Faulkner
Enriching the Geospatial Web Experience, By Peter N. Schweitzer and Bruce R. Johnson
Guidelines on Releasing Geospatial Data, By Jennifer B. Sieverling and Gregory Allord
National Datasets, 1:00 pm – 5:10 pm, moderated by Catherine Costello
CENSUS Data, By Jim Castagneri
National Map Vector Dataset Development, By Paul Wiese
NASA LP DAAC and USGS EROS Data: What We Have and Where to Get It, By Roger Oleson and Jon Walkes
Progress and Status of the Watershed Boundary Dataset (WBD), By Michael T. Laitta and Karen Hasen
Tuesday Poster Session, 5:15 pm – 7:30 pm
Digital Data Atlas of the Fort Cobb Watershed, By Jason Masoner and Seth Tribbey
Presentation Titles and Abstracts for Wednesday, April 26, 2006
Wednesday Plenary, 8:00 am – 11:30 am
State/local Partnerships and the 50 States Initiative, By Gene Trobia
NBII's Geospatial Interoperability Framework: Making Standards Work!, By Donna Roy
Introducing the NHDPlus, By Alan Rea
Image Processing with ENVI, 1:00 pm - 2:55 pm, By Adam O’Connor
Implementing ArcGIS Server, 1:00 pm - 5:10 pm, By John Waterman
Geospatial One-Stop, 1:00 pm - 2:55 pm, By Robert Dollison and Jacque Fahsholtz
ArcPad 7.0, 1:00 pm – 2:55 pm, By Finn Dahl
Hyper-spectral Analysis with ENVI, 3:15 pm - 5:10 pm, By Adam O’Connor
Finding USGS Geospatial Data Online, 3:15 pm – 5:10 pm, By Joseph Kerski and Curtis Price
Geoprocessing in ArcGIS, 3:15 pm – 5:10 pm, By Corey Tucker and Steve Kopp
Modeling with ArcGIS, 1:00 pm – 2:55 pm, By Corey Tucker and Steve Kopp
GIS Partnerships and Education, 1:00 pm – 5:10 pm, moderated by Joseph Kerski
Development of Local Resolution National Hydrography Dataset in North Carolina, By Christopher Kannan, Silvia Terziotti, Steve Strader, and Chad Wagner
Project Homeland - Colorado Pilot, By Chuck Matthys
Educational Developments in GIScience, By Joseph J. Kerski
National Hydrography Dataset (NHD) Stewardship and Maintenance Program, By Paul Kimsey
Productivity Tools, 1:00 pm – 2:55 pm
Data East's Productivity Tools For GIS, By Andrei Elobogoev and Viatcheslav Ananev
Elevation 2, 1:00 pm – 2:55 pm, moderated by Pete Chirico
New, Weird, Wonderful, and the Kitchen Sink, 3:15 pm – 5:10 pm, moderated by Roland Viger
Open Source Software tools to create web-based GIS solutions, By Rafael Moreno
Filling in the DLG Gap: A Data Thesaurus Experiment, By Barbara P. Buttenfield
Hazards 2, 3:15 pm – 5:10 pm, moderated by Catherine Costello
HAZUS-MH, Multi-Hazard Loss Estimation Tool, By Doug Bausch
MARS: LIDAR Processing Software, 3:15 pm – 5:10 pm, By Bill Emison and Mark Romano
Presentation Titles and Abstracts for Thursday, April 27, 2006
Thursday Plenary, 8:00 am – 11:30 am
Land Remote Sensing Program, By Ron Beck
Colorado State University and NSF, By Melinda Laituri
Thursday Special Lunch Session, 11:30 am – 1:00 pm
How Me 'an Teddy Mapped San Juan Hill, By Kenneth J. Lanfear
Feature Extraction from Imagery, 1:00 pm – 2:55 pm, By eCognition
ArcSDE for SQL Server, 1:00 pm – 5:10 pm, By Tom Murray
Geoprocessing in ArcGIS, 1:00 pm – 2:55 pm, By Corey Tucker
Global Positioning Systems (GPS) 101, 1:00 pm – 2:55 pm, By Steve Reiter and Joseph Kerski
Surface Interpolation, 3:15 pm – 5:10 pm, By Steve Lynch
PLTS Data Creation Tools, 3:15 pm – 5:10 pm, By Jonathan Weaver
Fly Through Your Data, 3:15 pm – 5:10 pm, By Tamrat Belayneh
GPS for GIS, 3:15 pm – 5:10 pm, By Steve Reiter and Joseph Kerski
Land and Water Characterization, 1:00 pm – 5:10 pm, moderated by Carma San Juan and Stephen J. Char
Land Cover TRENDS Project Results for the Puget Lowland Ecoregion, By Daniel G. Sorenson
Referencing and Analyzing Stream Gages to the National Hydrography Dataset, By David Buchholz
Ecosystem Mapping, By Roger Sayre
GIS Interoperability and Standards, 1:00 pm – 2:55 pm, By Jeanne Foust
GIS Supporting Decisions, 1:00 pm – 2:55 pm, moderated by Mike Mulligan
USGS/NPS Vegetation Mapping supporting management decisions in the Parks, By Karl Brown
GIS System Design, 1:00 pm – 2:55 pm, By David Peters
ArcGIS Data Interoperability Extension, 3:15 pm – 5:10 pm, By Don Murray
Round Table Discussion, 3:15 pm – 5:10 pm, moderated by Mike Mulligan
Presentation Titles and Abstracts for Friday, April 28, 2006
Introduction to Geostatistical Analyst, 8:00 am – 9:55 am, By Steve Lynch
ArcSDE for Oracle, 8:00 am – 11:55 am, By Tim Clark
Making Maps with ArcGIS, 8:00 am – 9:55 am, By Heather Paskevic
GPS for GIS, 8:00 am – 9:55 am, By Steve Reiter and Joseph Kerski
ArcGIS 3-D Analyst, 10:10 am – 11:55 am, By Steve Kopp and Tamrat Belayneh
Surface Interpolation, 10:10 am – 11:55 am, By Steve Lynch
ArcPad 7, 10:10 am – 11:55 am, By Finn Dahl
Feature Extraction from Imagery, 1:05 pm – 3:00 pm, By eCognition
ArcGIS Spatial Analyst, 1:05 pm – 3:00 pm, By Steve Kopp
PLTS Data Creation Tools, 1:05 pm – 3:00 pm, By Jonathan Weaver
Fly Through Your Data, 1:05 pm – 3:00 pm, By Tamrat Belayneh
The U.S. Geological Survey’s (USGS) Sixth Biennial Geographic Information Science Workshop, April 24-28, 2006, at the
Several prominent speakers are featured at this Workshop. On Monday evening, guest speaker and National Aeronautics and Space Administration (NASA) astronaut Captain Dominic Gorie will talk about his experiences as a veteran of three space flights and more than 32 days in space, including the NASA Space Shuttle Radar Topography Mission that mapped more than 47 million square miles of the Earth’s land surface. Selected as an astronaut candidate by NASA in December 1994, Captain Gorie is currently Chief of the Astronaut Shuttle Branch. Monday evening also features a town hall meeting with Geographic Information Office (GIO) leaders Karen Siderelis, Kevin Gallagher, Bob Pierce, Steve Guptill, Mark DeMulder, John Mahoney, and Mark Negri, who will discuss changes and activities within the GIO in an open discussion format.
Tuesday plenary sessions feature keynote speaker Dr. P. Patrick Leahy, Acting USGS Director. Dr. Leahy holds undergraduate and graduate degrees in geology (1968) and geophysics (1970) from
The purpose of this proceedings volume is to serve as an activity reference for Workshop attendees as well as an archive of technical abstracts submitted, presented, and discussed at the Workshop. Author, co-author, and presenter names, affiliations, and contact information are listed with presentation titles along with submitted abstracts. Some hands-on sessions are offered twice. In these instances, abstracts submitted for publication are presented in the proceedings on both days they are offered. All acronyms used in these proceedings are explained in the text of each abstract. The term “ArcGIS” refers to an integrated collection of GIS software products produced by Environmental Systems Research Institute, Inc. (ESRI).
The Workshop schedule is presented in the table of contents and in tables 1-5. Tables 1-5 contain a complete list of activities and specialty meetings, including the time, building, and room locations of scheduled events. Morning plenary sessions are held Monday through Thursday and focus on changes within the USGS, trends in GIS, extraterrestrial GIS, data visualization, hazards, health, data standards, enhancements to the National Hydrography Dataset (NHDPlus), GIS partnerships, remote sensing, and USGS geospatial liaisons.
Concurrent hands-on and lecture sessions occur each day after the morning plenary sessions. Plenary and lecture sessions will not be held on Friday; however, several hands-on sessions are scheduled. Lecture sessions are approximately 25 minutes in length with 5 minutes for discussion. Hands-on sessions are of variable length and cover a variety of topics, including but not limited to: the availability and use of national-scope data, GIS system administration and design, web-based GIS data dissemination, metadata generation, geoprocessing, land and water characterization, GIS-integrated Decision Support Systems (DSS), GIS and public health, image processing, new tools, data sharing, cartography, hazards, modeling, and a variety of USGS programs related to geospatial data.
Several additional topical meetings are scheduled during lunch breaks and in the evenings. These meetings include discussions on USGS Geospatial Liaisons, stream statistics and characterization (StreamStats), Enterprise GIS (EGIS), LIght Detection And Ranging (LIDAR), and geoprocessing. A poster session is held on Tuesday evening from 5:15 pm to 7:30 pm, and awards for various categories are presented at the Thursday morning plenary session. Please see the workshop schedule in Table 1 for details of these and other specialty meetings.
Federal Departments and Agencies:
Centers for Disease Control and Prevention (CDC)
National Aeronautics and Space Administration (NASA)
National Oceanic and Atmospheric Administration (NOAA)
Coastal Services Center
National Geophysical Data Center
Animal and Plant Health Inspection Service (APHIS): Veterinary Services, Centers for Epidemiology and Animal Health
Fish and Wildlife Service (FWS)
National Park Service (NPS)
Universities:
Commercial:
eCognition
Data East, LLC
Definiens, Inc.
Digital Globe
Dynamic Graphics
Environmental Systems Research Institute, Inc. (ESRI)
GeoEye
GCS Research LLC
IGIS Technologies, Inc.
Leica Geosystems
Merrick & Company
Rockware, Inc.
RSI, Inc.
Safe Software
Sanz Geospatial Solutions Group
Solid Terrain Modeling
SPOT Image Corporation
We would like to thank the many scientists whose contributions and accomplishments are reflected in these proceedings, as their efforts ensure continued success for the USGS. We would like to acknowledge Valerie Gaine, James Gerhart, Andrew LaMotte, and David Litke for their review comments, and Betzaida Reyes for her assistance with the layout of this manuscript. Thanks are extended to the
Workshop Coordinator:
Jennifer Sieverling
USGS Discipline Coordinators:
Biology: Mike Mulligan
Geography: Steve Helterbrand
Geology: James (Luke) Blair
Geospatial Information Office: Barb Ray
Hydrology: John Brakebill
Topic Specialists:
Yvonne Baevsky (In the News: GIS and Public Health)
Steve Char (Land and Water Characterization)
Pete Chirico (Remote Sensing)
Jacque Fahsholtz (Server Technology)
Catherine Costello (Remote Sensing)
Joseph Kerski (Education)
Bill Oatfield (Systems Support)
Curtis Price (Geoprocessing and Analysis)
Barbara Ray (NSDI Partnerships)
Carma San Juan (Land and Water Characterization)
Silvia Terziotti (LIDAR)
Roland Viger (Programming and Scripting)
Extraterrestrial GIS is simply the application of GIS technologies to study planetary bodies other than the Earth. Since the mid-1990s, the Astrogeology Team of the USGS has been utilizing GIS applications (e.g., ESRI’s Arc/Info) for planetary data creation and research (Carr, 1995; Hare, 2003). Like many research teams within the USGS, we have persistently followed the technical advances in GIS and, fortunately, the overall experience has been positive. The technical advances made in this field during the last decade are phenomenal. Here I will briefly describe a few avenues the Astrogeology Team has been pursuing regarding this technology.
Sharing extraterrestrial data across multiple GIS applications has proven problematic because support for defining planetary coordinate reference systems (CRSs) in standardized GIS file formats and specifications, such as GeoTIFF and the Web Map Service (WMS), is limited. For a little over a year we have worked with the Open Geospatial Consortium (OGC) to help resolve this. Thus far we have researched our options with the help of OGC members and have begun to implement methods to resolve the issues, including solutions for WMS, GeoTIFF, JPEG2000, GML, and other data standards (Hare, 2006).
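To make the issue concrete, here is a minimal sketch, assuming GDAL's Python bindings, of how a user-defined planetary CRS can be written as well-known text (WKT) and loaded for testing. The WKT is illustrative rather than the OGC proposal itself; the radius is the IAU 2000 Mars value, and the names are placeholders.

    # Minimal sketch: define a Mars geographic CRS as user-defined WKT and
    # load it with GDAL's osr bindings (an assumed test setup, not the
    # actual OGC-proposed encoding).
    from osgeo import osr

    mars_wkt = (
        'GEOGCS["Mars 2000",'
        'DATUM["D_Mars_2000",'
        'SPHEROID["Mars_2000_Sphere",3396190.0,0.0]],'  # IAU 2000 radius (m)
        'PRIMEM["Reference_Meridian",0.0],'
        'UNIT["Degree",0.0174532925199433]]'
    )

    srs = osr.SpatialReference()
    srs.ImportFromWkt(mars_wkt)       # returns 0 on success
    print(srs.ExportToPrettyWkt())    # confirm the planetary CRS parsed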
Since 1999, we have hosted the PIGWAD, or “Planetary Interactive GIS-on-the-Web Analyzable Database,” website, serving various planetary datasets and tools for the Moon, Mars, Venus, Titan, and several Jovian satellites. The original solution we used was ArcView Internet Map Server (IMS), but we have since switched to ESRI’s newer ArcIMS and are currently testing ArcMap Server as well as open-source WMS solutions (e.g., MapServer). One of the most attractive aspects of the original ArcView IMS and ArcMap Server is the ability not only to host the data but also to host robust GIS functionality. For example, using ArcView IMS, we previously maintained a site that helped researchers analyze suitable landing sites for the Mars Exploration Rovers. The most recent on-line GIS application we have implemented, using ArcMap Server, includes a set of Mars crater density tools (Hare, 2006). We are also working with the Jet Propulsion Laboratory (JPL) on creating a planetary WMS server based on JPL’s OnEarth Landsat server. This enhanced server will also have powerful Web Coverage Service (WCS) and possibly Web Processing Service (WPS) capabilities (Dobinson, 2006).
In conclusion, we will continue to see growing uses for GIS technologies and spatial analysis in extraterrestrial research, mission planning, and mission support tasks.
References:
Carr, M.H., 1995, The Martian drainage system and the origin of valley networks and fretted channels, Journal of Geophysical Research, v. 100, p. 7479-7507.
Hare, T., et al., 2003, GIS 101 for planetary research, ISPRS Working Group IV/9 Extraterrestrial Mapping Workshop, “Advances in Planetary Mapping 2003.”
Hare, T., et al., 2006, Standards Proposal to Support Planetary Coordinate Reference Systems in Open Geospatial Web Services and Geospatial Applications, Lunar and Planetary Science Conference XXXVII, abs. 1931.
Hare, T., et al., 2006, Mars Crater Density Tools: Project Report, Lunar and Planetary Science Conference XXXVII, abs. 2398.
Dobinson, E., et al., 2006, Adaptation and Use of Open Geospatial Web Technologies for Multi-Disciplinary Access to Planetary Data, Lunar and Planetary Science Conference XXXVII, abs. 1463.
The web has made access to geospatial information continuous and dynamic, and 3D planetary visualization provides the natural context for making those data engaging. The freely available terabytes of data needed for the basic planetary visualization do not even begin to address the need to intelligently access the remaining petabytes of geospatial information. But the web makes this all possible. The web not only provides the opportunity for information retrieval from an expanding universe of data, it also allows this to happen at the very moment the data arrive. That’s like being able to witness the very edge of an expanding universe at any moment in time. In our case, the expanding universe is composed of geospatial information.
Visualization of geospatial information is the easy part. The hard part is the ‘intelligence’ needed to readily find the desired data (information retrieval) and then quickly analyze it. This requires additional tools: information-retrieval tools to acquire the desired data, and analytical tools to manipulate and understand that data.
Consider a Web GIS success story in the making, one that could save thousands of lives and billions of dollars. How? Simply by having broad-based and immediate access to information that already exists. Consider a tsunami early warning system. One small plug-in application to a broad-based (free) planetary visualization tool, be it from ESRI, NASA, Google, or others, could listen to a server that raises an alarm when suspect seismographic data are received. This same server can broadcast wave height and speed as transmitted from oceanic buoys. The arrival time and expected wave run-up can be visually delivered (no translation required) to the entire world immediately, thereby maximizing the time available for response. Here is a case where geospatial information goes directly into the hands of the people who need it the most.
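As a hedged sketch of the plug-in idea, the listener below polls a hypothetical JSON alert feed and hands new events to a visualization layer; the URL, message fields, and polling interval are all assumptions, not part of any existing warning system.

    # Sketch of an alert listener for a visualization plug-in; the feed URL
    # and record fields are hypothetical.
    import json
    import time
    import urllib.request

    FEED_URL = "https://example.org/tsunami/alerts.json"

    def poll_alerts(interval_s=60):
        seen = set()
        while True:
            with urllib.request.urlopen(FEED_URL) as resp:
                for event in json.load(resp):
                    if event["id"] not in seen:
                        seen.add(event["id"])
                        # Hand off to the 3D viewer, e.g., draw predicted
                        # arrival time and wave run-up at this location.
                        print(event["lat"], event["lon"], event["wave_height_m"])
            time.sleep(interval_s)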
Free is good, but open source may be even better. One way to stimulate the entrepreneurial spirit and engender solution-driven commercial enterprise is with an open source and open standards visualization platform. Those who build the intelligence tools as proprietary plug-ins will now have the broadest market possible! Highly specialized needs can still be met by in-house experts coupled with a software engineer. The graduate student can more effectively challenge the frontiers of science. And anybody who has a need to understand or communicate geospatial data has the wherewithal to do so.
The ArcHydro Tools within ArcGIS provide a convenient, well-integrated means of developing an interactive watershed delineation and characterization environment using digital elevation models (DEMs) and other geospatial data. The power and flexibility of the ArcHydro Tools, however, come at the expense of considerable complexity. In this hands-on workshop we will cover how to:
1. Preprocess DEM data for use with the ArcHydro Tools, including techniques for enforcing hydrologic drainage networks and watershed boundaries (a sketch of this step follows the next paragraph).
2. Preprocess other data needed by the ArcHydro Tools.
3. Delineate local watersheds and compute built-in watershed characteristics for areas small enough to manage datasets as single units.
4. Modify the ArcHydro configuration to compute other watershed characteristics.
5. Set up a global watershed delineation and characterization environment for areas too large to manage with single datasets.
In addition, we will provide an overview of Geodatabase concepts and will offer participants the chance to gain hands-on experience using Geodatabase editing tools, topology, and geometric networks. The workshop materials are adapted from a one-week StreamStats Data Preparation Workshop that was offered for Water Science Centers implementing StreamStats. For more information on StreamStats, see URL: http://streamstats.usgs.gov/
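For orientation, here is a minimal sketch of the DEM preprocessing step (item 1 above) expressed with today's arcpy Spatial Analyst tools; the workshop itself works through the ArcHydro Tools interface, and the workspace and raster names below are placeholders.

    # Sketch of basic DEM preprocessing: fill sinks, then derive flow
    # direction and flow accumulation grids (placeholder paths/names).
    import arcpy
    from arcpy.sa import Fill, FlowDirection, FlowAccumulation

    arcpy.CheckOutExtension("Spatial")
    arcpy.env.workspace = r"C:\workshop\data.gdb"

    filled = Fill("raw_dem")         # remove spurious sinks from the DEM
    fdr = FlowDirection(filled)      # D8 flow-direction grid
    fac = FlowAccumulation(fdr)      # contributing-area grid

    fdr.save("fdr")
    fac.save("fac")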
RockWorks 2006 is the latest version of an integrated collection of programs that are designed for the management, analysis, and graphical display of geological data. This abbreviated hands-on course will focus on borehole and measured section data including lithology, stratigraphy, geochemistry, geophysics, fractures, and water levels. Particular attention will be devoted to generating striplogs, cross-sections, profiles, fence-diagrams, and block models. Other topics include volumetric calculations, gridding, solid-modeling, and logical operations.
Ecosystem processes in conifer forests are impacted by fire when living vegetation is consumed and nutrients and cations in soils are increased by deposition of ash and charred organic matter and by litterfall from scorched trees. We analyzed high-spatial-resolution (2.4-m pixel size) Airborne Visible and Infrared Imaging Spectrometer (AVIRIS) data to map post-fire surface cover, including ash, soil minerals, scorched conifers, and green vegetation, on the Cerro Grande fire, which occurred near Los Alamos, New Mexico.
A surface cover map was also made using Landsat TM data and a maximum likelihood, supervised classification. When compared to the AVIRIS map, the Landsat classification map grossly overestimated cover by the scorched conifer and ash classes and severely underestimated soil and green vegetation cover. The single, broad Landsat band in the 2 to 2.5 micron region was not sufficient to discriminate between lightly scorched and unaffected conifers or to detect clay minerals in soils.
In a comparison of AVIRIS surface cover to the Burned Area Emergency Rehabilitation (BAER) map of burn severity, the high burn severity areas did not capture the variable patterns of post-fire surface cover by ash, soil, and scorched conifers seen in the AVIRIS map. The BAER map, derived from air photos, also did not capture the distribution of scorched trees that were observed in the AVIRIS map. The Landsat-derived burn severity map, generated from the differenced Normalized Burn Ratio (NBR) calculation, portrayed more variability in burn severity but had twice as much area classified as moderate severity when compared to the area covered by scorched trees. Burn severity and surface cover images were found to contain complementary information, with NBR presenting an image of the degree of fire’s transformation of pre-fire surface cover and the AVIRIS-derived surface cover showing the end-state of that transformation.
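For reference, the Normalized Burn Ratio is computed from near-infrared (NIR) and shortwave-infrared (SWIR) reflectance (Landsat TM bands 4 and 7), and the differenced NBR subtracts the post-fire value from the pre-fire value; this is the standard formulation rather than anything specific to this study.

    NBR  = (R_NIR - R_SWIR) / (R_NIR + R_SWIR)
    dNBR = NBR_prefire - NBR_postfire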
Because a level of seismic hazard exists everywhere, GIS applications have become important tools for scientific research and educational outreach related to seismic hazards. Much of the GIS data we use is generated in-house in the form of current and historic earthquake catalogs and maps (global coverage), calculated (probabilistic) earthquake hazard maps for any region or area, a database of geologically active (Quaternary) faults within the US, and information on earthquake-related hazards such as landslides. In addition, we utilize a wide array of GIS products, data, and services available from other institutions and agencies such as the University of Memphis Center for Earthquake Research and Information. We are proud to collaborate with other governmental agencies and educational institutions, for example,
The increased incidence of catastrophic wildfires in the western United States and the encroachment of human development into fire-prone ecosystems have created a critical need for methods to quantify potential hazards posed by debris flows produced from burned watersheds. Debris flows can be one of the most hazardous consequences of rainfall on recently burned hillslopes. Empirical models developed to estimate the probability of post-wildfire debris-flow activity and the magnitude of the response can be quickly implemented on a GIS platform to generate debris-flow hazard maps following wildfires. A model for the probability of debris-flow production from individual drainage basins was developed using logistic regression analyses on a database from 401 basins that were burned by 15 recent fires located throughout the U.S. Intermountain West. The model describes debris-flow probability as a function of readily obtained measures of burned extent, soil properties, basin gradient, and rainfall from short-duration convective rainstorms. In addition, a model for estimating the volume of material that may issue from a basin mouth was developed using a series of multiple regression analyses on a database from 56 basins burned by eight fires. The model describes debris-flow volume as a function of basin area, gradient, burned extent, and storm rainfall. These models are readily implemented in a GIS to produce hazard maps that identify those basins most likely to produce the largest events. The probability and volume maps can be combined using a simple relational algorithm to provide a relative hazard ranking for each burned basin, thus providing critical information for post-fire mitigation decisions and evacuation planning.
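The logistic form that underlies such a probability model is standard: with x a linear combination of the fitted predictors (here, the burned-extent, soil, gradient, and rainfall measures, with coefficients b0...bk from the regression analyses described above),

    x = b0 + b1*x1 + b2*x2 + ... + bk*xk
    P = e^x / (1 + e^x)

so that P, the estimated probability of debris-flow occurrence, always falls between 0 and 1.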
Probabilistic tsunami hazard mapping is performed at
The U.S. Geological Survey National Water-Quality Assessment (NAWQA) Program requires the landscape near water sampling sites to be characterized to assist in interpretation of water-quality data. The areas to be characterized may be represented as either simple polygon features (watersheds or aquifer areas) or as buffer polygon features calculated from point feature locations (such as sampled wells or springs). Specific landscape information used to characterize these areas (for example, population or estimated pesticide use) usually is reported as attributes of other polygon features such as county or Census block group boundaries. Thus, the required geographic information system (GIS) analysis involves the area-weighted transfer of attributes from “source” polygon features (county or block group areas) to “target” polygon features (watersheds or well buffers). The GIS processing includes overlay of these data features to develop area weights to estimate values for the target areas.
A set of GIS scripts has been developed to automate the transfer of polygon attributes from one set of polygon features to another. These tools automate the creation of area-weight tables that record the results of the overlay process. These weight tables can then be used to efficiently transfer many attributes (for example, pesticide application data for many compounds recorded by county) from source polygon features to target polygon features using the stored results of a single GIS overlay operation.
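A minimal sketch of the area-weighted transfer itself, written with the geopandas library rather than the ArcGIS-based toolbox, and with hypothetical file and column names; the weight for each intersection piece is the fraction of the source polygon it represents.

    # Sketch of area-weighted attribute transfer (illustrative only; the
    # NAWQA toolbox implements this inside ArcGIS).
    import geopandas as gpd

    src = gpd.read_file("counties.shp")     # source polygons with attributes
    tgt = gpd.read_file("watersheds.shp")   # target polygons
    src["src_area"] = src.geometry.area

    pieces = gpd.overlay(tgt, src, how="intersection")
    pieces["weight"] = pieces.geometry.area / pieces["src_area"]
    pieces["pest_part"] = pieces["pesticide_kg"] * pieces["weight"]

    # Weighted county totals summed per target watershed.
    result = pieces.groupby("huc_id")["pest_part"].sum()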
The U.S. Geological Survey Central Energy Resources Team (CERT) Oracle database (1) contains oil, gas, and coal related data in support of domestic and international energy resource surveys and analyses; and (2) serves as CERT’s primary spatial database, which is accessed through a variety of desktop and web applications using Environmental Systems Research Institute’s (ESRI) ArcSDE middleware product. The database system was designed and implemented using, in addition to ArcSDE, an Oracle database management system and a Red Hat Linux operating system on a Dell server hardware platform. After prototyping the Oracle, ArcSDE, and Red Hat Linux operating system on the Dell server, the architecture was documented and placed in a production environment. In more than a year of operation, this architecture has proven to be a cost-effective way to deliver enhanced functionality with excellent performance while supporting preexisting applications. The documentation developed as part of the CERT database system provides users with a “jump start” guide for the migration to a Dell-Linux-Oracle-ArcSDE environment. Additional benefits of this migration have been the review and documentation of thousands of ArcSDE layers and the development of performance metrics based on the ArcGIS TOFINO tool set extension.
As part of the Pennsylvania State Water Plan Update, water availability needs to be assessed in watersheds across the State.
The Tool is based on the ArcHydro data model and is supported by a hydrologically-enforced digital elevation model (DEM) for all drainage basins flowing into the State. ArcObjects were used to develop the Screening Tool and give the cooperator flexibility in generating scenarios and updating the State's water-use database.
The U.S. Geological Survey National Water Information System (NWIS) contains a wealth of ground-water, surface-water, and water-quality data that are available to the public on the national website (NWISWeb). However, querying data from the NWISWeb is difficult when using geographic information systems such as ArcMap. To simplify this task, an ArcMap tool was developed to query and download data from the NWISWeb for a selected map extent. Data are downloaded for the included sites and a temporary shapefile is created. The shapefile is displayed on the map with a hotlink that connects to the NWISWeb “Site Description” page for each site. This tool also allows the user to select a site and explore ground-water, surface-water, and (or) water-quality data related to the site, through the hotlink to NWISWeb. These data can then be downloaded in a tabular format for use in ArcMap.
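The flavor of the underlying query can be sketched against today's public NWIS web services; the URL, parameters, and output format below reflect the current API as an assumption (the 2006 tool itself was built inside ArcMap).

    # Sketch: request all NWIS sites inside a map extent as tab-delimited
    # (RDB) text; the service URL and parameters are the current public
    # API, used here as an assumption.
    import urllib.request

    west, south, east, north = -77.5, 39.0, -76.5, 40.0   # map extent
    url = ("https://waterservices.usgs.gov/nwis/site/"
           f"?format=rdb&bBox={west},{south},{east},{north}")

    with urllib.request.urlopen(url) as resp:
        rdb = resp.read().decode()
    print(rdb.splitlines()[:5])   # header lines plus the first sites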
With the advent of more sophisticated handheld and tablet computers, the ability to enter ground-water levels and site information directly into a digital format in the field has become an attractive option. One of the first uses of this new technology was the development of the Multi Optional Network Key Entry System (MONKES), which was initially developed to enter ground-water level data into the USGS National Water Information System (NWIS) from files created on Pocket PC handheld devices. In 2002, MONKES2 was developed to help people canvass and establish new ground-water sites using the Pocket PC. Because of the success of the MONKES project, the California Ground-Water Ambient Monitoring and Assessment (GAMA) Program wanted to take MONKES2 to the next level. Preliminary work indicated that the screen on a Pocket PC was too small to enter site location information easily. Computing advances and the advent of Tablet PCs presented a solution to this drawback.
As a result, the Alternate Place Entry (APE) Form was created to canvass and create new ground-water site batch files on the Tablet PC, which has the advantages of a larger screen, more powerful processors, and more memory in both RAM and hard drive than its Pocket PC predecessor. In addition, the Tablet PC runs Windows XP Tablet Edition, giving it the ability to run most Windows XP programs and therefore to connect with most peripherals that can be attached to a computer. Despite these advantages, the Tablet PC has some disadvantages: it costs more than the Pocket PC, boots up more slowly, and is larger and heavier. For GAMA’s work, however, the Tablet PC’s advantages outweighed its disadvantages, and APE was developed.
The current version of APE allows the site location data to be entered electronically. With the help of Geographic Information System (GIS) technology embedded in the APE program, some NWIS fields, such as county name, USGS topographic map name, and hydrologic unit code, can be automatically populated from the latitude and longitude values. The 8-page Ground-Water Site Schedule (form 9-1904-A), the California Ground-Water canvass sheet, site photos, and a site sketch map can all be printed directly from APE. Finally, the latitude and longitude of the site can be validated with the help of the embedded GIS, and a batch file created for uploading the new ground-water site into NWIS.
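The embedded-GIS lookup amounts to a point-in-polygon test against reference layers; here is a minimal sketch using the fiona and shapely libraries (file and field names hypothetical, and not the actual APE implementation).

    # Sketch: derive a county name from latitude/longitude by
    # point-in-polygon lookup (illustrative names only).
    import fiona
    from shapely.geometry import Point, shape

    def county_for(lon, lat, counties_path="counties.shp"):
        pt = Point(lon, lat)
        with fiona.open(counties_path) as counties:
            for rec in counties:
                if shape(rec["geometry"]).contains(pt):
                    return rec["properties"]["NAME"]
        return None

    print(county_for(-121.5, 38.6))   # hypothetical California site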
To characterize stream habitat for the National Water-Quality Assessment Program (NAWQA), stream transect data are often collected using survey equipment and recorded as X, Y, Z values and a description of the point in ASCII text format. Transect and reach characterization parameters are then calculated by copying-and-pasting these data into a spreadsheet containing formulas. Since not all reach surveys have the same number of survey points, the spreadsheet layout and formulas must be modified each time, which increases the probability of error. Additionally, parameters calculated in a spreadsheet cannot be easily imported to a database, making information more difficult to retrieve and analyze. Utilizing ArcGIS geoprocessing functionality and the Python scripting language, a program was written to produce a 3-dimensional point feature class of surveyed cross-section data. The program allows the user to translate survey data from ASCII file format to a feature class and assign attributes to the feature class based on the original point description recorded in the field. Display in ArcMap allows the feature class attributes to be verified in a spatial environment. Attribute errors are corrected using basic ArcGIS Editing functions. Transect and reach characterization parameters are then calculated from the edited point feature class using a second Python script and output in a database-compatible format.
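A minimal sketch of the first step (translating ASCII survey points into a 3-D point feature class) in current arcpy terms; the 2006 script used the older geoprocessor object, and the paths, field names, and spatial reference here are placeholders.

    # Sketch: load X,Y,Z,description records into a Z-enabled point
    # feature class (placeholder paths and fields).
    import arcpy

    fc = arcpy.management.CreateFeatureclass(
        r"C:\habitat\survey.gdb", "xsec_points", "POINT",
        has_z="ENABLED",
        spatial_reference=arcpy.SpatialReference(26918),  # e.g., NAD83 UTM 18N
    )
    arcpy.management.AddField(fc, "DESCRIP", "TEXT", field_length=32)

    with arcpy.da.InsertCursor(fc, ["SHAPE@XYZ", "DESCRIP"]) as cur:
        with open(r"C:\habitat\transect1.txt") as f:
            for line in f:
                x, y, z, desc = line.split(",")[:4]
                cur.insertRow([(float(x), float(y), float(z)), desc.strip()])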
High-resolution, geo-rectified imagery in digitized format is often difficult to visualize without the expertise and availability of sophisticated GIS software. Google Earth is a new interactive 3D visualization tool for personal computers that combines satellite imagery and maps from Google's database. Google Earth software can also be used to visualize raster imagery and GIS data from other sources. Formatted data, hosted on a web server, can be shared with a network of end users running the Google Earth client software. Google Earth’s intuitive interface allows everyone in the organization to interact meaningfully with technical datasets without any expensive, time-consuming formal training. We present methods to ingest layers of GIS data products, such as LIDAR DEM imagery and airborne photography, into Google Earth. We demonstrate the ability to use Google Earth with several hundred gigabytes of data acquired from a post-Hurricane Katrina survey over the Gulf Islands National Seashore in
The rapidly changing landscape of Internet GIS technologies has made it faster, easier, and more cost effective to deliver location-based scientific information over the web than with traditional internet mapping technologies. Complicated, multitiered, and “finicky” internet mapping applications are no longer needed in many cases, given the maturation of service-based technologies such as Google Maps. This presentation provides a technological overview of Google Maps and discusses their implementation and potential application by USGS scientists. An example application will be given, illustrating the registration, construction, and display of a simple “mashup”.
The
Much progress has been made by USGS in making vast geospatial data resources available on-line to the public via web-based GIS applications. An area that remains less explored is the melding of these web technologies with large data warehouses to offer decision support and modeling capabilities. This framework seeks to provide an extensible platform from which applications can be built that (1) offer form-based selections to filter data by theme or spatial extent, (2) integrate statistical methods to enhance visual display, and (3) serve base maps from existing web services to provide spatial context and comparisons selectable through familiar web form controls.
The main design criteria for the framework include the following characteristics. Geographically relevant Web mapping services (WMS) are offered as base map layers from sources including the National Map catalog. Using this framework, applications can be structured to meet users where they are in terms of web mapping familiarity by offering "saved" predefined themes that load up a package of base maps, thematic map instructions, and area of interest. For a selected theme and geographic region of focus, dynamically generated time-series plots can be created by the user to explore temporal variability and box-plots can be created to describe data distributions. Finally, the framework packages common export utilities including MS Excel and Google Earth KML. Google Earth exports preserve the symbology and point-based attributes of the theme.
The framework is built on the Java 2 platform using Oracle's MapViewer and is tailored to work best with data warehouses stored in the Oracle RDBMS version 9i and above. Standards-based, service-oriented approaches were applied, including J2EE, OGC web map services, the National Map WMS, XML, Ajax, XHTML, and HTML.
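As a hedged illustration of the Google Earth export, the snippet below writes a single KML placemark that carries both a point attribute and its symbology; the site, attribute, and style values are hypothetical, not output of the actual framework.

    # Sketch: write a styled KML placemark with an attached attribute
    # (all names and values hypothetical).
    kml = """<?xml version="1.0" encoding="UTF-8"?>
    <kml xmlns="http://www.opengis.net/kml/2.2">
      <Document>
        <Style id="site"><IconStyle><color>ff0000ff</color></IconStyle></Style>
        <Placemark>
          <name>Site 01646500</name>
          <styleUrl>#site</styleUrl>
          <ExtendedData>
            <Data name="nitrate_mg_L"><value>1.4</value></Data>
          </ExtendedData>
          <Point><coordinates>-77.12,38.95,0</coordinates></Point>
        </Placemark>
      </Document>
    </kml>"""

    with open("export.kml", "w") as f:
        f.write(kml)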
With 15 years of experience using GIS software from ESRI, a background in html and website construction, an ancient background in computer programming, and a general “can do” attitude, I optimistically approached implementing an interactive map service for the United States Quaternary Fault and Fold Database project for the Earthquake Hazards Team. With the tender loving care of GIS professionals in the USGS and ESRI, system administrators to handle web delivery and security issues, and years of practice, we started serving the IMS to the public in January 2004. By January 2006 we were serving geodatabases through Oracle, ArcSDE, a PC server, multiple firewalls, and the National Web Server System (NatWeb).
The Good? The software now works as advertised. Users can get information off the web about faults that are potentially earthquake-causing. Users can download the spatial and textual databases into a variety of applications. The software will keep running for months with no attention, even while you edit the databases behind it.
The Bad? Attention to detail is critical at every step of the way. You can end up pulling your hair out over misplaced semicolons and spaces. Sometimes you can edit your AXL file while the Administrator is running the project, save, and refresh the map service; at other times you have to stop and delete the service before you can save an edited file. It is terribly frustrating when you can’t get the software to act in a consistent manner.
The Ugly? Even if you do everything right, the software is so complex, it can still come crashing down around your ears and send you home hoping that “tomorrow is another day” (Scarlett in Gone With the Wind). And sometimes a reboot of all systems does fix the problem.
In fiscal year 2005, USGS entered into a Cooperative Agreement with the
High-resolution LIDAR, orthorectified digital imagery, and a fused product of these data are being exploited over part of the Gunnison Gorge National Conservation Area (GGNCA) in western Colorado to fulfill several research goals: 1) evaluation of advanced sensor data fusion capabilities; 2) LIDAR feature extraction applications from data fusion products; and 3) topographic, geomorphologic, geologic, and biologic science requirements and land management needs of the Mancos Shale Landscapes interdisciplinary project involving the U.S. Geological Survey (USGS), Bureau of Land Management (BLM), and several other federal, state, and local groups. Using the LIDAR elevation data, USGS scientists are quantifying high-resolution areas of unique slope and aspect into highly accurate polygons of similar geomorphologic characteristics. USGS and BLM hydrologists and soil scientists plan to use these geomorphologic units and fusion-extracted features (vegetation, trails, and disturbed surfaces) in runoff, sedimentation, and vegetation models for the Mancos Shale selenium transport studies on the
NASA's Experimental Advanced Airborne Research LIDAR (EAARL) is a raster-scanning, temporal-waveform-resolving, green-wavelength LIDAR designed to map nearshore bathymetry, topography, and vegetative structure simultaneously. The EAARL sensor records the time history of the return waveform within a small footprint (15-20 cm at nominal flying altitude of 300 m) for each laser pulse, enabling characterization of canopy structure and 'bare earth' under a variety of vegetation types. EAARL data acquired over the coastal vegetated communities at Assateague Island National Seashore (ASIS) in Maryland, and Terra Ceia Preserve at the southeast coast of Tampa Bay, Florida, were used to evaluate the capability of LIDAR data to determine the vertical distribution of canopy and sub-canopy across a diverse set of vegetation classes. A collection of individual waveforms combined within a synthesized large footprint was used to define four metrics: canopy height, “bare-Earth” elevation (BEE), canopy reflection ratio, and height of median energy. The metrics derived from these composite waveforms were tested for accuracy and reproducibility. BEE values were derived from the individual waveforms to limit the spreading of the ground return on steep slopes and enable the ability to distinguish between ground and low shrubs. Results show that combining several individual small-footprint laser pulses to define a composite “large-footprint” waveform is a possible method to describe the vertical structure of a vegetated canopy.
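A simplified numpy sketch of how metrics can be read off a composite waveform; the thresholding, bin-to-elevation mapping, and peak picking below are assumptions standing in for the actual EAARL processing chain.

    # Sketch: derive "bare-Earth" elevation (BEE), canopy height, and height
    # of median energy (HOME) from a composite waveform (simplified).
    import numpy as np

    def composite_metrics(comp, elev):
        # comp: summed return intensity per time bin (composite waveform)
        # elev: elevation (m) of each bin, descending from first to last
        sig = comp > 0.1 * comp.max()              # bins with significant energy
        first = int(np.argmax(sig))                # canopy top: first strong bin
        last = len(sig) - 1 - int(np.argmax(sig[::-1]))  # ground: last strong bin
        bee = elev[last]                           # "bare-Earth" elevation
        canopy_height = elev[first] - bee          # canopy top above ground
        cum = np.cumsum(comp) / comp.sum()
        home = elev[np.searchsorted(cum, 0.5)] - bee   # height of median energy
        return bee, canopy_height, home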
Since the California Gold Rush of 1849, over 80% of the tidal wetlands in the
The Geospatial Information Office (GIO) of the U.S. Geological Survey has assumed a leadership role within the Bureau in defining an overall Geospatial Information Science (GIScience) research agenda, in championing GIScience research as a component of the Bureau's science portfolio, and in conducting, supporting, and collaborating in research to address critical GIScience questions of importance to the USGS. Since the Bureau includes geography, water, geology, and biology disciplines, the role of the GIO in providing an integrating framework for information among these disciplines is an important element of GIScience at the USGS.
A Center of Excellence for Geospatial Information Science (CEGIS) will:
1) Provide leadership to identify, conduct, and collaborate on GIScience research issues of national importance.
2) Provide timely, efficient, and intelligent access to new and archived USGS geographic data needed to conduct science and support policy decisions.
3) Develop innovative methods of modeling and information synthesis, fusion, and visualization to improve our ability to explore geographic data and create new knowledge.
4) Develop credible and accessible geographic research, tools, and methods to support decision making related to the human and environmental consequences of land change.
5) Assess, influence, and recommend for implementation technological innovations for geospatial data and applications.
6) Maintain world class expertise and leadership, and a body of knowledge in support of the NSDI.
CEGIS will also conduct, support, and collaborate in research to address critical geographic information science questions of importance to the USGS as a whole and to the broader geospatial community. As an outgrowth of and complement to this research program, CEGIS will support and collaborate in technological innovations that further the implementation of the NSDI. A prioritized research agenda of GIScience issues of national importance will identify the most critical needs and provide a framework for future collaboration with other USGS disciplines and other government, academic, and industry partners. Since CEGIS expects to leverage resources through collaboration, the creation of this research agenda will be a joint undertaking among potential participants both within and outside the Bureau.
The Centers for Disease Control and Prevention (CDC) is using GIS to protect America's health.
The findings and conclusions in this presentation have not been formally disseminated by the Centers for Disease Control and Prevention/the Agency for Toxic Substances and Disease Registry and should not be construed to represent any agency determination or policy.
This workshop will provide an overview, demonstration, and hands-on exercise using Data East’s XTools Pro 3.1 software. All participants will become familiar with the capabilities of XTools Pro, and any questions about the software will be welcomed and addressed by the presenters.
Large volumes of solid, gaseous, or liquid materials that are of potential concern from an environmental or public health perspective are commonly produced by extreme natural or anthropogenic events such as earthquakes, volcanic eruptions, forest fires, urban fires, landslides, hurricanes, tsunamis and other floods, windstorms, building demolition, and building collapse. The USGS can play a unique role in rapid-response characterization of materials generated by these types of extreme events. A broad spectrum of analytical capabilities spanning USGS regions and disciplines can be applied to help emergency response authorities and the public health community in their initial HAZMAT (hazardous materials) assessments immediately following the events. However, more importantly and more uniquely, USGS expertise can also provide important insights into a) sources of the materials, b) spatial dispersal of materials into the environment, c) how the materials may respond to environmental processes, and d) processes by which the materials may influence toxicity to exposed humans and ecosystems. Geospatial data and GIS technologies are crucial throughout all phases of any rapid response assessment, ranging from the initial response planning through interpretation of results.
The USGS has recently funded a Venture Capital project to investigate the feasibility of establishing a formal Bureau rapid-response capability for characterizing the mineralogy, geochemistry, microbiology, and ecological and human toxicity of dusts, other airborne constituents, and sediments produced by catastrophic natural or anthropogenic events. This talk will use examples from past or ongoing USGS rapid response characterization efforts (2001 World Trade Center, 2004-2005 Mt. St. Helens eruptions, 2005 hurricanes Katrina and Rita) to help illustrate the USGS role, examine lessons learned, and underscore future opportunities for truly interdisciplinary collaboration, including integration of GIS expertise and technologies.
The U.S. Geological Survey (USGS) has an ongoing study to locate and characterize known deposits of natural asbestos in the
Naturally occurring asbestos (NOA) has been the focus of recent media attention in
To protect
The findings and conclusions in this presentation have not been formally disseminated by the Centers for Disease Control and Prevention/the Agency for Toxic Substances and Disease Registry and should not be construed to represent any agency determination or policy.
Background: Associations between adverse health effects and environmental exposures are difficult to study because exposures may be widespread, low dose, and common throughout the study population. Individual risk-factor epidemiology may not be able to initially identify an association. A series of multilevel, multidisciplinary studies, starting with an inter-region comparison for the purpose of hazard identification, may be required. Existing databases routinely collected by Federal agencies can be used for this purpose. Examples are provided in the following studies.
Methods: Information on mortality from ischemic heart disease and diabetes during 1979-1988 and 1989-1998 (underlying cause of death) was obtained from the
Results: Comparison of high- with low-wheat counties with adjustment for age, sex, year of death, and poverty index, showed that mortality from ischemic heart disease and diabetes was increased by 8% and 16%, respectively. Mortality from acute myocardial infarction, the major subgroup of ischemic heart disease, showed an increase of 20%. These results were statistically significant.
Conclusions: Because chlorophenoxy herbicides are among the most widely used herbicides in the
Disclaimer: This is an abstract of a proposed presentation and does not necessarily reflect EPA policy.
Epidemiologists, veterinary medical officers, and animal health technicians within Veterinary Services (VS) are actively utilizing global positioning system (GPS) technologies in the field for surveillance and emergency-response efforts. Several geospatial applications, including GPS receivers and GPS cameras, are used to obtain locational data on livestock and poultry operations throughout the United States.
The GIS and Geospatial Analysis group within the Center for Epidemiology and Animal Health (CEAH) has established minimum data accuracy standards for VS' GPS receivers and provides training to field personnel in data acquisition procedures to ensure that field-collected geographic coordinates are as accurate as possible. Coordinates collected in the field are validated by several methods, including overlaying coordinate points onto aerial photographs or geocoding facility addresses using a detailed road database, such as Tele Atlas.
This presentation will highlight and show examples of how VS is using geospatial analysis and modeling to support disease surveillance and risk assessment and to predict the spread of disease.
Although an epidemic/epizootic is a series of host-parasite events in time and space, epidemiological data are often counts of such events binned into regular periods and polygonal regions. We may therefore apply autocorrelation techniques to understand changes in intensity and variations in spatial complexity in order to model processes at local to global scales. I have developed an S-PLUS/ArcGIS system that uses yearly
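A standard measure for this kind of binned-count autocorrelation is Moran's I (named here as the generic technique, not necessarily the statistic implemented in the S-PLUS/ArcGIS system): with x_i the count in region i, xbar their mean, w_ij the spatial weight between regions i and j, and W the sum of all weights,

    I = (n / W) * [ sum_i sum_j w_ij (x_i - xbar)(x_j - xbar) ] / [ sum_i (x_i - xbar)^2 ]

Values near +1 indicate clustering of similar counts, values near -1 indicate dispersion, and values near -1/(n-1) indicate spatial randomness.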
The Center for LIDAR Information Coordination & Knowledge (CLICK) was created to facilitate data access, user coordination, and education related to raw point cloud light detection and ranging (LIDAR) data for scientific needs. CLICK provides an easy point of contact for partners and potential partners to coordinate efforts on LIDAR data collection and data availability, which reduces costs to all interested parties. This effort will complement the National Digital Elevation Program (NDEP), a consortium of Federal agencies working together to facilitate the collection and use of high-resolution elevation data, as well as the National Elevation Dataset (NED), by allowing access to more elevation data than only bare-earth digital elevation models (DEMs). All data collected through CLICK will feed NDEP discovery and acquisition activities and vice versa. The primary mission will not be purchasing LIDAR data, but collecting, processing, researching, distributing, and organizing LIDAR data that have already been acquired, as well as helping to coordinate potential future collections. The CLICK Web page (http://lidar.cr.usgs.gov) is a virtual center where scientists and managers can pose and answer LIDAR-related questions. The Web page consists of a data viewer, a bulletin board, and a page for peer-reviewed journal references and Web links to help users find their own solutions. This virtual center provides access, information, coordination, and training for using LIDAR data, calling upon the expertise and knowledge of LIDAR research and projects throughout the U.S. Geological Survey (USGS) and beyond. This centralized coordination tool will allow for use of data not traditionally available to scientists and managers due to cost, access, or simply lack of communication. CLICK will be a powerful tool not only for the USGS, but for anyone interested in any aspect of LIDAR data.
Internet map services in their current form provide geospatial scientific data in only minimally understandable ways. This is because web map systems have been developed with a primary focus on the geographic content of the information, and typically make the assumption that the user possesses complete scientific knowledge of the data through previous experience. To be usable by people who do not already have specialized knowledge of the data presented, the web map service needs to be coupled with informative reference services (in addition to formal metadata) and link directly to web services that facilitate downloading the data for further scientific processing, possibly outside the GIS environment.
The USGS Mineral Resources On-line Spatial Data web site shows ways in which these concerns can be addressed using a combination of GIS, web, and information technologies. Scientific data can be shown on maps using standard web browsers, accessed through GIS clients, and downloaded selectively using separate database services. All of these are tied together through consistency of terminology and ready explanations.
The USGS, National Geospatial Technical Operations Center (NGTOC), has developed four National vector datasets for the National Map: hydrography, transportation, structures, and governmental units. Leveraging resources of multiple agencies to design and build the datasets, a system containing over 50 million features is in place to begin a process of continuous improvement through transactional data exchanges with local, state, and regional cooperators. In addition to our traditional requirements for mapping and resource management, the USGS has worked with ESRI and other cooperators to design a data model that encompasses the core data considered critical to homeland security efforts for disaster planning and response. These four data themes are a subset of this model and the first national implementation of this approach. The system design is a geodatabase data model running on ArcSDE 9.1/Oracle 10G with interfaces in ArcIMS 9.1 for data viewing and data download and in ArcGIS 9.1 for data packaging to shapefile, geodatabase, or coverage formats. The data are developed using tools ranging from ArcView 3+ to ArcGIS 9.1 with the Data Interoperability extension. The system architecture involves proxy, web application, and spatial servers set up in parallel processing paths for failover and load balancing. The databases are distributed over multiple systems consisting of working databases for change management and a separate system optimized for data distribution with public access. Other developments around the corner include automated two-way data synchronization with local and regional data centers and implementation of remote standby databases and systems for more robust failover and recovery capabilities.
The minerals information mission of the U.S. Geological Survey is to collect, analyze, and disseminate information on the domestic and international supply of and demand for minerals and mineral-based materials essential to the U.S. economy and national security.
The USGS conducts more than 140 voluntary surveys on commodities ranging from abrasives to zirconium. These surveys cover both production and consumption of minerals and mineral materials and are sent to over 18,000 establishments. Publications, available at http://minerals.usgs.gov/minerals/, include Minerals Yearbooks, Mineral Commodity Summaries, Mineral Industry Surveys, Metal Industry Indicators, and Nonmetallic Mineral Products Industry Indexes. While proprietary data such as individual company production amounts cannot be released to the public, regionalized studies can be undertaken to show production at the state or county level. Thematic maps can be found in some of the Minerals Yearbook chapters. Internally, thematic maps of production were used in creating the generalized maps of major producing regions that appear in the Mineral Commodity Summaries. Spatial data are published through the National Atlas (http://www.nationalatlas.gov), through the Mineral Resources On-Line Spatial Data Website (http://mrdata.usgs.gov), and through two CD-ROMs: the Aggregates Industry Atlas of the United States CD-ROM Version 2, a cooperative project with the National Stone, Sand and Gravel Association, and Minerals in Your World Version 2, a cooperative project with the Mineral Information Institute.
The active mines and plants data set produced by the USGS is continually updated because the activity status of operations changes from year to year and because new methods are being developed to improve data quality by using multiple data sources and cross-checking. Currently, the 2003 data set is being updated to 2004.
The National Land Cover Database (NLCD 2001) project is under the supervision of the United States Geological Survey (USGS) in cooperation with eight other Federal partners. Landsat-7 imagery provides the foundation for the database, which includes the following: (1) normalized Landsat imagery for three time periods per path/row; (2) ancillary data, including a 30 m DEM, slope, aspect, and a positional index; (3) per-pixel estimates of percent imperviousness and percent tree canopy; (4) 21 classes of land-cover data derived from the imagery, ancillary data, and derivatives; (5) classification rules, confidence estimates, and metadata from the land cover classification; and (6) a change detection product. These NLCD 2001 components provide data for many applications in areas such as fire fuels mapping, watershed runoff modeling, wildlife habitat analysis, and hazards modeling. This database employs a mapping zone approach, with 65 zones in the continental United States.
The U.S. Geological Survey (USGS) Center for Earth Resources Observation and Science (EROS) User Services and the NASA Earth System Science, Land Processes Distributed Active Archive Center (LP DAAC) User Services, hosted at USGS EROS in Sioux Falls, South Dakota, will present an introductory session focusing on data specifications, availability, and access for the full suite of NASA LP DAAC and USGS remote sensing data products and services offered by EROS.
We will offer information on the data products available from NASA and the USGS. Discussion of the NASA LP DAAC products begins with a brief overview of the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) instrument on board the Terra satellite and the Moderate Resolution Imaging Spectroradiometer (MODIS) instruments on both the Terra and Aqua platforms. Next, methods for locating, reviewing, and ordering these data through the EOS Data Gateway (EDG) and Data Pool will be presented. We will conclude with an overview of a variety of tools for manipulating data format and projection, subsetting data, and providing data analysis and quality assessment. Examples of data output and data applications will be on display during the workshop. We will also offer data set specifications and information on a variety of other USGS EROS products.
The LP DAAC and USGS User Services are responsible for guiding the development and testing of data search and access tools, ensuring that products conform to the expectations of the user community, testing product and software compatibility, providing technical assistance to users, processing order and DAR requests, and providing outreach to our user community.
The U.S. Geological Survey (USGS) is responsible for providing the Federal Government with objective scientific information to support decisions regarding land management, environmental quality, and economic, energy, and strategic policy. To fulfill this responsibility, the USGS conducts geologic framework studies to periodically assess (1) the Nation’s oil, gas, and coal resources; and (2) resources in the principal petroleum provinces throughout the world.
A primary objective for the implementation of Geographic Information System (GIS) technologies by the Central Energy Resources Team (CERT) is to improve access to maps, assorted data sources, and other geospatial services. Because GIS improves the capability of decision makers to analyze layers of disparate data, the goal is to simplify discovery, access, and use of these geospatial data and services for USGS scientists, as well as for potential outside customers.
Use of GIS technologies by the CERT is enhancing research activities related to project workflow and information access and discovery by providing (1) efficient, centralized data management and data visualization; (2) ease in sharing data and interpretations among project personnel; and (3) dissemination of information and products to customers in an easily usable format.
CERT GIS activities include Internet Map Services and Metadata Services, which are also being leveraged in global networks that provide the infrastructure needed to support the sharing of geographic information. These portals include the National Spatial Data Infrastructure, the Geography Network, and the GeoSpatial One-Stop. Major tasks include thorough treatment of the technical issues related to application deployment, security, and system architecture. Demonstrations of the National Assessment of Oil and Gas (NOGA) Online, Gulf Coast Geology (GCG) Online, Gulf Coast Information Access System, and World Energy Assessment applications illustrate how interactive maps and publication services provide easy access to organized assessment results, geology, and other CERT project data and interpretations.
This poster presentation provides information on how the USGS is using Environmental Systems Research Institute (ESRI) ArcGIS tools to provide its scientists with groups of data layers that they can work with either in ArcMap or via their browsers. The technical aspects of GIS operations are advanced and complex. This poster presentation will not go into all technical details, but will provide sources for such information.
The USGS, in cooperation with the Osage Nation, Department of Energy, and U.S. Environmental Protection Agency, is investigating the effects of hydrocarbons and produced water (brines) on soil and ground and surface water. The study is focused on the natural processes that may be mitigating effects of hydrocarbons and brines at two sites adjacent to Skiatook Lake in Osage County, Oklahoma.
A high-resolution DEM was created using topographic data collected with a dual-frequency kinematic Global Positioning System (GPS) receiver and Geographic Information System (GIS) surface-interpolation techniques. The GPS equipment consisted of a Trimble 4700 base station, transmission antenna, transmitter, GPS rover, and a data logger. The GPS rover antenna was fastened to a wheel-mounted data collection unit. A person pushed the wheel-mounted data collection unit, and the GPS recorded a position every 5 seconds. Two sites (A-Site and B-Site), each about 20 acres, were surveyed over a four-day period. A DEM with a 0.5 meter cell size was created using the Topogrid module in ArcInfo. Contours were created with a 0.2 meter contour interval.
Over 3,100 data points were collected at A-Site and more than 5,000 at B-Site using the wheel-mounted data collection unit. Quality assurance was performed by 3-minute occupations of survey monuments previously established at both sites. Results at A-Site showed that National Map Accuracy Standards at a 1:1,500 scale could be achieved. Results at B-Site showed that National Map Accuracy Standards at a 1:1,000 scale were exceeded. The additional topographic points collected at B-Site allowed for better resolution of topographic features.
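The vertical accuracy test behind such statements can be illustrated with a short script. The following Python sketch is not the authors' code and uses hypothetical monument values; it applies the National Map Accuracy Standards rule that no more than 10 percent of tested elevations may err by more than one-half the contour interval, and also reports the root mean square error of the GPS check points.

# A minimal sketch, not the survey's workflow: NMAS vertical test with
# hypothetical monument check points.

def nmas_vertical_test(checks, contour_interval_m=0.2):
    """checks: list of (surveyed_m, reference_m) elevation pairs."""
    limit = contour_interval_m / 2.0            # half the contour interval
    errors = [abs(s - r) for s, r in checks]
    failures = sum(1 for e in errors if e > limit)
    rmse = (sum(e * e for e in errors) / len(errors)) ** 0.5
    return failures <= 0.10 * len(checks), rmse

# Fabricated 3-minute monument occupations (surveyed vs. published):
checks = [(356.42, 356.40), (357.18, 357.25), (355.90, 355.88)]
passed, rmse = nmas_vertical_test(checks)
print("NMAS vertical test passed:", passed, "RMSE = %.3f m" % rmse)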
Numerous data have been collected by state and Federal agencies in the Fort Cobb Watershed. However, to date, these data have not been compiled in a form that can be adequately queried to evaluate the effects of conservation practices. To aid in the decision-making process, the USGS has developed a Digital Data Atlas for the Fort Cobb Watershed that consists of spatial and environmental data in a 3,200 square mile 8-digit hydrologic unit. The Digital Data Atlas of the Fort Cobb Watershed contains 25 digital-map data sets and environmental data covering parts of Beckham, Caddo, Canadian, Comanche, Custer, Dewey, Grady, Kiowa, and Washita Counties in Oklahoma.
Environmental data included in the atlas were retrieved from the USGS National Water Information System database. Data were collected by the U.S. Geological Survey, U.S. Army Corps of Engineers, U.S. Department of Agriculture, Oklahoma Conservation Commission, and the Oklahoma Department of Health. These data include more than 150,000 measurements of surface-water and ground-water quality, streamflow, and ground-water levels between 1903 and 2005. Additional water-quality and biological data from the USGS Biological Resources Discipline were provided from a biological assessment study of the Fort Cobb Reservoir in 2000–2003.
Existing methods for determining stream slope using Geographic Information System (GIS) hydrologic models and U.S. Geological Survey digital elevation models (DEMs) will be compared with field measurements of stream reaches. The objective is to compare slopes derived from various GIS techniques and identify a method for estimating stream slope from DEMs that best emulates field measurements. Stream slope is required when calculating stream power and is used to measure the erosive capacity of moving water. A stream segment with high slope (and thus high stream power) will often have a significantly greater amount of coarse substrate and provide more heterogeneous habitat than a nearby segment with low stream slope. Although field techniques for measuring stream slope are standardized, GIS methods have the potential to expedite slope estimates over larger areas in a consistent and efficient manner. By estimating slope values at different sites throughout
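One GIS technique likely to be among those compared is simply dividing the DEM-sampled elevation drop across a reach by the reach length traced along the flowline. The minimal Python sketch below illustrates that calculation with hypothetical values; the study's actual methods may differ.

# A sketch of one simple reach-slope estimate: elevation drop between
# DEM-sampled reach endpoints divided by flowline length. All values
# are hypothetical.

def reach_slope(elev_upstream_m, elev_downstream_m, reach_length_m):
    """Returns dimensionless slope (m/m)."""
    return (elev_upstream_m - elev_downstream_m) / reach_length_m

slope = reach_slope(312.5, 308.1, 1850.0)
print("slope = %.5f m/m (%.2f percent)" % (slope, slope * 100.0))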
The National Geospatial Programs Office has made a commitment to its State and local partners to establish National Spatial Data Infrastructure (NSDI) Partnership Offices and place a U.S. Geological Survey (USGS) Geospatial Liaison in each State to support the community in developing components of the NSDI. These offices are managed by the Regional Geospatial Information Offices (RGIOs) and are collocated primarily with Water or
In fulfilling its mission of bringing biological information to the Internet, the U.S. Geological Survey's National Biological Information Infrastructure Program (USGS-NBII), its nodes, and its partners have developed dozens of live ArcIMS HTML Viewer and other mapping applications. While each application may be developed for users working within a particular biological issue or concern, the focus of NBII's Geospatial Interoperability Framework is to give users of each of these individual applications the capability to discover and visualize additional biological content from within that application.
Therefore, NBII is building a Geospatial Interoperability Framework (NBII-GIF), leveraging the OGC specifications to allow for increased functionality for NBII's end users. Using those specifications that provide practical, sustainable results in a production environment, NBII is realizing gains in productivity and data accessibility. The GIF is comprised of a series of services and components designed to give the NBII, its Nodes and Partners, and its users the capability of searching and discovering geospatially referenced biological resources.
NBII has completed the installation of the Phase 1 components, including the core of the GIF, an OGC Catalog Server. NBII has developed a series of toolkits for accessing the OGC Catalog Server and other geospatial services from existing ArcIMS HTML-based applications and for rapid deployment of new interoperable Internet Mapping applications. These toolkits allow NBII Nodes to create interoperable Internet Map Services within days, not the weeks or often months of normal development.
From within the applications built upon these templates, the NBII user can search from a list of available OGC catalog servers (such as NBII's Catalog, the NASA Catalog, or even the Geospatial One-Stop Catalog) or provide any other OGC catalog server URL to find additional content. Users can dynamically add layers from the search of these OGC services and perform identify, zoom-in, zoom-out, show-legend, show-metadata, and other operations on them. This enhancement has provided less sophisticated users of these ArcIMS applications with the tremendous flexibility of performing data analysis not envisioned by the developers of those applications, all without the time-consuming and cumbersome process of loading spatial data onto a server.
The NHDPlus Version 1.0 is an integrated suite of application-ready geospatial datasets that incorporates many of the best features of the National Hydrography Dataset (NHD) and the National Elevation Dataset (NED). The NHDPlus includes a stream network based on the medium-resolution 1:100,000-scale NHD; improved networking and naming; and “value-added attributes”. NHDPlus also includes elevation-derived catchments that were produced using a drainage enforcement technique first broadly applied for the New England SPARROW model and thus dubbed “The New-England Method”. This technique involves enforcing the 1:100,000-scale NHD drainage network by burning it into the NED and using the national Watershed Boundary Dataset (WBD), when available, to enforce hydrologic divides. The resulting modified digital elevation model is used to produce catchments (areas that drain directly to each NHD flowline) that closely conform to the WBD. Over a two-year period (2003-2004), an interdisciplinary team from the U.S. Geological Survey (USGS), U.S. Environmental Protection Agency (USEPA), and contractors found that this method produced the best quality catchments feasible in a relatively short time frame. In addition to catchment areas, land-cover categories, mean annual precipitation, and mean annual temperature have been computed for each catchment. These catchment attributes have been accumulated using the NHDPlus flow network to compute cumulative attributes for each flowline in the network. The cumulative attributes have been used to compute estimates of mean annual streamflow volume and mean annual stream velocity for each flowline. These integrated geospatial datasets constitute a national geospatial surface-water framework: the NHDPlus.
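The accumulation step can be pictured as a recursive walk up the flow network: each flowline's cumulative value is its catchment's local value plus the cumulative values of its immediate upstream flowlines. The Python sketch below illustrates the idea with a four-flowline toy network; the IDs, areas, and table layout are hypothetical, not the NHDPlus production design.

# A toy illustration of attribute accumulation along a flow network.
# 'upstream' maps each flowline to its immediate upstream flowlines;
# all IDs and areas are hypothetical.

from functools import lru_cache

local_area = {1: 4.2, 2: 3.1, 3: 5.5, 4: 2.0}   # sq km per catchment
upstream = {1: [], 2: [], 3: [1, 2], 4: [3]}    # 1 and 2 feed 3; 3 feeds 4

@lru_cache(maxsize=None)
def cumulative_area(flowline_id):
    """Local catchment area plus everything draining in from upstream."""
    return local_area[flowline_id] + sum(
        cumulative_area(up) for up in upstream[flowline_id])

for fid in sorted(local_area):
    print(fid, round(cumulative_area(fid), 1))
# flowline 4 accumulates 4.2 + 3.1 + 5.5 + 2.0 = 14.8 sq km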
U.S. Geological Survey (USGS) geospatial data are available from many sources inside and outside the USGS. The variety of Web sites and Web services delivering these data can involve many different formats, scales, and spatial extents and may present a challenge to users of Geographic Information Systems (GIS) technology. This workshop will provide GIS users with the practical skills and information they need to locate and access USGS geospatial data sets. Some of the most important data distribution sites will be highlighted, including the National Map Seamless Server, the National Atlas of the United States, the National Hydrography Dataset portal, Geospatial One-Stop, the USGS Publications Warehouse, and the GeoCommunity portal. Workshop attendees should be familiar with ESRI ArcGIS software so they can fully participate in a guided hands-on exercise in locating, downloading, and analyzing raster and vector geospatial data using ArcGIS Desktop version 9.1.
GIS professionals require a system that allows the automation of their workflows within a GIS in order to accomplish large tasks and reproduce their work. The combination of several hundred tools and the geoprocessing framework within ArcGIS fulfills this requirement. This session will explore the tools that may be used to solve hundreds of tasks in categories such as analysis and data management. It will also show how to create workflows using ModelBuilder, an easy-to-use visual tool for creating custom workflows, and demonstrate the Python scripting environment, which gives users a powerful and flexible way to accomplish their scientific computing needs.
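As a flavor of what such a script looks like, the sketch below drives two standard geoprocessing tools from Python. It assumes an ArcGIS 9.x installation with the pywin32 package available; the workspace path and shapefile names are hypothetical.

# A hedged sketch of scripted geoprocessing in ArcGIS 9.x via COM
# dispatch; paths and dataset names are made up for illustration.

import win32com.client

gp = win32com.client.Dispatch("esriGeoprocessing.GpDispatch.1")
gp.Workspace = "C:/workspace/demo"

# Clip streams to the study area, then buffer the result by 100 meters.
gp.Clip_analysis("streams.shp", "study_area.shp", "streams_clip.shp")
gp.Buffer_analysis("streams_clip.shp", "streams_buffer.shp", "100 Meters")
print(gp.GetMessages())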
ArcGIS provides a framework for the creation, integration, and sharing of spatial models and modeling processes. The first half of the session will present the framework, techniques and guidelines for the integration and sharing of spatial models. The second half will discuss how to use tools in ModelBuilder and scripts to do process and simulation modeling and also discuss what techniques can be incorporated into the modeling process to perform sensitivity and error analysis on model results.
The North Carolina Studies Act of 2004, Senate Bill 1152, called for a plan to improve the mapping and digital representation of surface waters in North Carolina.
In addition, the Hurricane Recovery Act of 2005 included funding to implement the plan for 19 counties in western North Carolina.
Multiple USGS disciplines and regions are involved in this effort, and this presentation will provide an overview of the plan and participation by the USGS for developing the local resolution NHD product. In addition, current techniques for generating the line work and geodatabase design will be outlined.
Availability of geospatial data to first responders is critical across jurisdictional boundaries. The Project Homeland Colorado Pilot consists of the USGS, ESRI, and the State of Colorado.
The USGS has realigned the geospatial programs for which it has a leadership responsibility into a National Geospatial Programs Office (NGPO) to serve the needs and interests of the geospatial community throughout the Nation. With the creation of the NGPO, the essential components of delivering the National Spatial Data Infrastructure (NSDI) and capitalizing on the power of place will be managed as a unified portfolio that benefits the entire geospatial community. The emphasis of the NGPO will be to engage partners throughout the geospatial community in its planning and in ensuring that its unified portfolio meets the needs of those on the landscape.
The NGPO, through the Geospatial Liaisons, is developing partnerships for collection of and access to digital orthophotos. Digital orthophoto imagery is an essential framework data layer, and cooperative efforts provide significant savings for all parties involved in acquiring and accessing it. This presentation will discuss some of the different coordination models that are being used in several states including
Understanding the latest developments in GIScience education can help USGS employees seek the best resources in their own professional development and more effectively partner with other organizations. Education and research have always been inextricably linked throughout the history of the USGS, beginning with John Wesley Powell's tenure as a public school teacher. Educational partnerships can lead to data and research partnerships with universities, professional societies, nonprofit organizations, and private enterprise. With new web mapping services appearing daily, online courses, tools such as Google Earth and NASA WorldWind, new textbooks, and government studies and funding, the field of GIScience education is rapidly expanding and changing. In 2004, the U.S. Department of Labor included geotechnologies in their list of the three fields that promised to expand the most during the 21st Century. In 2006, the National Academy of Sciences issued a report entitled "Learning to Think Spatially: GIS Across the Curriculum." This report provides another example of the increasing ties between GIS research and education and gives additional opportunities for partnerships. This session's presentation of these new resources and developments may enable USGS employees to advance their professional growth and partnerships in the field of GIS.
In September 2002, a Memorandum of Agreement (MOA) was signed between U.S. Northern Command (NORTHCOM) and the U.S. Geological Survey (USGS). The agreement formalizes coordination and information-sharing capabilities between NORTHCOM and the USGS. Through this MOA, the USGS provides personnel to NORTHCOM who participate in day-to-day operations, real-world events, training/exercises, and working groups at the command.
As part of the NORTHCOM Defense Support of Civil Authorities (DSCA) mission, NORTHCOM is interested in all hazards, man-made and natural. As such, USGS science, remote sensing, and GIS assist NORTHCOM and other Federal agencies in developing situational awareness and obtaining natural hazard education and support. USGS scientists and other personnel have supported NORTHCOM in events such as the volcanic eruptions of
The National Hydrography Dataset (NHD) concept revolves around a nationwide partnership to produce and maintain a single source of high resolution (1:24,000) and local resolution (1:5,000) scale hydrography data. Initial data integration and creation have been a huge success; with high resolution data 80 percent complete, now is the time to carry that success into data maintenance. With the NHD in the hands of so many active users, the sophisticated applications of these users have created demand for an even greater level of data capability. The USGS is working with agencies and groups that are interested in becoming NHD data stewards. Common needs and interests offer opportunities for partnerships to collect, maintain, access, and use basic spatial data among Federal agencies and with other public organizations, notably state and regional organizations. The most direct benefit of shared maintenance is the ability to know about changes on the landscape and to receive spatial data that faithfully represent those changes.
This is a discussion of the importance of geospatial systems as a critical component of enterprise information technology in the context of fiscal year 2007 budget direction, evolving Office of Management and Budget (OMB) policy, and Federal Enterprise Architecture (FEA). As organizations move towards compliance with these new policies and guidance, geospatial requirements must be considered as part of the overall architecture and evaluated in an integrated fashion with all other mission information systems. Service-Oriented Architecture (SOA) is discussed, highlighting the benefits and agility of this architecture and the critical importance of being able to clearly articulate the geospatial lines of business as called for in the President's budget.
This session will present the software products and services of the company Data East. This session will give the participants some general ideas about the company: who they are, what they do and what they want to contribute to the USGS Workshop. Then a more detailed overview of Data East’s software products will be presented, including XTools Pro, Personal Internet Map Server, TAB Reader for ArcGIS, and Smart Search for ArcGIS.
There are numerous challenges associated with DEM extraction from Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) sensor data in the mountainous terrain of Afghanistan.
Previous studies demonstrate the ability to derive 30 m DEMs from ASTER for a variety of applications. Few studies focus on regional-scale projects where multiple ASTER DEMs are edge-matched and mosaicked, and even fewer have assessed DEMs developed in high mountainous terrain. Our study presents the challenges and solutions associated with regional-scale DEM production in high mountainous terrain, and shows how terrain slope and the pointing angle of ASTER's VNIR sensor affect the output elevation product.
For this study, 86 absolute DEMs were created and extracted using PCI Geomatica's OrthoEngine software. The resultant DEMs exhibited common errors from the stereo autocorrelation process in image areas corresponding to cloudy and snow-covered areas, lakes, steep slopes, and southeast-facing slopes. As a result of these features, poorly correlated elevation values produced erroneous holes, large pits, and spikes in the initial elevation model output.
Preliminary analysis was performed on the slope values of the 90 m SRTM data and the ASTER scene metadata. The results indicate that erroneous elevation values corresponded with steep slopes and with scenes collected at high off-nadir pointing angles. To address these errors, multiple scenes were acquired at low off-nadir pointing angles, and overlapping DEMs were produced and mosaicked to fill void areas. In addition, a progressive morphological filter was applied as a post-processing step to remove pits and spikes. These post-processed and mosaicked DEMs produce more accurate and visually appealing elevation models for landform classification, geologic structure analysis, and natural resource assessment applications.
High-resolution digital elevation models (DEMs) can be derived from stereo aerial photographs or satellite imagery by utilizing digital photogrammetric and stereo autocorrelation techniques. Stereo autocorrelation algorithms measure the amount of parallax and calculate elevation values on a pixel-by-pixel basis for all pixels matched in a set of stereo images. Poorly matched pixel values result in erroneous or failed elevation values, which are exhibited in the output DEM as pits, spikes, and void areas.
To improve the quality of output DEMs, most software routines employ a low-pass filtering technique to smooth elevation values. This technique reassigns a mean elevation value within a 3x3, 5x5, or 7x7 pixel window around each cell. Calculating a mean elevation in a window containing a large pit or spike biases the values of all cells within that window and reduces, but does not eliminate, the erroneous value. A progressive morphological filter was therefore developed to target and filter only erroneous pit and spike values in raw DEM data produced by a stereo autocorrelation process.
The progressive morphological filter iteratively compares individual raw elevation values to a set of focal neighborhood statistics and a user-defined threshold value. Elevation differences between the raw value and the neighborhood statistics are compared to the threshold value. Raw values that exceed the threshold are replaced with a focal minimum, focal maximum, or focal median value based on the characteristics of the elevation value in question. The filter progresses through four stages whereby elevation values are compared to increasingly smaller neighborhoods and a progressively reduced threshold value. The result is that only elevation values exceeding the defined parameters are replaced; all other values remain unchanged, and the overall output quality is improved without degrading the high-resolution fidelity of the DEM.
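As a rough illustration of the approach (not the authors' implementation), the Python sketch below runs four passes with shrinking windows and tightening thresholds, replacing only the cells that differ from a focal median by more than the threshold. The window sizes and thresholds are hypothetical, and the actual filter also chooses among focal minimum, maximum, and median.

# A simplified numpy/scipy sketch of progressive morphological
# filtering; window sizes and thresholds are hypothetical.

import numpy as np
from scipy import ndimage

def progressive_filter(dem, stages=((9, 20.0), (7, 12.0), (5, 6.0), (3, 3.0))):
    """dem: 2-D elevation array; stages: (window size, threshold m) pairs."""
    out = dem.astype(float).copy()
    for window, threshold in stages:    # smaller windows, tighter thresholds
        focal_median = ndimage.median_filter(out, size=window)
        outliers = np.abs(out - focal_median) > threshold
        out[outliers] = focal_median[outliers]   # replace pits/spikes only
    return out

dem = np.full((50, 50), 1200.0)
dem[10, 10], dem[40, 25] = 3500.0, -200.0        # synthetic spike and pit
filtered = progressive_filter(dem)
print(filtered[10, 10], filtered[40, 25])        # both pulled back to ~1200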
The U.S. Geological Survey (USGS) Geography Discipline, in collaboration with the National Weather Service, the Environmental Protection Agency, and the USGS Water Discipline, generated the initial version of the Elevation Derivatives for National Applications (EDNA) database in 2000. A snapshot of the National Elevation Dataset (NED) was processed to produce an elevation surface with filled artificial depressions, flow direction, flow accumulation, slope, aspect, compound topographic index, synthetic reaches and their catchments, and shaded relief. Over the past five years, EDNA has been used in many applications, including a national low-head dam power assessment, a national flash flood warning system, and objective national sampling designs. These applications have facilitated the addition of precipitation, temperature, land cover, and other variables into the EDNA database as flow-accumulated variables. EDNA Web-based tools have been developed to visualize the data and to compute and download watersheds for any point in the conterminous United States.
The creation of a single, detailed cartographic database supporting map representations at multiple scales and for multiple purposes continues to challenge the GIS discipline. National mapping agencies compile geospatial data to meet standardized ‘anchor’ map scale and cartographic design specifications. These specifications reflect historic conventions for data capture, developed within the constraints of each agency's mission and its intention to balance data usability with data production costs. Digital Line Graph (DLG) data form a vector base mapping framework for many GIS applications across a continuum of scales, with data concentrated around 1:24,000, 1:100,000, and 1:2,000,000. A gap in the continuum occurs roughly between 1:300,000 and 1:700,000, within which mapping experiments show that geoprocessing and/or symbol redesign of the DLG data are not sufficient to produce a full range of mapping products. To generate a fully operational multi-resolution vector database (MRDB) for base map data production, that gap needs to be filled.
This paper explores a relational database experiment fusing DLG and VMAP (DIGEST) data schemas for a geographical footprint in southern
One of the goals of the USGS Coastal and Marine Program is a national assessment of coastal change hazards, which includes coastal vulnerability to extreme storms and hurricanes. By quantifying the magnitudes of storm-induced coastal change and identifying the processes responsible for this change, we work towards the capability to predict beach response to an approaching storm. Over the past ten years, the USGS and our partners at NASA and the U.S. Army Corps of Engineers (USACE) have acquired pre- and post-storm laser altimetry (LIDAR) surveys that provide an unprecedented topographic data set for examining storm impacts. Together with aerial video and still photography, the LIDAR surveys are used to characterize the nature, magnitude, and spatial variability of coastal change resulting from hurricanes.
Hurricane Katrina, which made landfall as a category 4 storm in Plaquemines Parish, Louisiana, on August 29, 2005, resulted in scales of hurricane-induced coastal change that are rarely seen. On August 31 and September 1, 2005, the USGS, NASA, USACE, and the
The Maryland Department of Natural Resources (MD-DNR) Coastal Zone Management group requested the assistance of the USGS in preparing a simple modeling dataset to support local planning and first responders in coastal flooding scenarios for the 2005 hurricane season. 1:1,200-scale orthoimagery and 18 cm vertical resolution LIDAR provided by MD-DNR were combined with Federal Emergency Management Agency (FEMA) Q3 delineations and NOAA average storm surge heights for four categories of hurricanes. The 2004 orthoimage data and 2003 LIDAR data were geo-registered, providing a data assemblage with which local users could predict basic flood levels.
The Federal Emergency Management Agency has developed HAZUS (Hazards U.S.), a standardized, GIS-based, nationally applicable natural hazards loss estimation model. HAZUS-MH is a powerful risk assessment software program for analyzing potential losses from floods, hurricane winds, and earthquakes. In HAZUS-MH, current scientific and engineering knowledge is coupled with the latest Geographic Information Systems (GIS) technology to produce estimates of hazard-related damage before, or after, a disaster occurs. HAZUS-MH takes into account various impacts of a hazard event, such as physical damage (damage to residential and commercial buildings, schools, critical facilities, and infrastructure); economic loss (lost jobs, business interruptions, repair and reconstruction costs); and social impacts (impacts to people, including requirements for shelters and medical aid). The presentation will concentrate on describing the more than 200 data layers that were obtained and developed to support the loss estimation methodology.
This talk will present a historical perspective of GIS development in the USGS, with emphasis on GIS use in water resources. It is intended to show how some of the GIS activities and practices in the USGS came to be the way they are. Some of the early efforts in digital mapping at the USGS set the stage for modern GIS development by providing a critical base of maps. Within the (then) Water Resources Division, however, GIS came to be seen as an analytical tool. A training program focused on fundamental GIS principles helped to develop a cadre of scientists who applied GIS to ground-water modeling and other water studies. Although growing computing power played a role, key advances were almost always marked by an intellectual insight into a scientific problem. The lesson is that brain power still counts!
This 2-hour workshop is intended to provide Geographic Information System practitioners and resource modelers with an opportunity to learn about NHDPlus for use in their water resources applications. NHDPlus is a suite of geospatial products that build upon and extend the capabilities of the National Hydrography Dataset (NHD) by integrating the NHD with the National Elevation Dataset and the Watershed Boundary Dataset (where it exists). NHDPlus includes improved NHD names and networking; value-added attributes (such as stream order) that enable advanced query, analysis and display; elevation-derived catchments that integrate the land surface with the network, associated flow direction and accumulation grids; and annual stream-flow and velocity estimates for use in SPARROW and other pollutant models.
One of the most important kinds of spatial analysis is analysis of trends and patterns in geospatial data. A common application is the development of a continuous surface from samples of that surface in the form of point locations or contours. Different techniques address this problem from both deterministic and statistical approaches. This workshop will discuss several methods of surface interpolation and demonstrate how they can be applied using ArcGIS software.
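As one concrete example of a deterministic method the workshop could cover, the short Python sketch below implements inverse-distance weighting, in which each interpolated value is a weighted mean of the sample points with weights falling off as 1/distance^power. The sample points and query location are hypothetical.

# A minimal inverse-distance-weighting (IDW) sketch; sample points and
# the query location are hypothetical.

def idw(x, y, samples, power=2.0):
    """samples: list of (sx, sy, value); returns the interpolated value."""
    num = den = 0.0
    for sx, sy, value in samples:
        d2 = (x - sx) ** 2 + (y - sy) ** 2
        if d2 == 0.0:
            return value                 # query sits exactly on a sample
        weight = 1.0 / d2 ** (power / 2.0)
        num += weight * value
        den += weight
    return num / den

samples = [(0.0, 0.0, 10.0), (10.0, 0.0, 14.0), (0.0, 10.0, 12.0)]
print("%.2f" % idw(4.0, 3.0, samples))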
This session will present interactive 3-D visualization of GIS data using the 3D Analyst extension's ArcGlobe and ArcScene applications. Topics will include data preparation guidelines, display optimization techniques, best practices and recommended workflows for building interactive 3-D documents and animations, and general tips and tricks to achieve maximum usability. This session will also cover new 3-D functionality that will be released with ArcGIS 9.2.
Information about land use and land cover change is important in addressing ecosystem health and for future land use planning, but data on a national scale are scarce. The Land Cover Trends Project, sponsored by the U.S. Geological Survey's Geographic Analysis and Monitoring (GAM) program, is a national effort that describes the rates, driving forces, and consequences of land cover change on an ecoregion basis between 1973 and 2000. Results will be presented for the Puget Lowland, one of the 84 ecoregions determined by the U.S. Environmental Protection Agency's Level III ecoregions of the conterminous United States.
Stream gages are highly significant features in the analysis of a stream network. Fortunately, two GIS databases can be brought together to make the analysis of stream gages a feasible and efficient process. The National Hydrography Dataset (NHD) provides a database of the nation's surface water that includes flow modeling and linear addressing capabilities. The National Water Information System (NWIS) provides a database of stream gage locations and characteristics. By snapping and addressing stream gages to the NHD, the gages can be located on the appropriate flow path, greatly enhancing their value to GIS analysis by allowing their detection in upstream or downstream searches. Using linear addresses with the NHD logical flow table, it is possible to create sophisticated queries to search and analyze unique scenarios. Experience gained in indexing the National Inventory of Dams (NID) demonstrates the capabilities and limitations of this approach. Examples of query techniques also demonstrate the level of sophistication that can be achieved.
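The upstream search such addressing enables can be sketched as a breadth-first walk through a flow table. The toy Python example below, with hypothetical reach IDs and gage numbers rather than the actual NHD schema, finds every gage upstream of a starting reach.

# A toy upstream trace over a logical flow table; reach IDs and gage
# numbers are hypothetical.

from collections import deque

upstream = {101: [102, 103], 102: [104], 103: [], 104: []}
gage_on_reach = {103: "07325800", 104: "07325850"}   # reach -> gage ID

def gages_upstream(start_reach):
    found, queue, seen = [], deque([start_reach]), {start_reach}
    while queue:
        reach = queue.popleft()
        if reach != start_reach and reach in gage_on_reach:
            found.append(gage_on_reach[reach])
        for up in upstream.get(reach, []):
            if up not in seen:
                seen.add(up)
                queue.append(up)
    return found

print(gages_upstream(101))   # both gages lie upstream of reach 101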
Watershed characteristics are an important part of evaluating conditions monitored by water-quality and streamflow networks. These conditions can be related to the natural and human factors that affect the deposition, transport, and delivery of nutrients and sediment to estuaries like the
National Hydrography Dataset Plus (NHDPlus) linework and associated subwatersheds (catchments) are being used in the U.S. Geological Survey's (USGS) National Water-Quality Assessment (NAWQA) program to accelerate the completion of tasks that previously have been extremely time-intensive, such as drainage area delineation and site matching for water-quality sites. The lines from the NHDPlus represent stream networks and are corrected so that all point upstream, and a one-to-one relation exists between each stream segment and the subwatershed that drains to it. In basin delineation, the outflow point (usually a streamflow gage) is used to initiate a trace of all upstream reaches, collecting the catchments as the stream network is traced. This process is being used to automate the delineation of hundreds of basins much more quickly and accurately than using elevation data alone.
The NHDPlus is extremely valuable in matching drainage characteristics of two separate data sets. The study area includes a large number of water-quality monitoring sites without an associated streamflow gage. Drainage area ratio was used as the criterion for goodness of match between a monitoring site and a streamflow gage; a pairing with a ratio between 0.75 and 1.25 was considered an acceptable match. The trace command was automated to find the closest NHD stream to the ungaged site, collect the associated catchments, and calculate the total drainage area. If a streamflow gage was within the upstream or downstream catchment areas, drainage areas were compared to see if they were within the threshold. The program creates screen snapshots of each site and a file listing potential matches. The NHDPlus enables accurate basin delineation and site matching more consistent with national data.
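The drainage-area-ratio screen reduces to a one-line test; the Python sketch below shows it with hypothetical site and gage drainage areas.

# The 0.75-1.25 drainage-area-ratio screen described above, applied to
# hypothetical drainage areas.

def acceptable_pair(site_area_km2, gage_area_km2, low=0.75, high=1.25):
    ratio = site_area_km2 / gage_area_km2
    return low <= ratio <= high, ratio

ok, ratio = acceptable_pair(412.0, 389.5)
print("match" if ok else "no match", "(ratio = %.2f)" % ratio)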
GIS is often used as an important tool in developing ground-water vulnerability models and corresponding maps, which are valuable for ground-water resource management and planning. In this study, which is an expansion of a pilot study, ArcGIS Desktop and Workstation were used to extract geospatial data from various large data sets for input to a logistic regression model of ground-water vulnerability and to produce a corresponding vulnerability map. The map illustrates the predicted probability that recently recharged (defined as less than 50 years) ground water of the High Plains aquifer is contaminated by non-point source nitrate. Spatial data from 31 individual vector and raster layers were extracted for each of 6,946 well locations throughout the study area. These layers included information about saturated thickness, depth to water, precipitation, percent irrigated/nonirrigated/agricultural land, nitrogen applications, soil characteristics, lithology of the unsaturated zone, playa lakes, and water use. Extractions for categorical data and certain continuous data sets were performed using a series of identity overlays, directly from the layer at the location of each well. Where information needed to be related to an upgradient contributing area rather than to the well point itself, 90-degree sectors of varying radii, determined by hydraulic conductivity, were created upgradient from each well, and variables were inventoried for these sector areas using both vector-union techniques and raster map-algebra techniques. The extracted data were used as input for logistic regression analyses to determine which of the variables (layers) or combinations of variables were significantly correlated with observed water-quality conditions and would be used in a model predicting the probability of nitrate concentrations greater than 4 milligrams per liter (as N) in ground water. Five variables were considered significant (depth to water, organic content of soils, amount of irrigated/nonirrigated land, and the amount of clay in the unsaturated zone) in the final two models that were developed to represent different regions of the study area. The appropriate GIS layers were converted to raster data sets in order to use the map-algebra capabilities of ArcGIS. The two model equations and the coefficients for each layer were fed back into ArcGIS and, using map algebra, the probability surface was calculated and then easily visualized across the entire study area.
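The final map-algebra step amounts to evaluating the fitted logistic equation cell by cell across the raster layers. The numpy sketch below shows the shape of that computation with two made-up layers and coefficients; the study's actual models used five significant variables and region-specific coefficients.

# A hedged sketch of producing a probability surface from a fitted
# logistic model; layers and coefficients here are fabricated.

import numpy as np

def probability_surface(layers, coefs, intercept):
    """layers: list of co-registered 2-D arrays, one per model variable."""
    z = np.full(layers[0].shape, intercept, dtype=float)
    for layer, coef in zip(layers, coefs):
        z += coef * layer
    return 1.0 / (1.0 + np.exp(-z))      # logistic transform to probability

depth_to_water_m = np.random.uniform(5.0, 60.0, (4, 4))
soil_organic_pct = np.random.uniform(0.5, 4.0, (4, 4))
p_nitrate = probability_surface([depth_to_water_m, soil_organic_pct],
                                coefs=[-0.05, -0.30], intercept=1.2)
print(np.round(p_nitrate, 2))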
Researchers and managers have begun to use spatially explicit tools for visualizing and understanding the impacts of human and natural disturbances on forest ecosystems at a landscape scale. Spatial modeling with a GIS provides a visual, intuitive interface that makes it possible for scientists to (1) present different management options and (2) demonstrate how these scenarios play out in a dynamic landscape. The landscape modeling approach allows scientists and managers to provide feedback both in the development of alternative management scenarios and in the refinement of the spatial models.
The Landscape Scenario Analysis Project, initiated and funded by the Bureau of Land Management (BLM), is facilitating interactions between scientists from the USGS Forest and Rangeland Ecosystem Science Center and BLM managers regarding broad-scale implications of vegetation management for forest ecosystems in western Oregon. Landscape scenarios have been developed for BLM lands in the
Spatially explicit modeling is performed with ESSA Technologies’ Tool for Exploratory Landscape Scenario Analyses/Vegetation Dynamics Development Tool (TELSA/VDDT), a software package developed specifically for modeling interactions between natural disturbances and forest management. The spatial module in TELSA integrates directly with a geographical information system (ArcView 3.x) and provides the functionality to build models using existing spatial data layers such as forest stand characteristics, planning zones, and disturbance-prone areas. The flexibility of TELSA/VDDT makes it possible to develop model structures suitable for different ecological settings and management applications. The TELSA/VDDT software package also contains numerous mechanisms to control management and disturbance at landscape scales based on mapped characteristics (e.g., fire regime, rain-on-snow zone, Northern spotted owl home range, aquatic large wood source area). Landscape scenarios can be represented at a very coarse resolution initially and then refined over time as skills and insight are acquired.
Maps and graphs of temporal trends generated by the models have helped managers visualize the effects of natural and management disturbances on forest landscape and stand structure, and were specifically used in BLM planning and consultation processes. In particular, results from the model were used to help assess cumulative effects from timber harvest since data were compiled across ownerships and land-use allocations.
The U.S. Geological Survey’s (USGS) Comprehensive Urban Ecosystems Studies (CUES) initiative is a component of The National Map and is an integrated venture of Geography's Geographic Analysis and Monitoring, Land Remote Sensing, and Cooperative Topographic Mapping Programs, and other USGS, Department of the Interior, Federal, State, and local partners. CUES utilizes USGS data and science expertise to provide an Internet environment for decision support tools and other practical applications to help meet critical needs facing the Nation's urban areas. Issues addressed by CUES cover a range of environmental concerns.
There are currently four East Coast urban areas participating in the CUES initiative, including
The newly released MapGuide Open Source application, a collaboration between Autodesk and
This session is an introduction to the ArcGIS Geostatistical Analyst extension, covering exploratory spatial data analysis and variography, with an overview of the available interpolation options and some discussion of new features coming in version 9.2.
This 2-hour workshop is intended to provide Geographic Information System practitioners and resource modelers with an opportunity to learn about NHDPlus for use in their water resources applications. NHDPlus is a suite of geospatial products that build upon and extend the capabilities of the National Hydrography Dataset (NHD) by integrating the NHD with the National Elevation Dataset and the Watershed Boundary Dataset (where it exists). NHDPlus includes improved NHD names and networking; value-added attributes (such as stream order) that enable advanced query, analysis and display; elevation-derived catchments that integrate the land surface with the network, associated flow direction and accumulation grids; and annual stream-flow and velocity estimates for use in SPARROW and other pollutant models.
One of the most important kinds of spatial analysis is analysis of trends and patterns in geospatial data. A common application is the development of a continuous surface from samples of that surface in the form of point locations or contours. Different techniques address this problem from both deterministic and statistical approaches. This workshop will discuss several methods of surface interpolation and demonstrate how they can be applied using ArcGIS software.
This session will present interactive 3-D visualization of GIS data using the 3D Analyst extension's ArcGlobe and ArcScene applications. Topics will include data preparation guidelines, display optimization techniques, best practices and recommended workflows for building interactive 3-D documents and animations, and general tips and tricks to achieve maximum usability. This session will also cover new 3-D functionality that will be released with ArcGIS 9.2.