Novel Data in Recreation Monitoring—Summary Proceedings from Interagency Workshops in 2019 and 2023

Scientific Investigations Report 2024-5013
Land Management Research Program
Prepared in cooperation with the U.S. Department of the Interior Office of Policy Analysis, U.S. Department of Agriculture Forest Service, and University of Washington

Acknowledgments

The authors would like to thank internal reviewers Drs. Monika Derrien (U.S. Department of Agriculture Forest Service) and Danielle Schwarzmann (National Oceanic and Atmospheric Administration) for feedback on this report. We also thank all workshop attendees for sharing their knowledge, and we give special thanks to everyone who gave presentations at the workshops.

Abstract

Two interagency workshops were held in 2019 and 2023 in Fort Collins, Colorado, to discuss the use of novel data in recreation monitoring. During the workshops, the phrase “novel data in recreation monitoring” was primarily used to refer to data from social media, mobile device applications, and other online secondary sources. The goals of these workshops were to share information across agencies and researchers on the state of the science and applications for using novel data and to collectively discuss best practices for using novel data for understanding recreation on public lands and waters. Presentations during the workshops focused on use-cases, current applications, and the current state of research (as of the time of the workshops) for using novel data in recreation monitoring. Group discussions during the workshops focused on the strengths and limitations of novel data sources, potential approaches for integrating new and emerging data sources and methods with traditional approaches, and research and management needs. This report provides the proceedings of the 2019 and 2023 interagency workshops on novel data in recreation monitoring.

Background

In 2019 and 2023 in Fort Collins, Colorado, the U.S. Geological Survey (USGS) Social and Economic Analysis Branch, U.S. Department of the Interior (DOI) Office of Policy Analysis (PPA), U.S. Department of Agriculture Forest Service (FS) Pacific Northwest Research Station, and University of Washington Outdoor Recreation and Data Lab organized Federal interagency workshops on using novel data for recreation monitoring. These meetings followed an interagency meeting in 2017 focused on Federal agency recreation monitoring systems and have provided an opportunity for the developing recreation monitoring community to discuss issues of interest, in particular the appropriate use of emerging data sources for understanding the amount and character of recreation use on public lands. During the workshops, and throughout this report, the phrase “novel data in recreation monitoring” is primarily used to refer to data from social media, mobile device applications, and other online secondary sources (for example, reviews or trip reports). Other sources of novel data, including satellite imagery and community science, were briefly discussed during the workshops but were not a focus of the discussions.

Discussion during the workshops focused on the state of the science of using novel data in recreation monitoring and research, strengths and weaknesses of novel data sources, potential approaches for integrating new and emerging data sources and methods with traditional approaches, opportunities for coordination and partnership in testing and using novel data, and research and management needs. Across the two meetings, participants represented Federal agency staff with responsibility for recreation monitoring programs, Federal and academic researchers studying the use of novel data, nonprofit organizations engaged in visitor monitoring and research, and private sector technical service providers engaged in visitor monitoring and research using emerging data sources.

This report provides a summary of presentations and discussions that took place during the 2019 and 2023 interagency workshops on novel data in recreation monitoring. We begin with the proceedings from the 2023 workshop because this provides the most recent and up-to-date information from the two workshops, and after this, we provide proceedings from the 2019 workshop for additional context. Because the 2017 workshop was not focused on the use of novel data, we do not include information on the 2017 workshop in the body of this report. However, additional information on the 2017 workshop can be found in appendix 1.

2023 Novel Data in Recreation Monitoring Workshop Summary

The DOI’s PPA, USGS Social and Economic Analysis Branch, FS Pacific Northwest Research Station, and University of Washington Outdoor Recreation and Data Lab hosted a followup to the 2017 and 2019 interagency workshops on recreation monitoring. This workshop was held on February 1 and 2, 2023, at the USGS Fort Collins Science Center in Colorado. Participating Federal agencies consisted of the FS, the Department of Commerce’s National Oceanic and Atmospheric Administration (NOAA), the Environmental Protection Agency (EPA), the DOI’s PPA, and the following DOI Bureaus: the Bureau of Land Management (BLM), the National Park Service (NPS), the U.S. Fish and Wildlife Service (FWS), and the USGS. Participating universities consisted of the University of Washington and Clemson University. A full list of participants can be found in appendix 2.

The purpose of this workshop was to convene researchers and practitioners with experience and interest in the use of emerging data sources (such as social media and mobile phone locations) in recreation monitoring. The goals of this workshop were to share knowledge across agencies on the state of the science and applications and collectively develop best practices to advance the use of novel data for understanding the amount and character of recreation on public lands and waters.

The first day of the workshop started with a review of the main takeaways from the past meetings and then consisted mostly of presentations. The first of these was a “State of the Science” presentation to get all participants up to date on the current state of research using novel data for recreation monitoring. Agency representatives then gave “lightning-round” talks about how they are using novel data or hoping to use them in the future. In the afternoon, there was a breakout session to discuss outstanding examples or areas where an example is needed and then four use-case presentations from researchers who have evaluated the use of novel datasets. The second day of the workshop included breakout sessions to discuss concerns about using novel data and future research needs. There was also a panel discussion on working with mobile device data and a working session to discuss directions and needs for an interagency best practices document. Summaries of each session are detailed in the next 10 sections. The full agenda for this workshop can be found in appendix 3.

First Session—Recap of the Main Takeaways from Past Meetings (2017 and 2019)

Christian Crowley (PPA) reviewed the main takeaways from the past two meetings on novel data in recreation monitoring. The main takeaways from the 2017 meeting were as follows: (1) there was variation across traditional approaches that agencies use to estimate recreational visitation and the economic benefits of visitation; (2) there was a desire to increase consistency among agencies; and (3) there was interest in exploring novel data sources, including social media, mobile device data, and remote sensing technologies. The takeaways from the 2019 meeting were as follows: (1) there is a mature research effort underway that is exploring the use of social media data; (2) scientific researchers concluded that user-generated data (for example, social media) are reliably correlated with visitation; and (3) these datasets are worth continuing to explore, and it is helpful to have continued discussions as a community of research and practice.

“State of the Science” Presentation

The 2023 workshop began with a presentation by Spencer Wood (University of Washington), Emily Wilkins (USGS), and Eric White (FS) on the current state of the science regarding the use of novel data for recreation monitoring. The goals of this session were to (1) build on existing knowledge and experience, (2) increase familiarity with traditional and novel data sources, (3) outline known and potential issues with user-generated data, (4) describe empirical evaluations of data and methods, and (5) establish current best practices.

The presentation began with an overview of definitions and concepts and then described the state of the research on using novel data to estimate visitation, visitor characteristics, and information about visits. Definitions, concepts, and examples used during this presentation and workshop include the following.

Novel data.—Examples of novel data include geolocated social media, mobile phone use, and community science submissions. These are sometimes referred to as “volunteered data” or “user-generated data.”

Useful data.—Useful data are data that are accurate, understandable, accessible, ethical, and help answer a relevant question.

Social media.—Data from internet-based sites that recreation visitors use to publicly post information and images related to their visits can be useful for recreation monitoring. Examples of applications include Flickr, Strava, Instagram, AllTrails, and other online trip reporting platforms (for example, Tripadvisor or local online communities). Data from social media applications can include, for example, text content, images, geotagged locations, and time stamps.

Mobile phone location.—Many users enable location services on their mobile phones, and in the process, give their consent for various phone applications to track their locations and sell these data to third parties. Third parties may also process these data and sell them to researchers.

Community science.—Examples of community science include eBird (where users can upload bird sightings) and a community science program that allows people to contribute information about their visit to public lands. Visitors send text messages to a phone number that is connected to a bot that asks questions about their visit.

Traditional (onsite) data sources.—Examples of traditional (onsite) data sources include trail counters, visitor intercept surveys, and trail cameras.

A growing body of research during the last decade has evaluated the use of social media data for estimating visitation rates (for example, Wood and others, 2013; Keeler and others, 2015; Sonter and others, 2016; Tenkanen and others, 2017; Hamstead and others, 2018; for reviews refer to Wilkins, Wood, and Smith, 2021 and Ghermandi, 2022). In many locations, studies show a strong correlation between the number of social media posts and counts from traditional data sources, such as trail counters. However, this information captures relative levels of visitation (for example, comparisons across time or parks) and does not necessarily indicate absolute levels of visitation. Emerging questions include how to estimate absolute numbers of visitors and which data sources are most useful. Two studies have addressed these questions by creating models of visitation using a variety of predictors (Merrill and others, 2020; Wood and others, 2020). These studies took place on public lands in western Washington and New Mexico (Wood and others, 2020) and public beaches in New England (Merrill and others, 2020). Both studies show that visitation models parameterized with traditional data perform better when novel data are included as predictors. Key findings from this research are that visitation models are improved when multiple novel data sources are used in concert and that the models should be calibrated with local, onsite counts.
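
To illustrate the general form of these visitation models, the following is a minimal sketch in Python; it is not the published code from Wood and others (2020) or Merrill and others (2020), the file and column names are hypothetical, and a simple log-linear ordinary least squares fit stands in for the more detailed model structures used in those studies.

```python
# Minimal sketch: relate onsite counts to social media user-days and baseline
# covariates, then check model fit. File and column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# One row per site-week: onsite counts from trail counters plus user-days
# harvested from each novel data source and simple baseline covariates.
df = pd.read_csv("site_weeks.csv")
# assumed columns: onsite_count, flickr_users, instagram_users, is_holiday, precip_mm

model = smf.ols(
    "np.log1p(onsite_count) ~ np.log1p(flickr_users)"
    " + np.log1p(instagram_users) + is_holiday + precip_mm",
    data=df,
).fit()

print(model.summary())                          # coefficients and adjusted R-squared
df["predicted"] = np.expm1(model.predict(df))   # back-transform to visitor counts
```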

Some research has also evaluated whether novel data can be used to estimate visitor characteristics, including home locations and demographics. Home location can be inferred from social media profiles or frequent posting locations. Using mobile data, home location can be inferred from the phone’s most frequent overnight location, among other methods. Visitor demographics are usually inferred either by using artificial intelligence to evaluate the content of uploaded images (Mashhadi and others, 2021) or by relating home location to census data. Studies evaluating the accuracy of these types of data compared with traditional survey methods include Sessions and others (2016); Heikinheimo and others (2017); Fisher and others (2019); Sinclair and others (2020); and Liang and others (2022). Results indicate that, in some situations, home locations inferred from social media data may be biased depending on who posts. For instance, international visitors are more likely to post photographs of their trips, whereas local visitors are less likely to share photographs (Sessions and others, 2016; Sinclair and others, 2020; Wilkins and others, 2022). Additionally, although researchers often can obtain home locations from mobile device data, they may not be able to differentiate between recreational visitors and others who are in the area for reasons other than recreation (Liang and others, 2022). For this and other reasons, visitor demographics estimated by linking home locations to census data may be inaccurate, though to date, there is less research on this topic (Monz and others, 2021; Liang and others, 2022).
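
A minimal sketch of the “most frequent overnight location” heuristic described above is shown below, assuming a table of anonymized device pings; the file name, column names, and night-hour and minimum-night thresholds are assumptions, and real workflows would add the privacy safeguards required by the data provider.

```python
# Minimal sketch: infer each device's likely home area as its most frequent
# nighttime location. File, column names, and thresholds are hypothetical.
import pandas as pd

pings = pd.read_csv("device_pings.csv", parse_dates=["timestamp"])
# assumed columns: device_id, timestamp, grid_cell (e.g., geohash or census block)

night = pings[pings["timestamp"].dt.hour < 6]              # pings between midnight and 6 a.m.
night = night.assign(date=night["timestamp"].dt.date)

home = (
    night.drop_duplicates(["device_id", "grid_cell", "date"])  # one record per device-cell-night
         .groupby(["device_id", "grid_cell"]).size()
         .rename("nights")
         .reset_index()
         .sort_values("nights", ascending=False)
         .drop_duplicates("device_id")                      # keep the modal nighttime cell
)
home = home[home["nights"] >= 10]                           # require enough nights to trust the inference
```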

Researchers have also evaluated whether novel data can answer questions about a visit, such as what activities visitors engaged in and how satisfied they were with their visit. For inferring activities, researchers have relied on several methods, including image content analysis performed either manually or with artificial intelligence tools that process images shared on social media. Available research indicates that using social media images and machine learning (a convolutional neural network) to infer activities can be accurate for some activities, such as hiking and bird watching, but less accurate for other activities, such as backpacking and swimming (as indicated in a test at two national forest sites in Washington) (Winder and others, 2022). Winder and others (2022) note that convolutional neural networks are generally better at recognizing activities that involve clear and recognizable objects (for example, bird watching) but are less able to distinguish between activities with similar equipment (for example, hiking and backpacking). There is also likely some bias in which activities are photographed and shared (Winder and others, 2022). Content posted on social media can also be used to assess visitor sentiment (refer to Pickering and others, 2020), though this requires a large amount of text processing even for a coarse evaluation of emotion (for example, positive, neutral, or negative).
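
As an illustration of the image-classification approach, the sketch below fine-tunes a pretrained convolutional neural network using PyTorch and torchvision; it is not the model from Winder and others (2022), and the activity labels, folder layout, and training settings are assumptions made for the example.

```python
# Minimal sketch: fine-tune a pretrained CNN to label recreation activities in
# shared images. A folder-per-class layout (e.g., images/hiking/) is assumed.
import torch
from torch import nn
from torchvision import models, transforms, datasets

ACTIVITIES = ["hiking", "birdwatching", "backpacking", "swimming"]  # example labels

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
train_data = datasets.ImageFolder("images/", transform=transform)
loader = torch.utils.data.DataLoader(train_data, batch_size=32, shuffle=True)

model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, len(ACTIVITIES))  # replace the output layer

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)  # train only the new layer
loss_fn = nn.CrossEntropyLoss()
for images, labels in loader:          # one pass shown; train longer in practice
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()
```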

“State of the Application” “Lightning-Round” Talks

After the “State of the Science” presentation, a representative from each agency gave a 10-minute presentation or update regarding current (as of February 2023) uses of novel data sources, or how novel data might help answer agency questions in the future. The main points by agency representatives are summarized as follows.

NOAA.—The NOAA has used human mobility data for several projects and continues to expand their application in marine areas. For instance, the NOAA used mobile data from a vendor named Unacast to analyze spatial and temporal visitor-use trends within the Middle Peninsula in Virginia with an overall goal of understanding human pressures on reserve sites and potential barriers to access. With West Virginia University, the NOAA completed a pilot project in the Florida Keys National Marine Sanctuary that used Instagram data and survey data to estimate trips for different activities (for example, diving and fishing). They are planning to incorporate multiple datasets (including apps, surveys, and satellite imagery) for visitor monitoring as part of the socioeconomic monitoring plan for the Mission: Iconic Reefs.

BLM.—The BLM has not used novel data sources to date but is starting to explore the potential (for example, using mobile device data to help estimate visitation). They have a key goal of supporting staff whose other duties may preclude monitoring visitation using traditional methods. The BLM manages a vast land area with many low-use or dispersed sites, and visitor data for these types of locations are often sparse or nonexistent.

FWS.—The FWS has efforts underway to improve approaches to visitor estimation, including a national visitor survey through 5 years at several national wildlife refuges. The FWS also has an ongoing project with Clemson University and the University of Washington to test methods and data sources for estimating visitation. Phase 1 of this project consisted of a literature review and Delphi panel of experts to compare attributes of visitor estimation methods. Phase 2 is ongoing and involves developing a visitation model for refuges using novel data and on-the-ground data to parameterize the model. Research questions include how to scale up a model and whether it can be applied nationally to all (or some) refuges.

FS.—The visitor monitoring program for the FS is the National Visitor Use Monitoring (NVUM) program. Through this program, the FS collects onsite data every 5 years to create visitation estimates and describe visitor characteristics. One major benefit of these data is that they form a long time series with more than 20 years of data. Although the FS is interested in novel data, there is some concern about losing the value of many years of consistently collected data and losing the type of information provided by visitor surveys (or other original data collection efforts) if those efforts were replaced by novel data in the future. The FS is forming an advisory group related to the use of novel data.

NPS.—The NPS is testing the use of novel data to determine where and how they can be applied. To date, the NPS has completed studies at five parks to test the use of mobile and connected-vehicle data. One of the main findings is that mobile data are potentially useful but need more validation and cannot replace traditional counts. Additionally, the cost of using mobile data is high and difficult to estimate in advance. At the time of this workshop, the NPS views novel data as a “value-added dataset” for calculating or correcting multipliers or to identify high- or low-use areas that may require more or less survey effort.

EPA.—The EPA initially explored using novel data when visitor surveys were delayed while under Office of Management and Budget review for the Paperwork Reduction Act (Public Law 104–13; 109 Stat. 163). Novel data allowed researchers to get information about visitors without directly interacting with them. Environmental Protection Agency researchers compared on-the-ground counts with mobile data and found the mobile data useful. Additionally, EPA offices tend to rely mostly on stated-preference surveys, and the revealed-preference methods introduced by novel data were a useful addition and complement to stated-preference methods.

Examples of Success and Need for Additional Examples (Breakout Session 1)

Questions guiding this discussion were as follows: What are the shining examples of success using novel data for recreation monitoring in recent years (in other words, 2019–23)? Where would we like to see more examples? The following are some of the responses discussed during this breakout session.

Shining examples

  • Studies using mixed methods to compare novel methods and onsite data

  • Working with partners to assess research needs and coproduce the research (including creating user-friendly end products such as dashboards)

  • A story map that the EPA created with the Narragansett Bay Estuary Program (refer to Twichell and others, 2021)

  • Using social media data to study how visitor-use patterns are affected by weather and climate change (Wilkins, Howe, and Smith, 2021; Wilkins, Chikamoto, and others, 2021)

  • Using improved visitation estimates to advocate for funding (for example, from State recreation offices) by showing trends in trail use through time

  • The University of Washington trailheads community science program that allows visitors to submit information about their visit by text messaging with a chatbot (Lia and others, 2023)

Need for additional examples

  • Recreation monitoring in dispersed areas

  • Recreation in marine environments

  • Recreation in urban areas

  • Opportunities to transfer tools and methods between disparate landscapes (for example, if and when tools and methods that work well in one location can be transferred to another location)

  • Studies evaluating methods for determining visitor origin and demographics

  • Studies evaluating methods for identifying visitor activities from novel data

  • Collaboration among multijurisdictional areas

  • Studies combining the use of multiple novel data sources

  • Real-time traffic-congestion applications

  • A needs analysis to identify applications for these tools and methods

  • Evaluating the value, outcomes, and metrics of success for these tools and data

Use-Case Presentations

Four presenters gave examples of case studies for evaluating the use and accuracy of novel data. Short summaries with key points from each presentation are as follows.

“Visitation Estimation with Cellular Device Location Datasets” (Nate Merrill with coauthors Wei-Lun Tsai, Anne Neale, Madeline Grupper, Kate Mulvaney, Marisa Mazzotta, and Justin Bousquin).—Nate Merrill (EPA) presented on using human mobility data to predict on-the-ground visitation counts. Specifically, the authors compared mobile device data from AirSage to monthly visitation counts at 38 NPS units. They ran a panel regression with random effects, stratifying by distance to a population center, level of recognition among nonlocal communities (in other words, if a park was considered iconic), park size, and porousness. The authors found generally high coefficient of determination values for the regressions predicting monthly visitation counts using mobile device data. In particular, mobile device location data were a good predictor of visitation for nonurban and iconic parks, large parks, and parks with low porousness. This indicates that for some parks, mobile data could serve as a complement to on-the-ground visitation estimates and may also be useful to show relative changes. However, the authors note that this approach still relies on on-the-ground methods and that cell data providers, inputs, and data processing are continually changing, requiring researchers to frequently update and adapt their methods. Merrill is working on a study to assess the accuracy of origin information from mobile device data. For more information on this research, refer to Tsai and others (2023); for previous related work, refer to Merrill and others (2020).
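
The sketch below shows one common way to set up a panel regression with park-level random effects in Python; it is not the EPA model, the file and column names are hypothetical, and the study’s stratification by park attributes (for example, size and porousness) could be approximated by fitting the model separately within each stratum.

```python
# Minimal sketch: mixed-effects regression relating monthly mobile device counts
# to onsite visitation, with a random intercept for each park. Names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

panel = pd.read_csv("park_months.csv")
# assumed columns: park_id, month, onsite_count, device_count, iconic, large, porous

m = smf.mixedlm(
    "np.log1p(onsite_count) ~ np.log1p(device_count) + iconic + large + porous",
    data=panel,
    groups=panel["park_id"],        # random intercept by park unit
).fit()
print(m.summary())                  # fixed effects plus park-level variance
```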

“Modeling Recreation on Public Lands Using Social Media and Mobile Data” (Sama Winder with coauthors Emmi Lia and Spencer Wood).—Sama Winder (University of Washington) presented a study using multiple novel data sources to estimate visitation on public lands, including trails in Washington and New Mexico and national wildlife refuges. For estimating visitation on trails in Washington and New Mexico, the authors used social media data from Flickr, Twitter, and Instagram. They also included baseline data of holidays, weather, and week of the year in the models. Using these factors, the authors could predict weekly visitors with an adjusted coefficient of determination of 0.74 in Washington. When the model was applied to New Mexico, it still performed well but performed best when calibrated with local data. The authors are currently (as of February 2023) working on applying the same methods to estimate visitation at 11 national wildlife refuges but also including AllTrails reports, eBird checklists, and mobile data from AirSage and Spectus to help predict visitation. Initial analyses show that the accuracy of novel datasets varies by data source, by location, and through time. However, combining these data from multiple sources, including social media and mobile device data, in a single model shows promise for predicting monthly visitors to national wildlife refuges. This work highlights the importance of calibrating novel datasets with trustworthy onsite data, combining multiple novel datasets, testing local model performance, and periodically recalibrating models as datasets change through time. The work on national wildlife refuges is still preliminary, but for more information on the models in Washington and New Mexico, refer to Wood and others (2020).

“Location-based Services Data” (Matt Brownlee).—Matt Brownlee (Clemson University) joined remotely to present three case studies using location-based services (LBS) data. The first study focused on spatial and temporal distributions of visitor use in Theodore Roosevelt National Park in North Dakota, the second focused on visitor demographics in Joshua Tree National Park in California, and the third investigated visitors’ dwell time and stops in the North Inlet-Winyah Bay National Estuarine Research Reserve in South Carolina. For the first study on spatial and temporal distributions, Brownlee compared information from global positioning system data loggers to LBS data. The spatial and temporal distributions are highly correlated between these two datasets but do contain some statistical differences. For the second study, when comparing demographics from LBS data to surveys, there was a positive association between age, education, and income categories but a statistically significant difference in averages. For instance, surveys indicated that park visitors were more educated and had higher incomes relative to estimates from LBS data. For the third study, Brownlee found that LBS data were useful for determining stops and dwell time. This work is not published as of 2023 but may be published in the future.

“Regional Recreational Transportation Analysis Projects” (Rachel Collins with coauthor Erica Cole).—Rachel Collins (NPS) joined remotely to discuss using mobile device data for planning purposes in NPS contexts. The authors found that mobile device data were useful for a regional transportation system usage analysis for national parks in Colorado, using LBS data from 2018 to 2019. From the mobile device data, they obtained metrics such as weekday compared with weekend visitation, seasonality of use, proportions of resident compared with nonresident visitors, and travel speed. In Mount Rainier National Park, they completed a regional transportation-system usage analysis for 2020–22. Location-based services data confirmed seasonal visitation trends at the park. Overall, LBS data closely match count data from park entrance locations, and they also match home-location data from surveys fairly well. For related work, refer to Baird and others (2022).

Day 1 Wrap-Up

The first day ended with a whole-group discussion inviting questions that had not yet been discussed. Two main questions emerged: Can we use novel data to create forecasting models to predict future visitation? Could these data be used for real-time estimates? For the first question, the group consensus was that forecasting in general is difficult, and there is uncertainty in forecasts compounded by reliance on other forecasts (for example, population growth). Including novel data sources in forecasting models could help improve the accuracy. Participants from the FS and the University of Washington noted that they have reports or papers in press that forecast visitation and recreational activity participation (two have now been published: Goebel and others, 2023 and U.S. Department of Agriculture Forest Service, 2023). For the second question, the group discussed how real-time data differ from visitation data, including which questions these data are used to answer (for example, real-time data are often used to monitor sites for crowding). Obtaining and processing novel data to provide real-time estimates of recreation use (at individual sites or for administrative units) is challenging and likely not currently feasible given the time lags in data presently commercially available (as of this workshop in February 2023) and Federal agency computing realities. Additionally, workshop participants were skeptical that real-time estimates, even if available, would be useful in managing crowding or reducing visitor conflict.

Start of Day 2—Concerns with Using Novel Data (Breakout Session 2)

The second day began with a discussion about concerns with using novel data. Questions guiding this discussion were as follows: What are the main concerns with current applications, current research, and future directions for the use of social media and mobile device data, or other novel data for recreation monitoring (as of the time of this workshop in February 2023)? Topics mentioned during this breakout session are detailed as follows.

Common concerns mentioned in multiple groups

  • Consistency and stability through time of data sources and the underlying methods used by private companies (such as social media platforms and mobile data vendors)

  • The methods that data providers use to process these data are often not transparent

  • Concerns about using these data for estimating visitor demographics due to accuracy and ethical concerns

  • Privacy concerns and the issues with government use of data that the private sector collects

Concerns most related to applications using novel data

  • Transferability of pilot-test results between locations

  • Natural resource managers may have unrealistic expectations of these data, perhaps because of data companies overstating the potential utility and accuracy of the data

  • Managers may overlook (or data companies may underemphasize) the need to continue to test and calibrate novel data with onsite data

  • Balancing careful research and validation with the need for timely information; pressure for quick answers may result in using novel data without prior validation of the data

  • The data may not be representative and may vary for different types of questions and applications

Concerns most related to research using novel data

  • There is a difference between data-driven and hypothesis-driven research (inductive compared with deductive approaches), and the best practices for each may be different (for more discussion on this, refer to Dagan and Wilkins, 2023)

  • Confidence intervals and levels of uncertainty are often unknown for novel data sources and difficult to compare to traditional sources

  • Data collection methods, assumptions, and accuracies likely differ across data providers

Panel on Mobile Device Data

A panel of four people spoke about their experiences working with mobile device data from an agency perspective. The panel consisted of Nate Merrill (EPA), Sarah Cline (FS), Rachel Collins (NPS), and David Pettebone (NPS). Other workshop participants who had experience using mobile device data also shared input. This section consists of a summary of some of the topics discussed and insights gained from the panelists and other participants with experience using these data.

The panelists noted that there are many companies that sell processed mobile device data (for example, AirSage, StreetLight, and Unacast) and that they all are slightly different. The NPS has used a contracting approach where the contractor decides how to purchase and process the data, so the NPS does not own the data. One of the key questions to consider before buying mobile device data is whether to purchase raw data that need processing to be useful or data that have already been processed. Purchasing raw data can be useful because the researcher then knows how the data are processed to generate final visitation estimates, but raw data also present privacy concerns and require someone skilled in analyzing this type of data. The NPS also has purchased connected-vehicle data (from Otonomo), which they note offer the option to “chain” or combine trips made during a 24-hour period, rather than have trips defined by when a vehicle is turned on and off.

The cost to purchase mobile device data varies based on the vendor. Vendors generally offer data either as subscriptions or as downloads for certain places and times (purchased with a bank of credits). One panelist noted that most purchase agreements from companies selling mobile device data prohibit the sharing of these data, which can be challenging because Federal agencies are required to share research data.

Two significant concerns regarding mobile device data were (1) the methods for processing the data are often a “black box” and (2) the data are often presented by vendors as being off-the-shelf-ready, when in reality, the data often need to be calibrated and validated to ensure quality. The “black box” nature of the methods is concerning because agency researchers must be able to understand and describe all data processing. Additionally, changes to the methods through time may affect the reliability of these data for visitor monitoring across periods. Panelists also noted that vendors may market mobile data to people who do not have prior experience using this type of data source, and companies may sometimes present this data source as a replacement that is cheaper and easier to use than onsite data sources (for example, traffic counters, trail counters, and surveys). Mobile data currently (as of February 2023) cannot replace traditional data sources on public lands and waters and should be viewed as a complementary data source rather than a replacement. Further, on-the-ground counts are still needed to calibrate and validate mobile device data.

Working Session for a Best Practices Document

Questions guiding this discussion were as follows: Who is the audience for a best practices document? What is the content? This working session began with a review of a document produced from the 2019 workshop: a four-page report that focused on the uses, the limitations, and a case study for using social media for visitor monitoring. Workshop participants agreed that an updated overview document would be very useful, following a similar format (either two or four pages). The new document should address the use of social media and mobile device data. Important points to communicate are that novel data are a complement—not a replacement—to traditional data, and the continued exploration of integrating novel data is important for a robust visitor monitoring program.

Research Agenda and Future Directions (Breakout Session 3)

Questions guiding this discussion were as follows: What are the research needs to inform practice? What new projects could be rolled out now (if funding were unlimited)? An overview of topics discussed during this breakout session is presented as follows.

Topics mentioned in multiple groups

  • The field would benefit from more validation work, especially in areas where onsite methods are less feasible (for example, dispersed use or porous areas).

  • The field would benefit from more case studies chosen purposefully to represent a suite of different geographies and site types (with a goal of generating enough research for a meta-analysis).

  • Researchers and field staff would benefit from more research that brings our understanding of novel approaches up to the level of our understanding of onsite methods.

Topics mentioned once

  • Researchers could explore how to use novel data to make existing methods more robust.

  • Researchers should determine when we can use novel data without onsite counts for calibration and validation and whether there is a difference when using these data for scientific research compared with applied management questions.

  • Researchers should assess the accuracy of the onsite data used to ground-truth novel data.

  • The field would benefit from more research into whether and how novel data can be used to estimate demographics.

  • The field would benefit from more research into using novel data to forecast visitation.

Workshop Outcomes and Survey Results

Workshop participants agreed that it would be beneficial to start a community of practice where participants and others interested in this line of work can share research and ongoing studies related to using novel data for recreation monitoring. Some workshop participants volunteered to help start this group, and the DOI now hosts a SharePoint site and associated Microsoft Teams channel that is open for anyone to join, ask questions, and share resources related to these topics. The group also plans to hold recurring meetings with updates and research presentations. Participants noted that continuing to meet in person every 2 to 3 years would be valuable.

The workshop planning committee circulated an anonymous followup survey for Federal participants to share their thoughts about the workshop. Of the 19 attendees (not including the 5 workshop organizers), 11 responded between February 18 and March 3, 2023 (a response rate of 58 percent). All 11 respondents attended in person for day 1, and 10 of these respondents attended in person for day 2.

For our question about overall satisfaction with the workshop, nine respondents chose “very satisfied,” one respondent chose “somewhat satisfied,” no respondents chose “somewhat dissatisfied,” and one respondent chose “very dissatisfied” (the respondent who chose “very dissatisfied” indicated they were “very satisfied” or “somewhat satisfied” with every individual session of the workshop, so it is possible this was a mistake due to the ordering of the choices). For the individual sessions, the satisfaction level of participants was as follows:

  • For the day 1 featured presentation on “State of the Science,” all 11 respondents chose “very satisfied.”

  • For the day 1 session on use-cases, all 11 respondents chose “very satisfied.”

  • For the intro and wrap-up sessions, 10 respondents chose “very satisfied,” and one chose “somewhat satisfied.”

  • For the day 1 “lightning-round” talks on “State of the Applications,” nine respondents chose “very satisfied,” and two chose “somewhat satisfied.”

  • For the day 2 panel on mobile device data, eight respondents chose “very satisfied,” and three chose “somewhat satisfied.”

  • Other sessions received more mixed ratings, though most respondents still chose “very satisfied.”

For our question on new plans for projects as a result of the meeting, we received several positive responses. Some examples are as follows:

  • “Yes, I'm [sic] working to collaborate with a different agency.”

  • “Yes [sic] many. Having so many people that are focused on this topic together was excellent. It was fantastic to coordinate at the do-er [sic] level between agencies on this topic.”

  • “Our group hopes to continue to explore the potential for HMD [Human Mobility Data] and novel datasets for various human use projects.”

  • “Pursuing a project proposal currently that was entirely the result of this meeting. Great opportunity to reconnect with colleagues and get caught up on the current work and potential future directions.”

We also received several less definite responses, including the following:

  • “Not yet, but we are considering some of these options in the future, so I found the information presented very useful for our future planning.”

  • “I have discussed new ideas with new potential partners, but it’s [sic] too soon to tell if we will solidify plans.”

We asked respondents to rank five methods in terms of the potential to add value to their work. The results were as follows (in order from highest potential to add value to lowest potential to add value):

  1. Using location-based services (LBS) data for counting visitors, looking at travel patterns, and so forth

  2. Using actively solicited crowdsourced data (apps where users can volunteer information, short message service surveys, and so forth) for counting visitors; assessing user experience, demographics, and activities; and so forth

  3. Using other user-generated data (not actively solicited) for counting visitors; assessing user experience, demographics, and activities; and so forth

  4. Using social media image content to assess user experience, demographics, activities, and so forth

  5. Using social media for counting visitors

2019 Novel Data in Recreation Monitoring Workshop Summary

The DOI’s PPA, the FS Pacific Northwest Research Station, the NPS Conservation and Outdoor Recreation Division, and the University of Washington Outdoor Recreation and Data Lab hosted a meeting April 3–4, 2019, at the USGS Fort Collins Science Center in Colorado to discuss the use of social media and crowdsourced data for understanding the amount and character of recreation use on public lands. The meeting was part of a PPA-funded effort to advance Federal agencies’ visitor estimation by researching the use of social media data from public lands. This project follows an effort started in 2015 under Service First funding. The common objective of these efforts was to support the development of improved and more consistent visitation estimates and data on visitor characteristics across agencies.

An invited group of 33 participants representing Federal agency managers, Federal and academic researchers, nonprofit organizations engaged in visitor research, and private sector technical service providers attended the meeting. Participating Federal agencies consisted of the U.S. Department of Agriculture’s FS, Department of Commerce’s NOAA, Department of the Army’s U.S. Army Corps of Engineers (USACE), DOI’s PPA, and five DOI Bureaus: the BLM, Bureau of Reclamation, NPS, FWS, and USGS. A list of participants can be found in appendix 4.

The meeting consisted of a mix of technical presentations and small group discussions. The morning of day 1 consisted of a series of presentations on the current (as of April 2019) technical capabilities of analysis based on novel data—particularly crowdsourced data from visitors’ social media. The afternoon consisted of small group discussions on promising applications, concerns, and limitations. Participants formed small groups to discuss their reactions to the technical presentations and the upsides and complications of using crowdsourced data. Day 2 began with an open discussion of day 1 and objectives for day 2. Several Federal agency representatives gave updates on agency recreation monitoring programs and communicated an interest in incorporating new data sources into existing approaches as an enhancement, rather than as a replacement or alternative. Participants then broke into small groups to discuss opportunities for integrating new and emerging data sources and methods with traditional approaches. In the afternoon of day 2, participants broke into research community and management community groups to identify needs for expanded use of social media and crowdsourced data and to identify immediate next steps. Summaries of each session are detailed as follows. The full agenda for this workshop can be found in appendix 5.

Current State of Knowledge and Practice in Using Social Media as Recreation Data

Spencer Wood (University of Washington) gave an introductory presentation that provided an overview of the research using social media to quantify and characterize visitation. Key points from this presentation are as follows.

Estimating visitation.—The presentation summarized the results of a growing body of academic research that shows a correlation between the number of posts on social media platforms and onsite visitation as measured and reported through traditional methods. The presentation discussed the platforms and content types used in this type of analysis, primarily georeferenced photographs on Flickr and Instagram, georeferenced tweets on Twitter, and location-specific posts on participatory trip forums such as Washington Trails Association (wta.org), a local example in Washington.

Technical methods include using social media posts to estimate user-days, which is the number of unique people posting in each area on a given day (Wood and others, 2013). Many of these studies, conducted in diverse locations and settings around the world, have found a strong correlation (around 70 percent) between the number of photographs posted online and visitor counts from conventional methods (Wood and others, 2013). These studies typically use a large geographic scale, such as an entire national park, and a long temporal scale, such as an entire year. Additional work is needed for understanding the functional limits of the methods at finer spatial and temporal scales. In addition, correlations are often strongest at popular sites and weaker in locations with fewer visitors or less social media data to analyze. An example of using social media to estimate visitation is a study from western Washington, where researchers have been working in the Mount Baker-Snoqualmie National Forest to collect onsite and social media data to create models of recreation visitation that incorporate a variety of parameters (Fisher and others, 2018). These approaches relied on several social media platforms to obtain the most robust characterization of visitation. This research has since been published; refer to Wood and others (2020).
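
The user-days metric defined above can be computed with a short script. The sketch below assumes each post has already been assigned to a site; the file and column names are hypothetical.

```python
# Minimal sketch: compute "user-days" from geolocated posts, i.e., the number of
# unique users posting from each site on each date. Names are hypothetical.
import pandas as pd

posts = pd.read_csv("geotagged_posts.csv", parse_dates=["posted_at"])
# assumed columns: user_id, posted_at, site_id (post already assigned to a site)

posts["date"] = posts["posted_at"].dt.date
user_days = (
    posts.drop_duplicates(["user_id", "site_id", "date"])   # one record per user-site-day
         .groupby(["site_id", "date"])
         .size()
         .rename("user_days")
         .reset_index()
)
annual = user_days.groupby("site_id")["user_days"].sum()    # totals to correlate with onsite counts
```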

Visitor locations and demographics.—The home location of visitors who post social media can be inferred using information contained in the user profiles or by analyzing the geographic patterns of their posts. Examples were shared from national parks where social media data provided estimates of visitors’ county of origin that corresponded, with some important differences, to on-the-ground estimates produced by the NPS (refer to Sessions and others, 2016).

Visitor preferences.—Social media may help researchers investigate visitor preferences. Using an econometric approach based on revealed preferences, researchers can use these data to illustrate the place-dependent nature of visitor preferences. For example, a study in Minnesota and Iowa found that recreationists prefer lakes with higher water clarity based on Flickr “photo-user-days” and travel distances of lake users across the two States (refer to Keeler and others, 2015).

Data access and availability.—Posts are often accessed by means of an application programming interface provided by the relevant social media platform. The access policies and format of the data vary across platforms. For example, Flickr, a photograph-sharing platform, makes available the specific latitude and longitude of every photograph. Instagram, in contrast, allows people who share images to tag the images by selecting from a prepopulated set of geographic locations that may not correspond to official park boundaries, although the platform did previously show the precise coordinates of every image. This change in Instagram’s user interface illustrates that platform policies and policy changes have implications for the collection and analysis of social media data. Additionally, the popularity of both platforms has changed through time. Flickr use has declined since around 2016, whereas Instagram use has increased since then.
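
As an example of API-based access, the sketch below queries Flickr’s public REST endpoint with the flickr.photos.search method for geotagged photographs inside a bounding box; an API key is required, the bounding box is arbitrary, and parameter names should be verified against Flickr’s current documentation.

```python
# Minimal sketch: request geotagged Flickr photos in a bounding box via the
# public REST API. The API key is a placeholder and the bbox is only an example.
import requests

params = {
    "method": "flickr.photos.search",
    "api_key": "YOUR_API_KEY",                      # placeholder; obtain your own key
    "bbox": "-121.9,46.7,-121.5,47.0",              # example box near Mount Rainier
    "has_geo": 1,
    "min_taken_date": "2023-01-01",
    "extras": "geo,date_taken,owner_name",
    "format": "json",
    "nojsoncallback": 1,
    "per_page": 250,
}
resp = requests.get("https://www.flickr.com/services/rest/", params=params, timeout=30)
photos = resp.json()["photos"]["photo"]
for p in photos[:5]:
    print(p["latitude"], p["longitude"], p["datetaken"])   # coordinates and date taken
```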

Content analysis.—Content analysis is usually performed by analyzing the text or images in social media posts. A relevant example for outdoor recreation is using photographs or text to determine visitor activity participation. For example, some researchers have examined user posts to Tripadvisor to understand participation in various coral reef-based activities (refer to Bartelet and others, 2022).

State-of-the-Art Applied Research and Applications

There were five presentations on applications of crowdsourced data for recreation monitoring. Details about the presentations are as follows.

“Using Social Media Data to Model Visitation” (Emmi Lia and Sama Winder)

Emmi Lia and Sama Winder (University of Washington) discussed efforts to create statistical models of visitation on public lands using social media posts as data. At the time of this presentation in 2019, the effort was still ongoing, but this research has since been published; for more information, refer to Wood and others (2020).

For their estimates of visitation, the team used onsite counts from 26 sites in western Washington gathered through 3 years and 13 sites in northern New Mexico gathered through 1 year and related these to social media user-days (from Instagram, Flickr, and Twitter) and baseline conditions (weather, holidays, and week of the year) using a linear model.

The study found that models of this form were able to estimate weekly visitation at an individual trail in the Mount Baker-Snoqualmie National Forest in western Washington. To test the generalizability of the model, the study was extended to include northern New Mexico. This region is very different from western Washington in terms of geography and local populations. However, the team found that roughly the same proportion of visitors to public lands in each region posted to social media (approximately 3 percent posted to Instagram and less than 0.05 percent posted to Flickr and Twitter).

A primary question was whether the model parameterized in western Washington could effectively estimate visitation in northern New Mexico without incorporating any onsite New Mexico data. To test this question, the study estimated weekly visitation at defined New Mexico sites and compared the estimates to actual onsite counts gathered in 2018. They found that the model estimates were 60 percent correlated with the onsite counts. However, the model consistently overestimated use at low-visitation sites and underestimated use at high-visitation sites. To address this issue, the researchers reparameterized the model using a random subset of 80 week-site combinations of onsite data collected in northern New Mexico. They also added a random site-level effect to allow the model to create site-specific estimates. With these changes, they found that the new model produced estimates of visitation that were 95 percent correlated with observed onsite visitation. This new model no longer consistently overestimated or underestimated visitation.

Finally, the team presented results on the importance of social media in their models. To test this, they built models similar to those mentioned in the previous paragraph but related visitation to baseline conditions alone without any social media. They found that social media was very important for the models that estimated visitation in New Mexico without using any New Mexico data (raising the correlation between estimated and observed visitation from 0.20 to 0.62), likely because in this location, social media could act as a proxy for site popularity. However, social media’s role in the second set of models (which included some New Mexico data and a site-level random effect) was more nuanced. Both the baseline-only model and the model that included social media produced estimates that were 95 percent correlated with observed visitation.

The team concluded that social media data can aid our understanding of visitation patterns and are a useful addition to traditional methods of estimating visitation, especially when no onsite data are available. In particular, the availability of social media at fine spatial and temporal scales allowed the research team to capture unusual visitation patterns.

“Leveraging Crowdsourced Data to Understand Climate Change Impacts on Visitors and Wildflowers in the Western USA” (Ian Breckheimer)

Ian Breckheimer (Harvard University) focused on understanding the patterns of visitor use and their interactions with the natural environment based on an analysis of georeferenced photographs. The study concluded that data shared on social media can be a proxy for timing data on ecological occurrences such as wildflower blooms. The study also described the interaction between snowpack, wildflowers, and visitation in the subalpine meadows of Mount Rainier National Park.

The primary method of the study was an analysis of more than 17,000 geotagged photographs on Flickr. The study authors used machine learning analysis (after an initial researcher categorization) to identify the species of popular wildflowers in photographs shared by visitors. The analysis successfully described the timing and spatial locations of wildflower blooms, although it was complicated by spatial and temporal clustering, overrepresentation by “super-observers,” and observers’ imperfect detection of wildflowers. The geographic distribution of wildflower bloom as determined by social media data analysis was closely aligned with professional observation in the park.

For more information on approaches to measuring wildflower phenology, refer to Wilson and others (2017). This research has since been published; for more information, refer to Breckheimer and others (2020).

“Understanding Foraging Using Social Media Data” (Sonya Sachdeva)

Sonya Sachdeva (FS) presented a text-based analysis of Twitter posts (tweets) referencing foraging for wild foods. Any geotagged tweets from 2017 to 2018 that contained the words “forage,” “foraging,” “forager,” “foraged,” or “wildfood” were included. The study used topic modeling, an unsupervised machine learning approach, to look for clusters of words that commonly occur together. The algorithm identified 30 topic clusters, some of which were very directly related to wildfood foraging. Others were less relevant, such as clusters involving foraging animals (bears, cows, and so forth) and one “survivalist” cluster. Clusters were mapped geographically to determine whether there were any obvious patterns, but generally, they seemed to be tied to density of urban development. However, the survivalist cluster did show some hot spots in certain parts of the country.
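
The sketch below shows a common way to implement this kind of topic modeling in Python using latent Dirichlet allocation from scikit-learn; it is not the study’s pipeline, and the input file and preprocessing choices are assumptions.

```python
# Minimal sketch: fit a 30-topic model to foraging-related tweet text and print
# the top words in each topic. File name and preprocessing are hypothetical.
import pandas as pd
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

tweets = pd.read_csv("foraging_tweets.csv")["text"].dropna().tolist()

vectorizer = CountVectorizer(stop_words="english", max_features=5000)
dtm = vectorizer.fit_transform(tweets)                  # document-term matrix

lda = LatentDirichletAllocation(n_components=30, random_state=0)  # 30 topics, as in the study
lda.fit(dtm)

terms = vectorizer.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top_words = [terms[i] for i in topic.argsort()[-10:][::-1]]
    print(f"Topic {k}: {', '.join(top_words)}")
```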

This study showed how Twitter can be a source of information about activities that could be useful to managers who want to understand where hot spots of that activity occur. Proposed next steps were to use social-network analysis to look at how foraging information spreads, identify unexpected topics, and identify hot spot locations. For related research on using media (in other words, articles and blog posts) and topic modeling to understand foraging, refer to Sachdeva and others (2018).

“Measuring Whitefish Trail Use” (Scott Story)

Scott Story (Headwaters Economics) worked with local community partners to understand use of the Whitefish Trail in Montana and its local economic effects by using a combination of traditional counting and social media analysis. The key questions of the study were as follows: How many people are using the trails? How much money are they spending in the local area?

The social media data source was the platform Strava, which offers a public-facing heatmap (updated monthly) and a product called Strava Metro, which provides the customer with anonymized user data. Strava was selected for its perceived popularity among mountain bikers (a key user group of the Whitefish Trail) and because the platform donated local Strava Metro data in kind. By request, Strava broke the data into four categories (local-pedestrian, visitor-pedestrian, local-bicyclist, and visitor-bicyclist) linked to spatial data.

These data were combined with infrared trail counts and intercept surveys. The survey tool included a question—“Are you using an activity tracking device?”—that allowed comparison of self-reported participation to the share of observed users and Strava data points. For example, 15 to 40 percent of survey respondents claimed to be recording their trip with Strava, but a comparison of Strava data points to trail-counter results indicated that only 6 percent of users recorded their trip.
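
The comparison above suggests a simple calibration: the share of users who record trips on a counted segment can be used to scale Strava counts on uncounted segments. The numbers in the sketch below are made up for illustration; only the roughly 6-percent recording share reflects the comparison described above.

```python
# Minimal sketch: derive a recording share on a counted segment, then scale
# Strava trip counts on an uncounted segment. All values are hypothetical.
strava_counted = 300        # Strava trips on a segment with an infrared counter
counter_total = 5000        # infrared counter total on that segment, same period

recording_share = strava_counted / counter_total        # about 0.06 (6 percent)

strava_uncounted = 120      # Strava trips on a segment without a counter
estimated_total = strava_uncounted / recording_share    # scaled estimate of all users
print(round(estimated_total))                           # 2000
```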

The reaction to the study by community partners was largely positive, even though there was not full certainty about the trail-user quantity estimates. The data highlighted several segments of the trail that were heavily used despite not having full legal status (so-called “social” trails). A user-friendly data visualization was developed so that managers could view the locations of social trails and sensitive wildlife habitat when prioritizing future projects.

An expansion of the project was proposed to cover the greater Yellowstone National Park area by combining trail counters and user-generated data. As of the 2019 workshop, Headwaters Economics planned to expand and refine these methods to include more rigor and partner contributions. For more information on this research, refer to Lawson (2018).

“OuterSpatial” (Nate Irwin)

Nate Irwin (Trailhead Labs) provided an overview of Trailhead Labs’ background and mission and the OuterSpatial application. Trailhead Labs is focused on improving park management, creating usable technology, and facilitating the flow of information.

OuterSpatial is a custom app platform designed and sold by Trailhead Labs for parks and open space. It provides interactive maps and real-time updates and alerts. Trailhead Labs has been using Strava as a spatial quality-assurance method to adjust the locations of trails based on visitor-generated tracks. The company is exploring functionality that would allow visitors to share information with other visitors or with site managers. Workshop participants noted that interactive platforms like OuterSpatial may be a good way to solicit crowdsourced information such as visitor satisfaction or issue reports (for example, for locations needing repairs).
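
The spatial quality-assurance idea—comparing a mapped trail alignment against aggregated visitor tracks—can be illustrated with a short geometry check. The sketch below is a hypothetical example (not Trailhead Labs’ implementation) that uses the shapely library to flag track points lying farther from the mapped trail than a chosen tolerance, which might indicate that the mapped alignment needs adjustment; it assumes coordinates are in a projected coordinate system measured in meters.

```python
# Hypothetical sketch of flagging where visitor GPS tracks diverge from a
# mapped trail alignment (not Trailhead Labs' implementation). Coordinates
# are assumed to be in a projected coordinate system with units of meters.
from shapely.geometry import LineString, Point

def points_off_trail(mapped_trail: LineString, track_points: list[Point],
                     tolerance_m: float = 15.0) -> list[Point]:
    """Return track points farther than `tolerance_m` from the mapped trail."""
    return [p for p in track_points if mapped_trail.distance(p) > tolerance_m]

# Example: a simple mapped trail and three track points; one point is well off the line.
trail = LineString([(0, 0), (100, 0), (200, 50)])
tracks = [Point(50, 3), Point(120, 40), Point(195, 47)]
flagged = points_off_trail(trail, tracks)
print(f"{len(flagged)} of {len(tracks)} track points exceed the tolerance")
```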

Promising Applications, Concerns, and Limitations (Breakout Session 1)

During the final session of day 1, participants formed five small groups to discuss their reactions to the technical presentations and the upsides and complications of using crowdsourced data. The workshop organizers asked three questions related to the promise of these methods, opportunities, and concerns:

  1. Where do you see the most promise for these methods?

  2. What is one opportunity coming up that you know about or are involved in to add social media techniques (1) to address a management question, or (2) to a traditional effort?

  3. What is your biggest concern with using social media data?

Responses to each of these questions are outlined as follows.

1. Where Do You See the Most Promise for These Methods?

The small group responses fell into three primary categories as follows.

Estimating visitation

  • As a visitor-counting methodology, especially for understanding relative visitation between areas

  • Quality assurance for traditional methods by having an independent data source

  • Improving understanding of visitor use at a finer spatial and temporal scale

  • Identifying “hot spots” by tracking relative visitation through time

  • Detecting visitation patterns in response to virtual events or management actions

  • Providing real-time information

  • Understanding the links between land use, ecosystem services, and visitation

Visitor and visit characteristics

  • Providing a window into visitor sentiment by gauging more “candid” reactions to events or management changes

  • Visitor home or origin

  • Text and photograph content analysis of activity types

Outreach

  • Using community science and user-contributed data through direct requests to the public for information

  • As an outreach tool for visitors to communicate agency messages and encourage “peer-policing” of recreation behavior

  • Using text-messaging chatbots as an alternative to traditional survey methods (later published by Lia and others, 2023)

  • Better information outreach using social media to make recreation opportunities more relevant to underserved communities

2. What Is One Opportunity Coming Up That You Know About or Are Involved In to Add Social Media Techniques (1) to Address a Management Question, or (2) to a Traditional Effort?

  • The National Marine Sanctuary Visitation Project—pairing social media analytics with “smart buoys,” satellites, and applications

  • Mount Baker-Snoqualmie National Forest interface with the NVUM program—pairing some social media data (home origin and so forth) with existing surveys

  • Identifying social trails using route-based platforms like Strava

  • Detecting park reentries

  • Using social media to augment public comments

  • Applying crowdsourced techniques in difficult-to-count places like marine areas, urban parks, and long-distance trails

  • Using social media analysis in the intervals between rounds of traditional count methods as an interim or higher frequency step

  • Identifying outlier responses to inform sampling decisions

  • Using these approaches to advance departmental or agency initiatives—such as Interior Secretarial Orders 3356, 3366, and 3370—or new legislation like the Natural Resources Management Act (Public Law 116–9; 133 Stat. 580; https://www.fs.usda.gov/science-technology/fire/technology/law)

  • Integration into agency procedures—manuals for reporting visitor statistics could incorporate crowdsourced methods

  • Assisting with the DOI-wide customer satisfaction survey

  • Monitoring trends and patterns (for example, Yosemite National Park wilderness character survey)

3. What Is Your Biggest Concern with Using Social Media Data?

Access to data and platforms

  • Changing use of social media platforms and the stability of data through time

  • Data privacy questions

  • Diminishing access to data (high sale price, restrictions to preserve privacy, or for other reasons)

  • Ethics issues

Accuracy

  • Calibration and validation

  • Defensibility

  • Social media data calibrated with census data and linked to onsite data

  • Narrow slice of the population, with layers of self-selection; may not include displaced users or nonusers

  • Population of social media users likely does not represent the population at large

  • Challenges of analyzing activities

  • Biases in social media user groups, platform-specific bias

Applications

  • Concern that this will be perceived as the “silver bullet” for visitor-use estimation

  • Managers may be disappointed by the current state of the practice if it cannot answer their questions

  • If used as an enforcement tool, it could cause data availability problems as visitors come to understand its use and reduce postings

  • Staff capacity to test and apply the methods

  • Large datasets mean many individual efforts may be duplicative—working collaboratively can help share successful methods

  • Federal approval requirements for agencies and partners (such as the Paperwork Reduction Act [Public Law 104–13; 109 Stat. 163])

  • Government entities may prefer to continue using traditional methods rather than testing and embracing innovations

Federal Agency Updates on Recreation Monitoring Programs

Agency recreation leads shared updates on agency recreation monitoring programs, focusing on developments since the 2017 workshop. The following are some key points mentioned by each agency representative during the 2019 workshop.

BLM.—The BLM uses the Recreation Management Information System, a field-driven visitor estimation reporting program. The focus (as of 2019) is on having field staff provide a reporting plan and document their methodology for generating visitation estimates. The BLM is working to update guidelines for estimation reporting and establish formal geospatial boundaries for reporting units. Estimating visitation in dispersed areas is a challenge that emerging methods could possibly help address.

Bureau of Reclamation.—The Bureau of Reclamation relies on managing partners to provide recreation-use estimates and moved from physical form-based systems to an online reporting system for visitation estimates. Local administrators and partners are not required to document their estimation methodology, but that may be a future step.

NOAA.—The NOAA is working to better estimate recreation at national marine sanctuaries. West Virginia University is leading pilot monitoring studies in two locations: Gray’s Reef National Marine Sanctuary (Georgia) and Florida Keys National Marine Sanctuary. They will be looking at a range of potential data sources including smart buoys, satellite data, and social media data. The goal is to develop a low-cost method of estimating visitor use that can serve as a basis for understanding use at all 15 sanctuaries and 2 marine national monuments. Some of this research has since been published; refer to Burns and others (2020).

NPS.—The NPS has several efforts exploring the use of new tools and technology in visitor-use statistics. Some examples include the following: testing the use of Bluetooth and Wi-Fi in Zion National Park (Utah) and Yellowstone National Park to understand how visitors move through the park, testing the use of three-dimensional cameras at the Korean War Veterans Memorial to count visitors instead of manual onsite counts, and testing the use of LBS data in northern Arizona. The NPS is also launching a national socioeconomic monitoring program. This will be a survey effort (with sampling at 24 representative parks per year), and the results will also help refine vehicle multipliers for visitor estimation.

USACE.—The USACE completed a monitoring modernization project that began in 2013. They used survey data to develop new parameters for handling recreation traffic counts and converting those counts into visitation estimates. The USACE’s most challenging areas for estimating visitation are rural areas and areas that have a variety of access points.

FWS.—A variety of approaches are used to develop recreation-use estimates at refuges. In 2018, the FWS pilot-tested a new survey system that would sample all refuges every 5 years. They are also pilot-testing a visit estimation system with a focus on hunting and visits to urban refuges. The focus is on traditional approaches, but there is interest in emerging methods.

FS.—There were three changes to the NVUM program implemented in fiscal year 2020. These changes include switching from paper surveys to using tablets, slightly modifying some survey questions to reduce confusion, and no longer requiring 24-hour mechanical traffic counters during sampling dates.

USGS.—The USGS is initiating a project to estimate ecosystem services produced in the Nisqually River delta in western Washington, using social media to estimate the provision of recreation-related ecosystem services.

Opportunities for Integrating New and Emerging Data Sources (Breakout Session 2)

In the late morning of day 2, participants broke into four small groups to discuss how the methods presented on day 1 could be used or integrated with existing approaches. The groups brainstormed three types of applications: estimating visitation, content analysis (including demographics, preferences, and values), and active crowdsourcing. The following list represents a set of possible studies, administrative actions, or uses of crowdsourced data as of April 2019.

Estimating Visitation

Estimating visitor reentry rates and multidestination itineraries.—For example, social media could be used to better estimate how people are using areas that incorporate a complex set of sites or units.

Comparing proven technology and new methods.—Deploying new technology in the field requires comparing estimates from the new technology to proven onsite counts. For example, the NPS is planning a comparison of onsite counts, Bluetooth counts, LBS counts, and potentially social media counts.

Estimating visitation in areas with multiple entry points.—Many parks and recreation areas have complex entry options, like Delaware Water Gap National Recreation Area with 87 entrances. Other examples of parks and recreation areas with multiple entry points where it is challenging to estimate visitation include the following: Pearl Harbor National Memorial (Hawai'i), National Mall and Memorial Parks (Washington, D.C.), and the Columbia River gorge (Oregon and Washington). Other challenging areas include national scenic trails, island parks, and river corridors.

Improving sampling techniques (for example, frame and stratification).—For example, the FS could use social media data to inform the estimated use levels that define the NVUM sampling frame. Social media could also be used to identify peak periods to target temporally for visitor surveys.

Measuring the effects of environmental phenomena on visitation.—For example, social media data could be used to measure temporal displacement in visitation during environmental phenomena like fires or floods.

Measuring the effects of management actions on visitation.—For example, social media data could be used to measure the response of visitation to timber harvest or wildland fire fuel management.

Understanding travel patterns and length of stay.—Data on travel patterns and length of stay have historically been collected using visitor surveys, and more research is needed to determine if these could be accurately estimated through novel data sources such as social media or mobile device data.

Better understanding of the challenges related to visitor monitoring in low-use areas.—One suggestion was to test social media models of visitation in low-visitation areas, for instance, dispersed BLM areas, wilderness, or refuges. Researchers could mix high-confidence and low-confidence areas for validation, potentially using social media as a “ground truth” to improve visitation estimates.

Improving visitor estimation in urban settings.—Urban areas may present unique challenges for estimating visitation because they are often diffuse, high-density settings with multiple access points and a mix of recreation visitors and nonrecreation visitors (for example, residents and commuters).

Content Analysis Including Demographics, Preferences, and Values

Integrating social media content analysis into public comment processes.—Public conversations on social media contain useful content that could be analyzed to better understand public opinion. The public may expect that their postings to social media are being heard by agency managers, even if the conversation is not in an agency space. For example, researchers could perform content analysis of social media as it relates to public comment processes (for instance, the National Environmental Policy Act [42 U.S.C. 4321 et seq. {1969}] or Environmental Assessments).

Improving visitor home-origin identification.—Location-based services data are tied to census blocks in the United States, whereas social media profiles may contain more (or less) detail on home location. More research is needed to determine the accuracy of home location information from novel data sources.

Combining novel data sources with existing survey methods.—Researchers could use visitor survey data and pair them with social media data on user experience and preferences to look at differences between people who visited the sites and those who did not.

Active Crowdsourcing

Understanding self-selection biases and how to mitigate them when needed.—Partnering with community-based organizations to target certain communities or respondents may be useful depending on the goals of the study or data collection.

Learning from existing crowdsourcing platforms.—For example, what can we learn or infer from existing crowdsourcing platforms, such as iNaturalist and eBird?

Establishing common terminology would be useful.—For example, what are useful terms to describe active crowdsourcing?

Creating a study design to better understand the use of chatbots.—Chatbots are a technology that facilitates an active crowdsourcing method: visitors text a phone number posted at a trailhead or parking area and answer questions related to visitor use and demographics by conversing with a programmed bot. More research is needed to understand the effectiveness of different chatbots and community science methods.

Using human ecology mapping techniques (in other words, public participatory geographic information system [GIS]).—Public participatory GIS may be a useful method for letting people map their experiences at locations across public lands.

Research and Management Needs Going Forward (Breakout Session 3)

In the early afternoon of day 2, participants broke into researcher community and management community groups to identify needs for expanded use of social media and crowdsourced data. The following is a summary of the content recorded by those groups on paper posters.

Management needs

  • Information to provide to leadership

    • Messages should be unified, brief, and concise

    • Describe the return on investment from using these data

    • A 1-page overview describing social media applications: what is possible, what it requires, what it costs, which problems it solves and does not solve, and possibly a specific request for funding

  • Testing the use of novel data in dispersed or difficult-to-measure areas

    • Particularly BLM, FWS, and Bureau of Reclamation lands

  • Clarity and consistency on potential value added and return on investment across agencies from using emerging technologies in recreation monitoring

  • A coordinated clearinghouse for materials, studies, and so forth

  • Dedicated resources in agencies to better understand the use of novel data for recreation monitoring

Research needs

  • Data processing capacity and resources

  • Automated scraping and query systems for gathering data from websites

  • Geodata on recreation site locations and infrastructure

  • Information on the return on investment in data processing

    • Budgeting, scoping, and potential benefits

    • Guidelines on best practices (including where there is a return on investment)

  • Combine opportunities where there are strategic advantages (for example, by region or by site type)

  • Ethics and privacy guidelines

  • Sampling design and units of analysis

    • Codeveloped with management questions

  • Knowledge on how to identify nonusers and substitutes

  • Visualization and communication

  • More peer reviewers who have the knowledge to review this type of work

Workshop Outcomes and Survey Results

After the 2019 workshop, the organizers distributed a survey for participants to give anonymous feedback. Of 29 attendees (not including the 4 organizers), 21 participants responded to the survey. All respondents expressed that they were either satisfied (6 out of 21) or very satisfied (15 out of 21) with the meeting, and no respondents expressed that they were neutral, dissatisfied, or very dissatisfied.

When asked if they made any new plans for projects with partners because of this meeting, 10 respondents indicated they had, and an additional 3 respondents said they were still following up with other participants and may develop new projects and partnerships. When asked if they made any new plans for projects solely within their organization, 7 respondents said they had, and another 6 indicated “potentially” or “not yet” but that projects may stem from the meeting in the future.

The survey also asked participants which method they thought had the most potential for adding value to their work. Ten people indicated “location-based services data (for counting visitors, looking at travel patterns, etc. [sic]),” four people indicated “actively solicited crowd-sourced [sic] data (apps where users can volunteer information, Short Message Service [sic] (SMS) surveys, etc. [sic]),” three people indicated “social media for counting visitors,” one person indicated “social media image content (user experience, demographics, activities, etc. [sic]),” and three people responded “other.” The “other” responses consisted of “paired actively solicited and location-based,” “social media for validating sampling frameworks, especially in dispersed settings,” and “social media content analysis (text as well as images).”

All 21 respondents expressed that this community should meet again. Nine respondents thought the community should meet every 2 years, 11 respondents thought the community should meet every year, and 1 respondent thought there should be meetings more often than once a year. No respondents indicated that meetings should be less often than every 2 years. When asked if the next meeting should be held in Washington, D.C., Colorado, or the West Coast, preferences were somewhat split. There were eight votes for Colorado, six votes for the West Coast, and five votes for Washington, D.C. (and some respondents did not answer).

Conclusion

In the last decade, there has been an increase in the available methods and data sources that could potentially be used for visitor monitoring and management on public lands and waters. Traditional sources of data for visitor monitoring include onsite methods such as trail counters and visitor intercept surveys. However, the rise of social media and mobile applications may offer the possibility of using novel sources of information to inform recreation monitoring. Two workshops were held in 2019 and 2023 to discuss the use of novel data in recreation monitoring. The overall purpose of the workshops was to convene researchers and practitioners from multiple Federal agencies to discuss the uses and limitations of novel datasets, including social media, mobile device applications, and other online secondary sources (for example, reviews or trip reports) in recreation monitoring. The goals of the workshops were to share knowledge across agencies on the state of the science and applications for using novel data and to allow participants to collectively discuss best practices for using emerging datasets to understand recreation on public lands and waters. This report provides a summary of what was presented and discussed during the 2019 and 2023 workshops on novel data in recreation monitoring.

References Cited

Baird, T., Stinger, P., Cole, E., and Collins, R., 2022, Mobile device data for parks and public lands transportation planning—A framework for evaluation and applications: Transportation Research Record, v. 2676, no. 8, p. 490–500, accessed February 6, 2023, at https://doi.org/10.1177/03611981221083911.

Bartelet, H.A., Barnes, M.L., Zoeller, K.C., and Cumming, G.S., 2022, Social adaptation can reduce the strength of social–ecological feedbacks from ecosystem degradation: People and Nature, v. 4, no. 4, p. 856–865, accessed February 6, 2023, at https://doi.org/10.1002/pan3.10322.

Breckheimer, I.K., Theobald, E.J., Cristea, N.C., Wilson, A.K., Lundquist, J.D., Rochefort, R.M., and HilleRisLambers, J., 2020, Crowd‐sourced data reveal social–ecological mismatches in phenology driven by climate: Frontiers in Ecology and the Environment, v. 18, no. 2, p. 76–82, accessed February 6, 2023, at https://doi.org/10.1002/fee.2142.

Burns, R.C., Andrew, R.G., Allen, M.E., Schwarzmann, D., and Cardozo Moreira, J., 2020, Conceptualizing the national marine sanctuary visitor counting process for marine protected areas: Journal of Ecotourism, v. 19, no. 4, p. 362–372, accessed February 6, 2023, at https://doi.org/10.1080/14724049.2020.1746794.

Dagan, D.T., and Wilkins, E.J., 2023, What is “big data” and how should we use it? The role of large datasets, secondary data, and associated analysis techniques in outdoor recreation research: Journal of Outdoor Recreation and Tourism, v. 44, article 100668, accessed December 29, 2023, at https://doi.org/10.1016/j.jort.2023.100668.

Fisher, D.M., Wood, S.A., Roh, Y.H., and Kim, C.K., 2019, The geographic spread and preferences of tourists revealed by user-generated information on Jeju Island, South Korea: Land, v. 8, no. 5, article 73, accessed February 6, 2023, at https://doi.org/10.3390/land8050073.

Fisher, D.M., Wood, S.A., White, E.M., Blahna, D.J., Lange, S., Weinberg, A., Tomco, M., and Lia, E., 2018, Recreational use in dispersed public lands measured using social media data and on-site counts: Journal of Environmental Management, v. 222, p. 465–474, accessed February 6, 2023, at https://doi.org/10.1016/j.jenvman.2018.05.045.

Ghermandi, A., 2022, Geolocated social media data counts as a proxy for recreational visits in natural areas—A meta-analysis: Journal of Environmental Management, v. 317, article 115325, accessed February 6, 2023, at https://doi.org/10.1016/j.jenvman.2022.115325.

Goebel, R., Schmaltz, A., Brackett, B.A., Wood, S.A., and Noguchi, K., 2023, Modeling and forecasting percent changes in national park visitation using social media: Journal of Forecasting, v. 42, no. 6, p. 1502–1518, accessed September 18, 2023, at https://doi.org/10.1002/for.2965.

Hamstead, Z.A., Fisher, D., Ilieva, R.T., Wood, S.A., McPhearson, T., and Kremer, P., 2018, Geolocated social media as a rapid indicator of park visitation and equitable park access: Computers, Environment and Urban Systems, v. 72, p. 38–50, accessed February 6, 2023, at https://doi.org/10.1016/j.compenvurbsys.2018.01.007.

Heikinheimo, V., Di Minin, E., Tenkanen, H., Hausmann, A., Erkkonen, J., and Toivonen, T., 2017, User-generated geographic information for visitor monitoring in a national park—A comparison of social media data and visitor survey: ISPRS International Journal of Geo-Information, v. 6, no. 3, article 85, accessed February 6, 2023, at https://doi.org/10.3390/ijgi6030085.

Horsch, E., Leggett, C., Smith, C., and Unsworth, R., 2017, Estimating the economic benefits of recreational visitation to federally-managed lands: Industrial Economics, Inc., accessed February 6, 2023, at https://www.doi.gov/sites/doi.gov/files/uploads/final.task3_.report.2017.09.18_1.pdf.

Keeler, B.L., Wood, S.A., Polasky, S., Kling, C., Filstrup, C.T., and Downing, J.A., 2015, Recreational demand for clean water—Evidence from geotagged photographs by visitors to lakes: Frontiers in Ecology and the Environment, v. 13, no. 2, p. 76–81, accessed February 6, 2023, at https://doi.org/10.1890/140124.

Lawson, M., 2018, Measuring Whitefish Trail use: Headwaters Economics web page, accessed February 6, 2023, at https://headwaterseconomics.org/economic-development/trails-pathways/whitefish-trail-use/.

Leggett, C., Horsch, E., Smith, C., and Unsworth, R., 2017, Estimating recreational visitation to federally-managed lands: Industrial Economics, Inc., accessed February 6, 2023, at https://www.doi.gov/sites/doi.gov/files/uploads/final.task1_.report.2017.04.25.pdf.

Lia, E.H., Derrien, M.M., Winder, S.G., White, E.M., and Wood, S.A., 2023, A text-messaging chatbot to support outdoor recreation monitoring through community science: Digital Geography and Society, v. 5, article 100059, accessed September 18, 2023, at https://doi.org/10.1016/j.diggeo.2023.100059.

Liang, Y., Yin, J., Pan, B., Lin, M.S., Miller, L., Taff, B.D., and Chi, G., 2022, Assessing the validity of mobile device data for estimating visitor demographics and visitation patterns in Yellowstone National Park: Journal of Environmental Management, v. 317, article 115410, accessed February 6, 2023, at https://doi.org/10.1016/j.jenvman.2022.115410.

Mashhadi, A., Winder, S.G., Lia, E.H., and Wood, S.A., 2021, No walk in the park—The viability and fairness of social media analysis for parks and recreational policy making, in Proceedings of the Fifteenth International AAAI Conference on Web and Social Media [virtual conference], June 7–10, 2021: AAAI Press, v. 15, no. 1, p. 409–420, accessed February 6, 2023, at https://doi.org/10.1609/icwsm.v15i1.18071.

Merrill, N.H., Atkinson, S.F., Mulvaney, K.K., Mazzotta, M.J., and Bousquin, J., 2020, Using data derived from cellular phone locations to estimate visitation to natural areas—An application to water recreation in New England, USA: PLOS ONE, v. 15, no. 4, article e0231863, accessed February 6, 2023, at https://doi.org/10.1371/journal.pone.0231863.

Monz, C., Creany, N., Nesbitt, J., and Mitrovich, M., 2021, Mobile device data analysis to determine the demographics of park visitors: Journal of Park and Recreation Administration, v. 39, no. 1, p. 123–130. [Also available at https://doi.org/10.18666/JPRA-2020-10541.]

Pickering, C., Walden-Schreiner, C., Barros, A., and Rossi, S.D., 2020, Using social media images and text to examine how tourists view and value the highest mountain in Australia: Journal of Outdoor Recreation and Tourism, v. 29, article 100252, accessed February 6, 2023, at https://doi.org/10.1016/j.jort.2019.100252.

Sachdeva, S., Emery, M.R., and Hurley, P.T., 2018, Depiction of wild food foraging practices in the media—Impact of the Great Recession: Society and Natural Resources, v. 31, no. 8, p. 977–993, accessed February 6, 2023, at https://doi.org/10.1080/08941920.2018.1450914.

Sessions, C., Wood, S.A., Rabotyagov, S., and Fisher, D.M., 2016, Measuring recreational visitation at US national parks with crowd-sourced photographs: Journal of Environmental Management, v. 183, p. 703–711, accessed February 6, 2023, at https://doi.org/10.1016/j.jenvman.2016.09.018.

Sinclair, M., Mayer, M., Woltering, M., and Ghermandi, A., 2020, Using social media to estimate visitor provenance and patterns of recreation in Germany’s national parks: Journal of Environmental Management, v. 263, article 110418, 12 p., accessed February 6, 2023, at https://doi.org/10.1016/j.jenvman.2020.110418.

Sonter, L.J., Watson, K.B., Wood, S.A., and Ricketts, T.H., 2016, Spatial and temporal dynamics and value of nature-based recreation, estimated via social media: PLOS ONE, v. 11, no. 9, article e0162372, accessed February 6, 2023, at https://doi.org/10.1371/journal.pone.0162372.

Tenkanen, H., Di Minin, E., Heikinheimo, V., Hausmann, A., Herbst, M., Kajala, L., and Toivonen, T., 2017, Instagram, Flickr, or Twitter—Assessing the usability of social media data for visitor monitoring in protected areas: Scientific Reports, v. 7, no. 1, article 17615, accessed February 6, 2023, at https://doi.org/10.1038/s41598-017-18007-4.

Tsai, W.-L., Merrill, N.H., Neale, A.C., and Grupper, M., 2023, Using cellular device location data to estimate visitation to public lands—Comparing device location data to U.S. National Park Service’s visitor use statistics: PLOS ONE, v. 18, no. 11, article e0289922, accessed December 28, 2023, at https://doi.org/10.1371/journal.pone.0289922.

Twichell, J., Merrill, N., Mulvaney, K., and Altamirano, K., 2021, How do we use our coasts?: Narragansett Bay Estuary Program web page, accessed February 6, 2023, at https://storymaps.arcgis.com/stories/b994fadc18bb4f1bb82dea62956c3139.

U.S. Department of Agriculture Forest Service, 2023, Future of America’s forests and rangelands—Forest Service 2020 resources planning act assessment: Washington, D.C., U.S. Department of Agriculture Forest Service, General Technical Report WO-102, 348 p., accessed February 6, 2023, at https://doi.org/10.2737/WO-GTR-102.

Wilkins, E.J., Chikamoto, Y., Miller, A.B., and Smith, J.W., 2021, Climate change and the demand for recreational ecosystem services on public lands in the continental United States: Global Environmental Change, v. 70, article 102365, accessed February 6, 2023, at https://doi.org/10.1016/j.gloenvcha.2021.102365.

Wilkins, E.J., Howe, P.D., and Smith, J.W., 2021, Social media reveal ecoregional variation in how weather influences visitor behavior in US National Park Service units: Scientific Reports, v. 11, no. 1, article 2403, 12 p., accessed February 6, 2023, at https://doi.org/10.1038/s41598-021-82145-z.

Wilkins, E.J., Van Berkel, D., Zhang, H., Dorning, M.A., Beck, S.M., and Smith, J.W., 2022, Promises and pitfalls of using computer vision to make inferences about landscape preferences—Evidence from an urban-proximate park system: Landscape and Urban Planning, v. 219, article 104315, accessed February 6, 2023, at https://doi.org/10.1016/j.landurbplan.2021.104315.

Wilkins, E.J., Wood, S.A., and Smith, J.W., 2021, Uses and limitations of social media to inform visitor use management in parks and protected areas—A systematic review: Environmental Management, v. 67, no. 1, p. 120–132, accessed February 6, 2023, at https://doi.org/10.1007/s00267-020-01373-7.

Wilson, A., Bacher, K., Breckheimer, I., Lundquist, J., Rochefort, R., Theobald, E., Whiteaker, L., and HilleRisLambers, J., 2017, Monitoring wildflower phenology using traditional science, citizen science, and crowdsourcing approaches: Park Science, v. 33, no. 1, p. 17–26.

Winder, S.G., Lee, H., Seo, B., Lia, E.H., and Wood, S.A., 2022, An open‐source image classifier for characterizing recreational activities across landscapes: People and Nature, v. 4, no. 5, p. 1249–1262, accessed February 6, 2023, at https://doi.org/10.1002/pan3.10382.

Wood, S.A., Guerry, A.D., Silver, J.M., and Lacayo, M., 2013, Using social media to quantify nature-based tourism and recreation: Scientific Reports, v. 3, no. 1, article 2976, accessed February 6, 2023, at https://doi.org/10.1038/srep02976.

Wood, S.A., Winder, S.G., Lia, E.H., White, E.M., Crowley, C.S., and Milnor, A.A., 2020, Next-generation visitation models using social media to estimate recreation on public lands: Scientific Reports, v. 10, no. 1, article 15419, accessed February 6, 2023, at https://doi.org/10.1038/s41598-020-70829-x.

Appendix 1. 2017 Interagency Workshop on Recreation Visitation Data

On March 20, 2017, there was an interagency workshop on recreation visitor data held at the U.S. Department of the Interior (DOI) building in Washington, D.C. The objectives of this workshop were to (1) review existing methods that Federal land and water management agencies use to collect recreation visitation data, (2) discuss how the data are used by individuals and organizations inside and outside of the Federal Government, (3) discuss opportunities for improving existing data collection methods and coordination across agencies, and (4) review alternative and emerging data collection methods.
This project was funded by the DOI Office of Policy Analysis (PPA) under the Service First authority, a grant program for interagency projects. The project included planning and hosting the 2017 workshop and developing two reports—one on methods used across agencies and another on the economic value and contributions of recreation. The report on methods described how Federal land and water managing agencies estimate and characterize visitation on their lands and focused on the National Park Service, Bureau of Land Management, U.S. Fish and Wildlife Service, U.S. Department of Agriculture Forest Service, Bureau of Reclamation, and U.S. Army Corps of Engineers (now published, refer to Leggett and others, 2017). That report also detailed recommendations for improving the collection, documentation, and distribution of visitation data and described new technologies for visitor counting that could be explored in the future. These topics were discussed during the 2017 workshop, and the report was in part based on the workshop discussions (Leggett and others, 2017). The report on economic valuation focused on how Federal agencies measure the value of recreation, including estimates of visitors’ willingness to pay and estimates of economic contributions of visitor expenditures related to recreation on Federal lands and waters in 2016 (now published, refer to Horsch and others, 2017). The content of the 2017 workshop discussions is summarized in the report on methods (Leggett and others, 2017).
The 2017 workshop was attended by Federal staff from the following agencies or programs: the National Park Service, Bureau of Land Management, U.S. Fish and Wildlife Service, U.S. Department of Agriculture Forest Service, Bureau of Reclamation, U.S. Army Corps of Engineers, DOI PPA, National Oceanic and Atmospheric Administration, U.S. Geological Survey, Department of Transportation, Recreation.gov, and Federal Lands Recreation Enhancement Act of 2004 (16 U.S.C. 6804 et seq.) Interagency Pass Program. Consultants to PPA from Industrial Economics, Inc., Bedrock Statistics, and Resource Systems Group also attended and helped facilitate the workshop.

References Cited

Horsch, E., Leggett, C., Smith, C., and Unsworth, R., 2017, Estimating the economic benefits of recreational visitation to federally-managed lands: Industrial Economics, Inc., accessed February 6, 2023, at https://www.doi.gov/sites/doi.gov/files/uploads/final.task3_.report.2017.09.18_1.pdf.

Leggett, C., Horsch, E., Smith, C., and Unsworth, R., 2017, Estimating recreational visitation to federally-managed lands: Industrial Economics, Inc., accessed February 6, 2023, at https://www.doi.gov/sites/doi.gov/files/uploads/final.task1_.report.2017.04.25.pdf.

Appendix 2. List of Participants, Novel Data in Recreation Monitoring Workshop, 2023

Twenty-four people participated in the February 2023 workshop in Fort Collins, Colorado (table 2.1). In-person participants were from eight Federal agencies and two universities. Sessions with presentations or panels had an option for additional attendees to join remotely. For this workshop, the organizers decided to invite only Federal agency representatives and university partners; researchers and consultants from private industry were not included.

Table 2.1. Participants of the 2023 workshop on novel data in recreation monitoring (this list does not include additional participants who joined sessions remotely).

[Workshop organizers are indicated with *]

Name Agency or university
Karla Rogers Bureau of Land Management
Larry Ridenhour Bureau of Land Management
Rebecca Moore Bureau of Land Management
Christian Crowley* U.S. Department of the Interior Office of Policy Analysis
Nate Merrill Environmental Protection Agency
Chris Giguere National Oceanic and Atmospheric Administration
Danielle Schwarzmann National Oceanic and Atmospheric Administration
Seann Regan National Oceanic and Atmospheric Administration
Claire Spalding National Park Service
David Pettebone National Park Service
Pam Ziesler National Park Service
Rachel Collins National Park Service
Wylie Carr National Park Service
Natalie Sexton U.S. Fish and Wildlife Service
Eric White* U.S. Department of Agriculture Forest Service
Monika Derrien U.S. Department of Agriculture Forest Service
Sarah Cline U.S. Department of Agriculture Forest Service
Emily Wilkins* U.S. Geological Survey
Nick Cole U.S. Geological Survey
Rudy Schuster* U.S. Geological Survey
Ben Fowler Clemson University
Dani Dagan Clemson University
Sama Winder University of Washington
Spencer Wood* University of Washington

Appendix 3. Agenda, Novel Data in Recreation Monitoring Workshop, 2023

This appendix contains the agenda for the 2023 workshop on novel data in recreation monitoring. This agenda was distributed to participants before the meeting.
Novel Data in Recreation Monitoring Workshop
February 1–2, 2023
U.S. Geological Survey Fort Collins Science Center
Fort Collins, Colorado
** Denotes sessions available to join remotely by Teams
Wednesday, February 1
8:00–8:30 a.m.: Sign in
8:30–9:00 a.m.: Introductions
9:00–9:15 a.m.: Review of the main takeaways from past meetings (in 2017 and 2019)
9:15–10:00 a.m.: **“State of the Science”—Featured presentation
10:00–10:30 a.m.: Break
10:30 a.m.–noon: “State of the Application”—“Lightning-round” talks
  • 10-minute presentations from each agency on current uses of novel data sources for visitor monitoring, or how novel data might help answer agency questions

Noon–1:30 p.m.: Lunch break
1:30–2:45 p.m.: Breakout session 1—Examples of success (and need for additional examples)
  • What are the shining examples of success (using novel data for recreation monitoring) in recent years (since the last workshop in 2019)? Where would we like to see more examples?

  • 15-minute breakout session; 1-hour whole-group discussion

2:45–3:00 p.m.: Break
3:00–4:15 p.m.: **Use-case presentations—Four use-cases that have evaluated novel data
4:15–4:45 p.m.: Whole-group discussion and day 1 wrap-up
6:30 p.m. onward: Dinner
Thursday, February 2
8:00–8:30 a.m.: Sign in
8:30–9:00 a.m.: Reflect on day 1; objectives for day 2
9:00–10:00 a.m.: Breakout session 2—Concerns with using novel data
  • What are the main concerns with current applications, current research, and future directions for the use of social media, mobile device data, or other novel data for recreation monitoring?

  • 15-minute breakout session; 45-minute whole-group discussion

10:00–10:30 a.m.: Break
10:30 a.m.–noon: **Panel on mobile device data—Issues and guidance on data acquisition and use
Noon–1:30 p.m.: Lunch break
1:30–2:45 p.m.: Best practices document—Whole-group working session
  • Updating the best practices document created in 2019 (for social media data sources) with recent findings and data sources

2:45–3:00 p.m.: Break
3:00–3:45 p.m.: Breakout session 3—Research agenda and future directions
  • What are the research needs to inform practice? What new projects could be rolled out now (if funding were unlimited)?

  • 15-minute breakout session; 30-minute whole-group discussion

3:45–4:15 p.m.: Closing thoughts—Whole-group discussion
4:15–4:45 p.m.: Next steps
6:30 p.m. onward: Dinner

Appendix 4. List of Participants, Novel Data in Recreation Monitoring Workshop, 2019

Thirty-three people participated in the April 2019 workshop in Fort Collins, Colorado (table 4.1). Participants represented Federal agencies, universities, private industry, and nonprofits. For this workshop, which was the first to focus specifically on novel data in recreation monitoring, the organizers decided to invite a diversity of researchers and practitioners working on this topic from a variety of sectors.

Table 4.1. Participants of the 2019 workshop on novel data in recreation monitoring.

[Workshop organizers are indicated with *]

Name Agency, university, or company
Larry Ridenhour Bureau of Land Management
Christian Crowley* U.S. Department of the Interior Office of Policy Analysis
Christopher Lauer National Oceanic and Atmospheric Administration
Adam Milnor* National Park Service
Bret Meldrum National Park Service
Pam Ziesler National Park Service
Rachel Collins National Park Service
Dena Williams U.S. Army Corps of Engineers
Jerome Jackson Bureau of Reclamation
Georgia Basso U.S. Fish and Wildlife Service
Natalie Sexton U.S. Fish and Wildlife Service
Don English U.S. Department of Agriculture Forest Service
Eric White* U.S. Department of Agriculture Forest Service
Monika Derrien U.S. Department of Agriculture Forest Service
Rebecca Rasch U.S. Department of Agriculture Forest Service
Sonya Sachdeva U.S. Department of Agriculture Forest Service
Cathy Thomas U.S. Geological Survey
Rudy Schuster U.S. Geological Survey
Travis Poitras U.S. Geological Survey
Ian Breckheimer Harvard University
Nathan Reigner University of Akureyri (Iceland)
Ryan Noe University of Minnesota
Emmi Lia University of Washington
Sama Winder University of Washington
Spencer Wood* University of Washington
Lisa Majewski University of Wuertenburg (Germany)
Mary Allen West Virginia University
Robert Burns West Virginia University
Jeremy Wimpey Applied Trails Research
Scott Story Headwaters Economics
Erik Murdock Outdoor Alliance
Levi Rose Outdoor Alliance
Nate Irwin Trailhead Labs

Appendix 5. Agenda, Novel Data in Recreation Monitoring Workshop, 2019

This appendix contains the agenda for the 2019 workshop on novel data in recreation monitoring. This agenda was distributed to participants before the meeting.
Novel Data in Recreation Monitoring Workshop
April 3–4, 2019
U.S. Geological Survey Fort Collins Science Center
Fort Collins, Colorado
Wednesday, April 3—Focus on the state of the knowledge, methods, and applications
8:00–8:30 a.m.: Welcome and sign in
8:30–9:15 a.m.: Introductions and meeting overview
  • 2017 workshop review and 2019 project and meeting objectives

9:15–10:00 a.m.: Opening discussion—Current state of knowledge and practice in using social media as recreation data
  • Spencer Wood, University of Washington

10:00–10:30 a.m.: Break
10:30 a.m.–noon: Presentations—State-of-the-art applied research and applications
  • Emmi Lia and Sama Winder, University of Washington

  • Ian Breckheimer, Harvard University

Noon–1:30 p.m.: Lunch (on your own)
1:30–3:00 p.m.: Presentations—State-of-the-art applied research and applications
  • Sonya Sachdeva, U.S. Department of Agriculture Forest Service Northern Research Station

  • Scott Story, Headwaters Economics

  • Nate Irwin, Trailhead Labs

3:00–3:30 p.m.: Break
3:30–4:45 p.m.: Small group discussions—Your reactions to the approaches and applications
4:45–5:00 p.m.: Closeout for day 1
5:30 p.m.: Optional hikes
7:00 p.m.: Optional dinners
Thursday, April 4—Focus on collaboration and moving forward
8:00–8:30 a.m.: Welcome and sign in
8:30–9:00 a.m.: Participant reflections on day 1; hopes for day 2
9:00–10:30 a.m.: Updates on Federal agency recreation monitoring programs
  • Bureau of Land Management

  • U.S. Fish and Wildlife Service

  • National Oceanic and Atmospheric Administration

  • National Park Service

  • Bureau of Reclamation

  • U.S. Army Corps of Engineers

  • U.S. Department of Agriculture Forest Service

  • U.S. Geological Survey

10:30–11:00 a.m.: Break
11:00 a.m.–12:30 p.m.: “Igniting the Science of Outdoor Recreation”—Identified strategies
  • Small group discussions—Opportunities for integrating new and emerging data sources and methods with traditional approaches

12:30–2:00 p.m.: Lunch (on your own)
2:00–2:30 p.m.: Group exploration—Social media scrape and data from the Front Range
2:30–3:15 p.m.: Small group discussion—Our research and management needs, and actions going forward
3:15–3:45 p.m.: Break
3:45–4:45 p.m.: Closing group discussion
  • Personal takeaways

  • Putting the pieces together—Thoughts and suggestions for advancing recreation monitoring

  • Feedback on interagency pilot research effort and next opportunities

  • Followup 2019 stakeholders meeting in Washington, D.C.

4:45–5:00 p.m.: Meeting closeout
5:30 p.m.: Optional hikes
7:00 p.m.: Optional dinners

Abbreviations

BLM

Bureau of Land Management

DOI

U.S. Department of the Interior

EPA

Environmental Protection Agency

FS

U.S. Department of Agriculture Forest Service

FWS

U.S. Fish and Wildlife Service

GIS

geographic information system

LBS

location-based services

NOAA

National Oceanic and Atmospheric Administration

NPS

National Park Service

NVUM

National Visitor Use Monitoring

PPA

Office of Policy Analysis

USACE

U.S. Army Corps of Engineers

USGS

U.S. Geological Survey

Publishing support provided by the Science Publishing Network,

Denver Publishing Service Center

For more information concerning the research in this report, contact the

Center Director, USGS Fort Collins Science Center

2150 Centre Ave., Bldg. C

Fort Collins, CO 80526-8118

(970) 226-9100

Or visit the Fort Collins Science Center website at:

https://www.usgs.gov/centers/fort

Disclaimers

Any use of trade, firm, or product names is for descriptive purposes only and does not imply endorsement by the U.S. Government.

Although this information product, for the most part, is in the public domain, it also may contain copyrighted materials as noted in the text. Permission to reproduce copyrighted items must be secured from the copyright owner.

Suggested Citation

Wilkins, E.J., Crowley, C.S.L., White, E.M., Wood, S.A., and Schuster, R., 2024, Novel data in recreation monitoring—Summary proceedings from interagency workshops in 2019 and 2023: U.S. Geological Survey Scientific Investigations Report 2024–5013, 24 p., https://doi.org/10.3133/sir20245013.

ISSN: 2328-0328 (online)
