Segmentation and machine learning with hyper-spatial imagery: deforestation of the Maya Biosphere Reserve

Authors: Nate Currit*, Texas State University, Jennifer Devine, Texas State University, Yunuen Reygadas, Texas State University, Gabrielle Allen, Texas State University
Topics: Land Use and Land Cover Change, Remote Sensing, Coupled Human and Natural Systems
Keywords: deforestation, conservation, Guatemala, machine learning
Session Type: Paper
Day: 4/4/2019
Start / End Time: 5:00 PM / 6:40 PM
Room: Jackson, Marriott, Mezzanine Level


Deforestation rates in protected areas in Peten, Guatemala are among the highest in the world. While some policy circles debate the relative contributions of farming and cattle ranching to deforestation, ethnographic research strongly suggests that narco-capitalized, illegal cattle ranching is the primary driver. Forthcoming research based on manual assessment of hyperspatial imagery indicates that, in 2015, a relatively small percentage of land in Guatemalan protected areas was dedicated to farming and a large percentage to cattle ranching. This research builds on that manual assessment to determine multi-year trends in the relative contributions of farming and ranching to deforestation. We apply Geographic Object-Based Image Analysis (GEOBIA) and machine learning techniques to classify regions of farming and cattle ranching in 2011, 2012, 2013, and 2015. First, the Felzenszwalb segmentation algorithm delineates geographic objects consisting of contiguous pixels with similar spectral responses. Second, a Support Vector Machine (SVM) assigns land cover labels, including farming and cattle ranching, to each segment. The classification model is trained on data samples from 2015 and tested on separate samples from the same year. The final trained model is then applied to all previous years, and accuracy statistics are derived from a sampling of manually classified images from each year.
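
A minimal sketch of this two-step workflow is given below, using the Felzenszwalb implementation in scikit-image and the SVM classifier in scikit-learn. The file names, segmentation parameters, per-segment feature set, and label values are illustrative assumptions, not the configuration actually used in this study.

    import numpy as np
    from skimage import io
    from skimage.segmentation import felzenszwalb
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import classification_report

    # Load a hyperspatial image tile (hypothetical file; rows x cols x bands).
    image = io.imread("tile_2015.tif")

    # Step 1: Felzenszwalb segmentation groups contiguous pixels with similar
    # spectral responses into geographic objects (one segment ID per pixel).
    segments = felzenszwalb(image, scale=100, sigma=0.8, min_size=50)

    # Step 2a: summarize each segment with simple spectral statistics
    # (per-band mean and standard deviation) to use as SVM features.
    def segment_features(image, segments):
        feats = []
        for seg_id in np.unique(segments):
            pixels = image[segments == seg_id]      # (n_pixels, n_bands)
            feats.append(np.hstack([pixels.mean(axis=0), pixels.std(axis=0)]))
        return np.vstack(feats)

    X = segment_features(image, segments)

    # Step 2b: hypothetical 2015 training labels, one per segment
    # (e.g., 0 = forest, 1 = farming, 2 = cattle ranching), assumed to be
    # digitized from the manual assessment and ordered like np.unique(segments).
    y = np.load("segment_labels_2015.npy")

    # Train on 2015 samples and test on separate 2015 samples.
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, stratify=y, random_state=0)
    svm = SVC(kernel="rbf", C=10, gamma="scale")
    svm.fit(X_train, y_train)
    print(classification_report(y_test, svm.predict(X_test)))

    # The fitted model would then be applied to segment features computed the
    # same way from the 2011, 2012, and 2013 imagery, with accuracy assessed
    # against manually classified samples from each year.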
