Authors: Yaping Cai*, University of Illinois, Shaowen Wang, University of Illinois, Kaiyu Guan, University of Illinois, Jian Peng, University of Illinois
Topics: Geographic Information Science and Systems, Remote Sensing, Land Use and Land Cover Change
Keywords: Crop type classification, workflow, high-performance, time-series data
Session Type: Paper
Start / End Time: 3:20 PM / 5:00 PM
Room: Astor Ballroom I, Astor, 2nd Floor
Accurate crop type classification based on remote sensing data is important for both scientific and practical purposes. State-of-the-art research on crop type classification has shifted from relying solely on the spectral features of single static images to combining spectral and time-series information. However, advanced crop type classification methods that combine spectral and time-series information through machine learning (especially artificial neural networks) have proved to be both data-intensive and computationally intensive, which limits their application to large geographic areas. We therefore investigated the current workflow of such methods and identified computational bottlenecks in the data preprocessing and model building procedures. We then proposed a new workflow that speeds up data preprocessing by processing each image in parallel to create key-value intermediate results and applying the map-reduce framework to generate training samples from those results, and that accelerates model building by using CPUs for hyper-parameter selection and GPUs for final model training. We conducted a case study over the State of Illinois, using Landsat Surface Reflectance data covering Illinois in 2016 as model input and the 2016 Cropland Data Layer as ground truth. Applying the new workflow, we tested crop type classification efficiency while maintaining classification accuracy. The new workflow shows great potential for high-performance crop type classification over large geographic areas.
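The key-value preprocessing step described above can be sketched in a map-reduce style: each image is mapped independently (and thus in parallel) to per-pixel key-value pairs, and a reduce step groups the pairs by pixel to assemble time-series samples. This is a minimal illustrative sketch; all function names and the toy data are assumptions, not the authors' implementation.

```python
from collections import defaultdict

def map_image(date, image):
    """Map phase: emit one (pixel_id, (date, bands)) pair per pixel
    of a single image. Each call is independent, so images can be
    processed in parallel."""
    return [(pixel_id, (date, bands)) for pixel_id, bands in image.items()]

def reduce_samples(pairs):
    """Reduce phase: group emitted pairs by pixel and sort by date
    to form the time-series sample for each pixel."""
    grouped = defaultdict(list)
    for pixel_id, obs in pairs:
        grouped[pixel_id].append(obs)
    return {pid: [bands for _, bands in sorted(obs_list)]
            for pid, obs_list in grouped.items()}

# Toy example: two acquisition dates, two pixels, two spectral bands each.
images = {
    "2016-05-01": {0: (0.12, 0.30), 1: (0.10, 0.28)},
    "2016-06-02": {0: (0.15, 0.45), 1: (0.11, 0.33)},
}

pairs = []
for date, image in images.items():
    pairs.extend(map_image(date, image))  # parallelizable map step

samples = reduce_samples(pairs)
print(samples[0])  # per-pixel time series ordered by date
```

Each resulting sample is one pixel's spectral observations ordered in time, which is the form of input a time-series classifier (such as a recurrent neural network) would consume.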