Authors: Edwin Chow*, Texas State University
Topics: Geographic Information Science and Systems, Hazards, Risks, and Disasters, Spatial Analysis & Modeling
Keywords: flood depth, big data analytics, social media, machine learning, computer vision
Session Type: Paper
About 95 percent of geotagged tweets and crowdsourced data relevant to Hurricane Harvey were multimedia (i.e., images or videos), but the contextual and objective flood analytics (e.g., water depth, coverage, and flood damage) embedded in that multimedia remain an untapped resource. The primary objective of this research is to develop an innovative framework to extract flood analytics from multiple sources of big data for smart emergency management. This research proposes to harvest flood observations from multisource data, including but not limited to social media, crowdsourced data, and surveillance cameras, during Hurricane Harvey in 2017. We will then develop novel algorithms in Computer Vision and Geographic Information Science to extract water depth from heterogeneous text, multimedia, and geospatial inputs. These discrete flood observations will be validated against authoritative field data and integrated into a space-time database. By extracting water depth from heterogeneous data sources, this research examines whether the derived water depths differ significantly among text, multimedia, and geospatial inputs. The flood database and novel algorithms derived from this study will support mapping of flood stage and related flood analytics in near-real time.
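One common way to derive water depth from a single image is to compare a partially submerged reference object of known real-world height (a stop sign, a vehicle, a door frame) against its visible portion above the waterline. The sketch below illustrates that idea only; the abstract does not specify the framework's actual algorithms, and the function name, parameters, and sample values here are all hypothetical.

```python
# Hypothetical sketch: estimate flood depth from a photo in which a
# reference object of known height is partially submerged. Assumes the
# object's full pixel extent (if unobstructed) and its visible pixel
# extent above the waterline have already been measured, e.g. by an
# object detector.

def estimate_water_depth(ref_height_m, ref_pixels_full, ref_pixels_visible):
    """Estimate water depth in meters from the visible fraction of a
    reference object.

    ref_height_m       -- true height of the reference object in meters
    ref_pixels_full    -- pixel height the object spans when fully visible
    ref_pixels_visible -- pixel height of the portion above the waterline
    """
    if ref_pixels_full <= 0 or ref_pixels_visible < 0:
        raise ValueError("pixel extents must be non-negative and full > 0")
    # Convert from image space to real-world units via the known object.
    meters_per_pixel = ref_height_m / ref_pixels_full
    submerged_pixels = ref_pixels_full - ref_pixels_visible
    return max(0.0, submerged_pixels * meters_per_pixel)

# Example: a 2.1 m sign spans 300 px when dry, but only 200 px are
# visible above water, implying roughly 0.7 m of standing water.
depth = estimate_water_depth(2.1, 300, 200)
```

In practice such estimates would carry per-observation uncertainty (detector error, perspective distortion), which is one reason the discrete observations are validated against authoritative field data before entering the space-time database.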