Measuring socio-economic indicators from street view imagery

Authors: Fan Zhang*, Senseable City Lab, Massachusetts Institute of Technology; Yuhao Kang, Geospatial Data Science Lab, Department of Geography, University of Wisconsin-Madison; Song Gao, Geospatial Data Science Lab, Department of Geography, University of Wisconsin-Madison; Carlo Ratti, Senseable City Lab, Massachusetts Institute of Technology
Topics: Geographic Information Science and Systems, Urban Geography
Keywords: Urban studies, data mining, street view imagery, deep learning
Session Type: Paper


The reciprocal interactions between urban physical environments and socio-economic environments have long been of interest to a wide variety of fields. However, efforts to quantify this relationship have been limited by the lack of efficient computational tools and of appropriate representations of the urban physical environment.
In this work, we use street view imagery as a proxy for the urban physical environment and show how a deep convolutional neural network (DCNN) can be trained to measure neighborhood-level socio-economic indicators (population, income, ethnicity, unemployment) and human activity patterns (the hourly variation of visitor numbers) from street view images. All indicators are predicted simultaneously by a single multi-task DCNN model with high accuracy, and the model's transferability is evaluated across ten cities in the United States. More importantly, we demonstrate that cross-city prediction accuracy can serve as a metric of the uneven development between a city's physical environment and its socio-economic environment. This work is significant both scientifically, in shedding light on the connections between the physical settings and human activities of a place, and practically, in leveraging street view imagery to derive fine-scale socio-economic information for an urban area and to support urban planners and policymakers in urban studies and planning.
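The multi-task setup described above can be pictured as a shared convolutional backbone with one prediction head per indicator, trained jointly on a combined loss. The sketch below is a minimal illustration of that idea, not the authors' actual implementation: the ResNet-50 backbone, the head names, and the unweighted sum of mean-squared-error losses are all assumptions made for the example.

```python
# Minimal sketch of a multi-task DCNN for street view imagery.
# Backbone choice, head names, and the loss formulation are
# illustrative assumptions, not the authors' implementation.
import torch
import torch.nn as nn
from torchvision import models

class MultiTaskStreetViewNet(nn.Module):
    def __init__(self, n_hours=24):
        super().__init__()
        backbone = models.resnet50(weights=None)
        feat_dim = backbone.fc.in_features
        backbone.fc = nn.Identity()          # shared image encoder
        self.backbone = backbone
        # One regression head per neighborhood-level indicator.
        self.heads = nn.ModuleDict({
            "population":   nn.Linear(feat_dim, 1),
            "income":       nn.Linear(feat_dim, 1),
            "ethnicity":    nn.Linear(feat_dim, 1),
            "unemployment": nn.Linear(feat_dim, 1),
            "visitors":     nn.Linear(feat_dim, n_hours),  # hourly visitor curve
        })

    def forward(self, x):
        feats = self.backbone(x)
        return {name: head(feats) for name, head in self.heads.items()}

# Joint training minimizes the sum of per-task losses, so all
# indicators are predicted simultaneously from the shared features.
model = MultiTaskStreetViewNet()
x = torch.randn(8, 3, 224, 224)              # batch of street view images
targets = {k: torch.randn(8, head.out_features)
           for k, head in model.heads.items()}
preds = model(x)
loss = sum(nn.functional.mse_loss(preds[k], targets[k]) for k in preds)
loss.backward()
```

Under this reading, the cross-city evaluation amounts to training such a model on one city's imagery and census-derived targets and measuring its prediction accuracy on another city's imagery; the drop in accuracy then indexes how differently the two cities' physical and socio-economic environments relate.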
