Authors: Chengbi Liu*, George Mason University
Topics: Geographic Information Science and Systems, Communication
Keywords: Augmented Reality, GIS, social media
Session Type: Paper
Start / End Time: 5:00 PM / 6:40 PM
Room: Wilson B, Marriott, Mezzanine Level
Presentation File: No File Uploaded
With the prevalence of smartphones equipped with positioning modules, location-based social networks (LBSN) enable users to easily generate, view, and interact with quantitative and qualitative spatial information, making them an important data source for GIScience researchers today. Traditionally, LBSN record spatial location at the Point of Interest (POI) level, as an address string or a set of coordinates. However, the locality information of users, such as the brightness, colorfulness, or particular objects/events in the surrounding environment when people create content, is usually neglected. With the rise of AR (augmented reality)-integrated LBSN, information about users' surrounding environments has become readily available. In this study, we introduce a novel approach to extracting and summarizing such hyper-local visual information from WallaMe, an AR-integrated LBSN. Specifically, we compute brightness and colorfulness indices of users' backgrounds from pictures uploaded to WallaMe. We then compare these indices with characteristics of the user-generated content (UGC), such as sentiment value and text length, to test for correlations. Finally, spatial analysis is applied to identify spatial patterns. The analyses yielded positive results. This research provides a new approach to analyzing hyper-local visual characteristics generated by AR-integrated LBSN and demonstrates a potential connection between characteristics of the surrounding environment and UGC.
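The brightness and colorfulness indices could be computed along the following lines. The abstract does not specify the exact formulas used, so this sketch assumes two common choices: mean perceived luminance (ITU-R BT.601 weights) for brightness, and the Hasler–Süsstrunk opponent-channel metric for colorfulness; the function name and pixel-list input format are illustrative.

```python
import math

def brightness_and_colorfulness(pixels):
    """Sketch of per-image indices from a list of (R, G, B) pixels
    with channel values in [0, 255].

    Brightness: mean perceived luminance (ITU-R BT.601 weights).
    Colorfulness: Hasler-Suesstrunk metric on the rg/yb opponent
    channels (an assumed choice; the study's exact index may differ).
    """
    n = len(pixels)
    brightness = sum(0.299 * r + 0.587 * g + 0.114 * b
                     for r, g, b in pixels) / n

    # Opponent color channels used by the colorfulness metric.
    rg = [r - g for r, g, b in pixels]
    yb = [0.5 * (r + g) - b for r, g, b in pixels]

    def mean(xs):
        return sum(xs) / len(xs)

    def std(xs):
        m = mean(xs)
        return math.sqrt(sum((x - m) ** 2 for x in xs) / len(xs))

    # Combine the spread and the mean magnitude of the opponent channels.
    colorfulness = (math.sqrt(std(rg) ** 2 + std(yb) ** 2)
                    + 0.3 * math.sqrt(mean(rg) ** 2 + mean(yb) ** 2))
    return brightness, colorfulness
```

A uniform gray image scores zero colorfulness, while a saturated red image scores high, so the index separates drab from vivid backgrounds as intended.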