Critical GeoAI

Type: Virtual Paper
Sponsor Groups: Digital Geographies Specialty Group, Geographic Information Science and Systems Specialty Group
Day: 4/9/2021
Start / End Time: 11:10 AM / 12:25 PM (PDT)
Room: Virtual 47
Organizers: Renee Sieber, Jeremy Crampton
Chairs: Renee Sieber

Call for Submissions

A substantial literature has emerged to critique artificial intelligence (AI) and machine learning (ML) (e.g., Fairness, Accountability, and Transparency in ML [FAccT/ML]: Mittelstadt et al. 2016; Pasquale 2015; Eubanks 2018). With few exceptions (Kwan 2016), geographers' voices have been largely absent from this critique of AI/ML (especially of deep learning with neural networks). This absence persists despite the prevalent use of geographic examples to illustrate the societal impacts of AI (e.g., insurance costs based on commuting paths; Pasquale 2015) and despite critiques from within computer science of popular deep learning methods used in geography (Hinton 2014). The gap is all the more significant because geographers have long researched AI/ML (cf. Couclelis 1986; Openshaw and Openshaw 1997; Estes et al. 1986) and GeoAI has recently re-emerged as a subset of AI (e.g., Janowicz et al. 2020). Instead, critiques have been led by computer scientists (Dwork 2017 on debiasing; Buolamwini and Gebru 2018 on race and gender bias) and lawyers (Edwards and Veale 2018 on the explainability of AI outcomes and data rights). We propose an area of study called Critical GeoAI that analyzes AI through an explicitly spatial lens. For this session (or sessions), we are looking for abstracts that consider topics including, but not limited to, the following:
- The geo-political economy of AI (e.g., urban-rural and north-south divides, concentrations of AI resources that effectively create AI city-states, the location of IP and patents, trade agreements)
- Feature detection of space and the classification of place; applications of location (e.g., zip codes, space-time trajectories) and place names in AI/ML that discriminate against people and places
- Moving beyond FAccT/debiasing to AI and structural racism (Benjamin 2019; Noble 2018)
- Critiques of innovation and disruption, and alternatives such as slow infrastructures (Barlow and Drew 2020), slow computing (Kitchin and Fraser 2020), and slow AI (Crampton 2020)
- The role of civil society in automated decision-making (including the ability to counter AI)
- The intersection between Critical Data Studies and Critical GeoAI, for example the role of data in AI, its collection and curation, and the geographical sources of training data
- The role of optimization and performance metrics; the geography of standards-setting, ethics, and trust frameworks
- “Local AI” and, drawing on Castells (2010), the spatial logic of AI “flows”: how AI “travels” between its sources of origin and places of application
- Algorithmic colonization, extractivism, and Western complicity (Birhane 2020)
- The scalar nature of AI (e.g., digital infrastructures such as data centers)
- The physicality of AI via its environmental impacts, including calls for a sustainable AI
- Carceral and decarceral AI

Ultimately, we seek to address what is special about spatial in the critique of AI. How does a geographic lens differ from a legal, computational, communications or political science lens? THIS IS A VIRTUAL SESSION. Please send your abstracts of 250 words or less to renee.sieber and jeremy.crampton@newcastle.ac.uk by Nov 19.

References
M Barlow and G Drew. 2020. Slow infrastructures in times of crisis: unworking speed and convenience. Postcolonial Studies.
R Benjamin. 2019. Race After Technology: Abolitionist Tools for the New Jim Code. Wiley.
A Birhane. 2020. Algorithmic Colonization of Africa. SCRIPTed 17, 389.
J Buolamwini and T Gebru. 2018. Gender shades: Intersectional accuracy disparities in commercial gender classification. In Proceedings of the Conference on Fairness, Accountability and Transparency, 77-91.
M Castells. 2010. The Information Age: Economy, Society and Culture, Vols. I-III. Malden, MA and Oxford, UK: Blackwell.
H Couclelis. 1986. Artificial intelligence in geography: Conjectures on the shape of things to come. The Professional Geographer.
J Crampton. 2020. Think. Resist. Act Local: Is a Slow AI possible? Ada Lovelace Institute, 3 March 2020. https://www.adalovelaceinstitute.org/think-resist-act-local-is-a-slow-ai-possible/
C Dwork. 2017. What's Fair? In Proceedings of the 23rd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, August 13-17, 2017, Halifax, Nova Scotia, Canada.
L Edwards and M Veale. 2017-2018. Slave to the Algorithm: Why a Right to an Explanation Is Probably Not the Remedy You Are Looking For. Duke Law & Technology Review 16, 18.
JE Estes, C Sailer, LR Tinney. 1986. Applications of artificial intelligence techniques to remote sensing. The Professional Geographer.
V Eubanks. 2018. Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor. New York, NY: St. Martin’s Press.
G Hinton. 2014. What is wrong with standard neural nets? Brain & Cognitive Sciences Fall Colloquium Series, Massachusetts Institute of Technology, Cambridge, MA, December 4, 2014.
K Janowicz, S Gao, G McKenzie, Y Hu, and B Bhaduri. 2020. GeoAI: spatially explicit artificial intelligence techniques for geographic knowledge discovery and beyond. International Journal of Geographical Information Science.
R Kitchin and A Fraser. 2020. Slow Computing: Why We Need Balanced Digital Lives. Bristol University Press.
M Kwan. 2016. Algorithmic Geographies: Big Data, Algorithmic Uncertainty, and the Production of Geographic Knowledge. Annals of the American Association of Geographers 106(2), 274-282.
BD Mittelstadt, P Allo, M Taddeo, S Wachter, and L Floridi. 2016. The ethics of algorithms: Mapping the debate. Big Data & Society 3(2), 2053951716679679.
S Noble. 2018. Algorithms of Oppression: How Search Engines Reinforce Racism. New York: New York University Press.
S Openshaw and C Openshaw. 1997. Artificial Intelligence in Geography. John Wiley & Sons.
F Pasquale. 2015. The Black Box Society: The Secret Algorithms That Control Money and Information. Cambridge, MA: Harvard University Press.


Description

A substantial literature has emerged to critique artificial intelligence (AI) and machine learning (ML) (e.g., Fairness, Accountability, and Transparency in ML [FAccT/ML]: Mittelstadt et al. 2016; Pasquale 2015; Eubanks 2018). With few exceptions (Kwan 2016), geographers' voices have been largely absent from this critique of AI/ML (especially of deep learning with neural networks). This absence persists despite the prevalent use of geographic examples to illustrate the societal impacts of AI (e.g., insurance costs based on commuting paths; Pasquale 2015) and despite critiques from within computer science of popular deep learning methods used in geography (Hinton 2014). The gap is all the more significant because geographers have long researched AI/ML (cf. Couclelis 1986; Openshaw and Openshaw 1997; Estes et al. 1986) and GeoAI has recently re-emerged as a subset of AI (e.g., Janowicz et al. 2020). Instead, critiques have been led by computer scientists (Dwork 2017 on debiasing; Buolamwini and Gebru 2018 on race and gender bias) and lawyers (Edwards and Veale 2018 on the explainability of AI outcomes and data rights). We propose an area of study called Critical GeoAI that analyzes AI through an explicitly spatial lens. A Critical GeoAI includes, but is not limited to, the following:
- The geo-political economy of AI (e.g., urban-rural and north-south divides, concentrations of AI resources that effectively create AI city-states, the location of IP and patents, trade agreements)
- Feature detection of space and the classification of place; applications of location (e.g., zip codes, space-time trajectories) and place names in AI/ML that discriminate against people and places
- Moving beyond FAccT/debiasing to AI and structural racism (Benjamin 2019; Noble 2018)
- Critiques of innovation and disruption, and alternatives such as slow infrastructures (Barlow and Drew 2020), slow computing (Kitchin and Fraser 2020), and slow AI (Crampton 2020)
- The role of civil society in automated decision-making (including the ability to counter AI)
- The intersection between Critical Data Studies and Critical GeoAI, for example the role of data in AI, its collection and curation, and the geographical sources of training data
- The role of optimization and performance metrics; the geography of standards-setting, ethics, and trust frameworks
- “Local AI” and, drawing on Castells (2010), the spatial logic of AI “flows”: how AI “travels” between its sources of origin and places of application
- Algorithmic colonization, extractivism, and Western complicity (Birhane 2020)
- The scalar nature of AI (e.g., digital infrastructures such as data centers)
- The physicality of AI via its environmental impacts, including calls for a sustainable AI
- Carceral and decarceral AI


Agenda

- 11:10 AM (15 min). Presenter: Renee Sieber*, McGill University; Jeremy Crampton*, Newcastle University. Possibilities of a Critical GeoAI
- 11:25 AM (15 min). Presenter: Ryan Burns*, University of Calgary; Jim Thatcher, University of Washington - Tacoma; Craig Dalton, Hofstra University. Wherefore art thou, critical GIS? Differently-skilling and Differently Critical GeoAI
- 11:40 AM (15 min). Presenter: Kirsty Watkinson*, University of Manchester; Jonathan Huck, University of Manchester; Angela Harris, University of Manchester. Centaur VGI: Evaluating a human-AI workflow to increase the quality and productivity of humanitarian activities
- 11:55 AM (15 min). Presenter: Susanne Schröder-Bergen*, University Erlangen-Nürnberg, Germany; Georg Glasze, University Erlangen-Nürnberg, Germany; Finn Dammann, University Erlangen-Nürnberg, Germany; Boris Michel, University Halle, Germany. A New Form of Data Colonialism? The use of AI in OpenStreetMap
- 12:10 PM (15 min). Presenter: Ana Brandusescu*, Centre for Interdisciplinary Research on Montreal, McGill University. Balancing Canada’s geo-political economy of artificial intelligence
