Title: A Window into the Urban-Environmental Front: Concurrent and Multimodal Mobile Sensing for Environmental Monitoring
Authors: Aigbekaen, Jovan
Advisors: Bou-Zeid, Elie
Department: Electrical and Computer Engineering
Certificate Program: Architecture and Engineering Program
Class Year: 2023
Abstract: Urban regions are incredibly complex and dynamic spaces. It is well documented that the composition of the built environment significantly shapes local environmental conditions, and concentrations of pollutants can vary sharply at fine spatiotemporal scales within cities. Urban-environmental monitoring and modeling have been identified as tools for deconstructing these complexities and providing cities with data to better inform urban policy and design efforts. Street view imagery has emerged as a major data source for many monitoring and modeling applications: visual urban-environmental features documented within street view images can be extracted and quantified using computer-vision-based AI algorithms for image processing. Google Street View (GSV) is a primary street view imagery database that has been widely used in urban modeling studies. However, street view imagery alone cannot provide live monitoring of meteorological and air quality parameters. Mobile sensing platforms are a proven method for capturing the spatiotemporal dynamics of urban conditions at fine scales, complementing the information extracted from GSV. However, studies coupling mobile air quality sensing with GSV are limited by the inconsistent density of GSV data in many urban regions and by the mismatch in temporal spans between the two data sources. This thesis presents a low-cost, real-time urban-environmental monitoring platform that performs concurrent and multimodal collection of air quality, noise pollution, and street view imagery data. Through mobile deployments on private vehicles, the developed sensing and imaging platform demonstrated the capability to adequately measure and capture the spatiotemporal dynamics of urban spaces.
Convolutional neural network models pretrained on the open-source ADE20K scene-parsing dataset were used to extract vegetation, built, sky, and traffic features from the images and to produce numeric indices. These quantified urban-environmental feature indices, together with the coupled mobile air quality and noise measurements, were used to predict the levels of the target pollutants: CO2, PM2.5, and noise. Linear regression models for CO2, PM2.5, and noise achieved mean absolute percentage error (MAPE) values of 2.725, 10.627, and 6.996, respectively. Models built with all of the identified urban-environmental indices achieved higher accuracy than models that did not use any of the values extracted from visual features. This thesis demonstrates the utility of the concurrent and multimodal mobile sensing platform for associating street view imagery with pollutant and meteorological information for neighborhood-scale spatiotemporal urban monitoring and modeling.
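The pipeline the abstract describes (segmentation-derived feature indices feeding a linear regression scored by MAPE) can be sketched as follows. This is a minimal illustration, not the thesis code: the class IDs, feature values, and CO2 readings below are invented for demonstration, and ordinary least squares via NumPy stands in for whatever regression tooling the thesis actually used.

```python
import numpy as np

def feature_index(seg_map, class_id):
    """Fraction of pixels in a segmentation map assigned to one class.

    In an ADE20K-style output, each pixel holds an integer class label;
    e.g. the vegetation index is the share of vegetation-labeled pixels.
    """
    return float(np.mean(seg_map == class_id))

def mape(y_true, y_pred):
    """Mean absolute percentage error, expressed in percent."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return float(np.mean(np.abs((y_true - y_pred) / y_true)) * 100.0)

# Hypothetical 4x4 segmentation map; class 4 stands in for "vegetation"
# (the real ADE20K class IDs would come from the pretrained model).
seg = np.array([[4, 4, 0, 0],
                [4, 0, 0, 0],
                [0, 0, 2, 2],
                [0, 0, 2, 2]])
veg_index = feature_index(seg, 4)  # 3 of 16 pixels -> 0.1875

# Toy design matrix: [vegetation index, traffic index] per observation,
# paired with illustrative CO2 readings (ppm). Values are made up.
X = np.array([[0.10, 0.30],
              [0.25, 0.20],
              [0.40, 0.10],
              [0.55, 0.05]])
y = np.array([420.0, 410.0, 400.0, 395.0])

# Ordinary least squares with an intercept column.
A = np.hstack([X, np.ones((len(X), 1))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
y_hat = A @ coef
error = mape(y, y_hat)
```

On the thesis data, the same MAPE metric is what yields the reported 2.725 (CO2), 10.627 (PM2.5), and 6.996 (noise) scores.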
URI: http://arks.princeton.edu/ark:/88435/dsp010g354j52n
Type of Material: Princeton University Senior Theses
Language: en
Appears in Collections: Electrical and Computer Engineering, 1932-2023

Files in This Item:
AIGBEKAEN-JOVAN-THESIS.pdf, 14.87 MB, Adobe PDF (request a copy)


Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.