The ‘Predicting What We Breathe’ initiative will apply machine learning to satellite, airborne and ground sensor data measuring particulate matter, creating an algorithm that other cities could then use.
Jeanne Holm, Deputy Chief Information Officer and Senior Technology Advisor to the Mayor of Los Angeles, told Cities Today that the problem with space data is that it is often taken from a satellite 30 kilometres above ground and is not very specific, whereas ground sensors are often three metres above the ground and give “very hyperlocal” data.
“The idea is to combine [the two] and see if we can’t characterise what the space data is telling us and then measure the particles after interventions,” she said.
Such interventions could include tree-planting initiatives or increased investigation of pollutant sources.
“If we work specifically with different areas on different types of interventions, we need to know what works best,” she added. “We can then share that with other mega-cities that are struggling with this.”
During the three-year research programme the city will partner with California State University Los Angeles (Cal State LA) and Open AQ, an open air-quality standards group based in Colorado.
Holm is looking for other groups and cities to work with, noting that it is equally important to coordinate with neighbouring cities, since it is “all the same breathing space.”
“We’re really doing a deep dive into the data, the normalisation of the data, machine learning, and algorithm developments,” she said. “If, for example, LA works really hard on air pollution but San Diego doesn’t, that is going to have an impact.”
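The normalisation step Holm describes, reconciling coarse satellite estimates with hyperlocal ground readings, could take many forms; as a minimal illustrative sketch (all data, function names and figures below are hypothetical, not the programme's actual method), one might fit a simple linear correction that maps satellite particulate estimates onto collocated ground sensor readings:

```python
# Hypothetical sketch: calibrating coarse satellite PM2.5 estimates
# against hyperlocal ground sensor readings via ordinary least squares.
# All values and names are illustrative assumptions, not real LA data.

def calibrate(satellite_ugm3, ground_ugm3):
    """Fit y = a*x + b by least squares, mapping satellite estimates
    onto collocated ground readings, and return the correction."""
    n = len(satellite_ugm3)
    mean_x = sum(satellite_ugm3) / n
    mean_y = sum(ground_ugm3) / n
    cov = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(satellite_ugm3, ground_ugm3))
    var = sum((x - mean_x) ** 2 for x in satellite_ugm3)
    a = cov / var
    b = mean_y - a * mean_x
    return lambda x: a * x + b

# Collocated samples: satellite estimate vs. ground reading (µg/m³).
satellite = [12.0, 18.0, 25.0, 31.0]
ground = [15.0, 22.0, 30.0, 37.0]

correct = calibrate(satellite, ground)
print(round(correct(20.0), 1))  # corrected estimate for a 20 µg/m³ pixel
```

A production system would of course use far richer models (the article mentions machine learning across three data sources), but the same principle applies: ground truth anchors the coarse spatial signal.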
Despite decades of programmes to reverse its notoriously high pollution levels, including the strictest car emission laws in the country, LA is consistently rated among the worst cities for pollution. The American Lung Association has ranked it the “smoggiest” city in its annual report for 19 of the last 20 years.