New York launches sensors and AR tools in latest tech pilots

26 November 2025

by Jonathan Andrews

New York City is rolling out two new Smart City Testbed pilots that will analyse pedestrian activity in public spaces and help residents visualise a major new community facility through augmented reality.

The initiatives form part of the city’s broader effort to trial emerging technologies that can improve street safety, support infrastructure management, and strengthen public engagement before tools are deployed at scale.

Matthew Fraser, the city’s Chief Technology Officer, said the programme highlights an approach centred on learning and experimentation.

“The NYC Smart City Testbed Program embodies our mindset that there are no failures, only learning opportunities,” he said. “The cutting-edge pilots announced today demonstrate our commitment to developing forward-thinking partnerships and solutions needed to better serve our residents.”

The Testbed Programme, led by the NYC Office of Technology and Innovation (OTI), allows agencies to work with vendors for six to nine months to test emerging tools in live environments. Submissions are accepted on a rolling basis, and earlier pilots have included automated building inspections, LiDAR monitoring of parcel-facility traffic, and computer-vision assessments of bike lanes.

OTI and the Department of Transportation (DOT) will launch a pilot to evaluate pedestrian-counting sensors across six locations, including plazas, open streets, and holiday markets. The system will measure anonymised crowd sizes and dwell times to help the city better understand how public spaces are used and what staffing levels are required to manage them.

A second pilot, developed with the Department of Design and Construction (DDC), will introduce an augmented reality platform enabling residents to view a 3D rendering of the forthcoming Roy Wilkins Recreation Center in Queens. By scanning a QR code at the site, visitors will be able to see the future 6,225-square-metre facility projected onto the landscape.

Earlier this year, another pilot used LiDAR scanners to help the Department of City Planning automatically track vehicle movements linked to industrial activity. DOT and OTI also collaborated on two computer-vision pilots: an eight-month analysis of multimodal street activity now being considered for expansion, and a two-month review of protected bike lanes that involved 150,000 images captured over 320 kilometres.

Additional pilots from the programme’s first cohort included drones and robotics to scan roofs and building façades, and real-time air-quality monitoring at a Queens public school.

NYC introduces AI oversight framework

A separate package of legislation passed by the city council sets out one of the first comprehensive municipal frameworks in the US to oversee the use of artificial intelligence and automated decision systems (ADS) by city agencies.

The new law would create an Office of Algorithmic Accountability tasked with analysing algorithmic tools, incorporating public input, and overseeing corrective action. It would require the office to publish a list of all AI systems for which it has conducted a pre-deployment assessment.

“For years… agencies were left to guess their way through AI: no standards, no oversight, and no real guidance,” said Councillor Jennifer Gutiérrez. “Vendors ran ahead while government stood still. The GUARD Act finally ends that. We’re giving agencies the guardrails they never had, so AI can help government instead of confusing it. This package puts an end to the chaos and gives city employees and New Yorkers the accountability and clarity they deserve.”

Image: Peterfactors | Dreamstime.com