ListeningNYC was a mobile app to help people map and explore urban sounds across New York City.

Rather than simply asking people to file ‘noise complaints’, this crowdsourcing app aimed to discover and catalogue the widely varying qualities of urban noise and how people reacted to them. The goal was to generate a detailed, geolocated and semantic description of the sounds of the city that could eventually inform the City’s noise policies and help prioritise on-site inspection resources.

The app enabled people to capture, analyse and collect sounds throughout the city, creating geolocated, timestamped ‘noise fingerprints’ in the form of frequency histogram visualisations. Users could also mark up and tag the sounds in natural language to describe their specific qualities (e.g. ‘boisterous’, ‘piercing’, ‘rumbling’, ‘faint’, ‘chirping’, ‘traffic’, ‘children playing’) and indicate how much they liked or disliked each sound.
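As a rough illustration of what such a fingerprint might contain, the Swift sketch below defines a hypothetical data model and a simple banding step that collapses a magnitude spectrum into a small histogram. The NoiseFingerprint type, its fields and the bandedHistogram function are assumptions for illustration, not the app’s actual implementation.

```swift
import Foundation
import CoreLocation

// Hypothetical model of a geolocated, timestamped 'noise fingerprint'.
// Field names and types are assumptions, not the app's real data model.
struct NoiseFingerprint {
    let coordinate: CLLocationCoordinate2D   // where the sound was captured
    let timestamp: Date                      // when it was captured
    let histogram: [Float]                   // energy per frequency band (the 'fingerprint')
    var tags: [String]                       // free-text qualities, e.g. "rumbling", "traffic"
    var rating: Int                          // -2 (strongly dislike) ... +2 (strongly like)
}

// Collapse a full magnitude spectrum (e.g. from an FFT of a short recording)
// into a small number of bands suitable for a histogram visualisation.
func bandedHistogram(magnitudes: [Float], bands: Int) -> [Float] {
    guard bands > 0, !magnitudes.isEmpty else { return [] }
    let binSize = max(1, magnitudes.count / bands)
    return (0..<bands).map { band in
        let start = band * binSize
        let end = min(start + binSize, magnitudes.count)
        guard start < end else { return 0 }
        // Average energy within this band
        return magnitudes[start..<end].reduce(0, +) / Float(end - start)
    }
}
```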

By linking the machine-processed data (the frequency histogram) with the qualitative information supplied by people’s descriptions of the sound, it was possible to build a much more useful and nuanced understanding of how people actually feel about particular types of urban sound.
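A minimal sketch of that linkage, reusing the hypothetical NoiseFingerprint type above: given a newly captured fingerprint, find shared fingerprints with similar spectra and summarise the tags and ratings people attached to them. The distance measure and threshold here are illustrative assumptions.

```swift
// Euclidean distance between two band histograms of equal length.
func histogramDistance(_ a: [Float], _ b: [Float]) -> Float {
    guard a.count == b.count else { return .infinity }
    return zip(a, b).map { ($0 - $1) * ($0 - $1) }.reduce(0, +).squareRoot()
}

// Summarise how people described and rated sounds with spectra similar
// to a newly captured fingerprint.
func similarSoundSummary(to query: NoiseFingerprint,
                         in corpus: [NoiseFingerprint],
                         threshold: Float) -> (tags: [String], averageRating: Double)? {
    let similar = corpus.filter { histogramDistance($0.histogram, query.histogram) < threshold }
    guard !similar.isEmpty else { return nil }
    let tags = Array(Set(similar.flatMap { $0.tags })).sorted()
    let average = Double(similar.map(\.rating).reduce(0, +)) / Double(similar.count)
    return (tags, average)
}
```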

The app therefore provided a personalised map of sound qualities and locations across the city and, in conjunction with sound analyses created and shared by others, enabled people to find areas of the city with sounds they liked or disliked.
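One way such a map query might look in code, again reusing the hypothetical types above (the function name and radius-based filter are assumptions made for this sketch):

```swift
import CoreLocation

// Return shared fingerprints within a given radius of a location whose
// ratings match what the user is looking for (liked or disliked sounds).
func fingerprints(near center: CLLocationCoordinate2D,
                  withinMetres radius: Double,
                  liked: Bool,
                  from shared: [NoiseFingerprint]) -> [NoiseFingerprint] {
    let centerLocation = CLLocation(latitude: center.latitude, longitude: center.longitude)
    return shared.filter { fp in
        let location = CLLocation(latitude: fp.coordinate.latitude,
                                  longitude: fp.coordinate.longitude)
        let inRange = location.distance(from: centerLocation) <= radius
        let matchesPreference = liked ? fp.rating > 0 : fp.rating < 0
        return inRange && matchesPreference
    }
}
```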

Developed at the Urban Projects Division. iOS programming by The Workers.