
Chicago Readies the Next-Generation Big Data Network

By Rob Mitchum // June 4, 2014

The movement towards open data from city governments has inspired the development of new methods for data analysis. But what about new methods for the collection of data? Beginning this summer, the CI’s Urban Center for Computation and Data will work with the City of Chicago to install 30 to 50 “sensor nodes” on light poles in the downtown area, giving researchers and the community new streams of information on climate, traffic, city infrastructure, and other facets of city life. The project, called The Array of Things, is starting to attract media attention, including this article from GCN on the city’s “next-generation big data network,” which spoke to UrbanCCD director Charlie Catlett.

“This push towards transparency has created a real opportunity for those of us who want to understand cities,” said Charlie Catlett, senior computer scientist at the Argonne National Laboratory and the University of Chicago, who said he believes the city has moved aggressively to make more data available.

Chicago is currently working to develop its next generation of big data tools. On the data gathering side, Catlett said, it is partnering with industry and the Urban Center for Computation and Data (UrbanCCD) to create an embedded sensor network dubbed the “Array of Things.”

Catlett said nodes in the array network will have about 12 sensors collecting data on temperature, humidity, air quality, sound, light and carbon monoxide/carbon dioxide levels throughout the city. The sensors will also be able to detect mobile devices with Bluetooth active, so smartphones can serve as an indicator of pedestrian density.
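To make that concrete, here is a minimal sketch of what a single reading from one of these nodes might look like, and how a Bluetooth device count could stand in for pedestrian density. The field names, units, and the simple count-per-area heuristic are illustrative assumptions for this example, not the project’s actual data format or method.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class NodeReading:
    """One hypothetical reading from an Array of Things node.
    Field names and units are illustrative assumptions, not the real schema."""
    node_id: str
    timestamp: datetime
    temperature_c: float
    humidity_pct: float
    sound_db: float
    light_lux: float
    co_ppm: float
    co2_ppm: float
    bluetooth_devices: int  # count of nearby devices with Bluetooth active

def pedestrian_density_proxy(reading: NodeReading, area_m2: float = 2500.0) -> float:
    """Rough proxy: Bluetooth device count per square meter of nearby street/sidewalk.
    Assumes (purely for illustration) that device count tracks foot traffic."""
    return reading.bluetooth_devices / area_m2

# Example: a hypothetical node at a downtown corner reporting 42 nearby devices
reading = NodeReading(
    node_id="chi-loop-001",
    timestamp=datetime(2014, 7, 1, 12, 0),
    temperature_c=24.5,
    humidity_pct=60.0,
    sound_db=72.0,
    light_lux=12000.0,
    co_ppm=0.8,
    co2_ppm=410.0,
    bluetooth_devices=42,
)
print(f"Pedestrian density proxy: {pedestrian_density_proxy(reading):.4f} devices/m^2")
```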

The first of the sensors are slated for installation in early July, when 30 to 50 nodes will be deployed on light poles at downtown Chicago street corners. A funding proposal is in the works to support 500 additional installations.

Catlett was also quoted in a recent Government Technology article looking at the broader trend towards city sensor networks, and the large expenses those projects may incur.

The actual cost of a sensor can be quite low, depending on its features and capabilities, but the full cost of an entire sensor-based solution can be very high for cities. “I’ve seen lots of simple solutions out there that cost a lot of money,” said Charlie Catlett, director of the Urban Center for Computation and Data, a joint initiative of the University of Chicago and Argonne National Laboratory. He pointed out how the city of Chicago kept some of the costs down by putting a GPS sensor on every city-owned vehicle and then using the stream of data to give residents and commuters real-time information about traffic congestion. Catlett thinks cities could be savvier about creating home-grown solutions, like Chicago’s, when it comes to using sensors.
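As an illustration of the kind of home-grown approach described above, here is a minimal sketch of turning per-vehicle GPS pings into a segment-level congestion estimate. The ping format, segment names, and free-flow speed threshold are assumptions made for the example, not Chicago’s actual pipeline.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical GPS pings from city-owned vehicles: (road_segment_id, observed_speed_mph).
# In practice these would arrive as a live stream; a small static sample is used here.
pings = [
    ("lake-shore-dr-n", 18.0), ("lake-shore-dr-n", 22.0), ("lake-shore-dr-n", 15.0),
    ("michigan-ave-s", 8.0), ("michigan-ave-s", 11.0),
    ("congress-pkwy", 30.0),
]

FREE_FLOW_MPH = 30.0  # assumed free-flow speed; real values would vary by segment

def congestion_by_segment(pings):
    """Group pings by road segment and compare the average observed speed to an
    assumed free-flow speed. Returns a 0-1 congestion index per segment."""
    speeds = defaultdict(list)
    for segment_id, speed in pings:
        speeds[segment_id].append(speed)
    return {
        segment_id: max(0.0, 1.0 - mean(values) / FREE_FLOW_MPH)
        for segment_id, values in speeds.items()
    }

for segment, index in congestion_by_segment(pings).items():
    print(f"{segment}: congestion index {index:.2f}")
```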

We’ll have more information on The Array of Things soon. But if you can’t wait, Charlie Catlett speaks at the Chicago Architecture Foundation on June 4th at 12:15 p.m., as part of their LunchTalks series accompanying the City of Big Data exhibit.

[Photo from Wikimedia Commons]