Right now more than 50% of the global population lives in urban areas; by 2050, the UN projects that share will grow to 70%. To accommodate the coming wave of urbanites, the World Economic Forum estimates we’ll need to build as much infrastructure as has been built over the past 4,000 years. That growth will only compound the problems cities already drive: the majority of greenhouse gas emissions, sewage overflows, and air pollution.
Many people believe technology will come to the urbanites’ rescue with mountains of collected data. Everything will have a wireless sensor and be connected to the cloud—from transit card swipes to smart energy meters to air quality monitors to taxi rides to 311 calls. New York City, which produces 1 terabyte of this type of data per day, has a dedicated analytics team to manage and parse the deluge.
And yet, for most, today’s vision of a smart city remains an elusive luxury. Even if every city had a Chief Information, Data, or Technology Officer, it’s quite likely that in the breathless optimism of chasing a smart city, decision-makers would end up spending billions metering and sensing every inch, only to discover that the results:
- are obvious (“yep, crossing the 520 bridge in Seattle any time after 4pm is a nightmare, though any local could tell you that”);
- come too late (we spend all that time on data collection while quietly passing the tipping point of dangerous climate change);
- turn cities into the equivalent of airport security checkpoints, where nothing is private and public distrust is at an all-time high;
- may unintentionally create “automaton” cities where decision-making is so automated or restricted to a small group of data scientists that it becomes devoid of citizen input;
- are smart but unimaginative (they can reflect what’s happening, but not simulate the complexities of what could be).
So how can we make more out of collected data while also simulating alternatives to today’s messy reality? At the risk of oversimplifying a complex topic, it can be boiled down to three fundamentals:
- Create a master repository of data that acts as a “single source of truth”—as with national security intelligence failures in the past, data that lives in silos is the bane of true understanding and decision-making.
- Use that data to imagine future scenarios and simulate possible outcomes—analyze projects from every angle in the virtual world before ever breaking ground in the real one.
- Prove the economic, environmental, and social benefits of projects in order to secure increasingly scarce financing.
The first step is to unify the existing but disparate data so that more can be squeezed out of what’s already collected and all parties are working from a single source of truth. This has been feasible with geographical data files for eons, but now it’s possible to combine dozens of types of information and assemble them into a single 3D model of a city that really looks and feels like that city, not an impenetrable maze of 2D grid squares.
The city of Stuttgart in Germany has created such a model, containing buildings, roadways, trees and select city furniture. This unified model can now be used like a canvas for sketching new ideas—and since the model is intelligent, objects update automatically to accommodate new insertions while dumb ideas can be erased with no harm done.
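As a toy illustration of what unifying siloed data can look like in practice, the sketch below merges a few hypothetical geographic layers (building footprints, roads, trees) into a single table with a shared coordinate system, using the open-source geopandas library. The file names and layer choices are assumptions for illustration, not a description of Stuttgart’s actual pipeline.

```python
# A minimal sketch of step one: pulling siloed geographic layers into one
# unified model. File names below are hypothetical placeholders.
import geopandas as gpd
import pandas as pd

# Layers a city might already hold in separate silos.
LAYERS = {
    "buildings": "buildings.shp",
    "roads": "roads.shp",
    "trees": "trees.shp",
}

frames = []
for name, path in LAYERS.items():
    gdf = gpd.read_file(path)
    gdf = gdf.to_crs(epsg=3857)  # reproject every layer to one shared CRS
    gdf["layer"] = name          # tag each record with its source silo
    frames.append(gdf[["layer", "geometry"]])

# One table, one coordinate system: a "single source of truth" to build on.
city_model = pd.concat(frames, ignore_index=True)
print(city_model.groupby("layer").size())
```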
The second step is to give that unified model an “imagination,” just as the human brain not only collects data but also daydreams. On a computer, this means applying rules-based simulation to determine how the city would behave under future scenarios or how proposed infrastructure projects would perform.
Vancouver has used its unified city model (think SimCity with real data) to simulate how shifts in population affect density and how views would change due to new infrastructure projects. City leaders are now investigating various sustainability simulations they can run on that model.
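To make “rules-based simulation” concrete, here is a deliberately simple sketch in Python: it applies a single assumed growth rule to invented district figures and flags where density would cross a capacity threshold. Real city models like Vancouver’s are far richer; this only shows the shape of the idea.

```python
# A toy rules-based scenario run. All numbers are invented for illustration.
districts = {
    # name: (current population, area in km^2)
    "Downtown":   (48_000, 4.0),
    "Westside":   (31_000, 6.5),
    "Riverfront": (12_000, 3.2),
}

GROWTH_RULE = 0.015     # rule: assume 1.5% annual population growth
DENSITY_LIMIT = 15_000  # people per km^2 before new capacity is needed
YEARS = 20

for name, (pop, area) in districts.items():
    future_pop = pop * (1 + GROWTH_RULE) ** YEARS  # compound the growth rule
    density = future_pop / area
    flag = "needs new capacity" if density > DENSITY_LIMIT else "ok"
    print(f"{name:>10}: {density:,.0f} people/km^2 in {YEARS} years -> {flag}")
```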
Such simulations allow cities to move on to a third and possibly deal-breaking step: proving the project worthy of financing. Historically, the best (and most likely) deal has meant the lowest first cost. But now planning and public works departments, as well as the private-sector financiers increasingly called in to underwrite infrastructure projects, are asking for more sophisticated economic analyses that include the secondary and tertiary environmental and social impacts of a project’s design, such as effects on obesity, airborne particulates, and water quality.
Tucson, which suffers regular water shortages, ran such an analysis using AutoCASE to evaluate the full costs and benefits of green stormwater infrastructure techniques such as pervious pavement and water harvesting. Tucson discovered the total net present value to the city was far higher than traditional calculations had shown, especially when it came to increased pedestrian safety and reduced heat mortality.
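The arithmetic behind that kind of result can be sketched in a few lines. The toy calculation below compares a project’s net present value counting only direct savings against one that also monetizes co-benefits such as reduced heat mortality; every figure and the discount rate are assumptions for illustration, not Tucson’s actual numbers or AutoCASE’s method.

```python
# A hedged sketch of "full cost-benefit" arithmetic: net present value (NPV)
# with monetized co-benefits included. All figures below are invented.

def npv(cashflows, rate):
    """Discount a list of annual cashflows (year 0 first) to present value."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

RATE = 0.04                # assumed discount rate
YEARS = 30
CAPITAL_COST = -2_000_000  # year-0 outlay for, say, pervious pavement

# Traditional view: only direct savings (e.g., avoided stormwater costs).
direct = [CAPITAL_COST] + [90_000] * YEARS

# Fuller view: add monetized secondary benefits each year, such as reduced
# heat mortality and improved pedestrian safety (invented values).
CO_BENEFITS = 70_000
full = [CAPITAL_COST] + [90_000 + CO_BENEFITS] * YEARS

print(f"NPV, direct savings only: ${npv(direct, RATE):>11,.0f}")  # negative
print(f"NPV, with co-benefits:    ${npv(full, RATE):>11,.0f}")    # positive
```

Under these made-up numbers, the co-benefits flip the project from a net loss to a net gain, which is exactly the kind of shift the Tucson analysis surfaced.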
So let’s not get too swept up in a technological future that is, at worst, a costly diversion—and at best, unevenly distributed. Let’s instead refocus our discourse on the real promise of data in today’s cities: simulating the thriving and sustainable city that could be, and securing the funds to get us there.
We welcome your comments at ideas@qz.com.