Thursday, April 9, 2015

Open Data & Traffic Congestion

If you live, work, or even occasionally visit Toronto's downtown core, you already know that traffic congestion is a constant problem. Driving is a hectic, scary mess, and public transit is even worse, unless you have some sort of penchant for elbows in your face and an array of weird and equally repellent smells.

On March 31, TomTom (known for their GPS devices - one of which I used to get around Buffalo last week without a hitch) released its 5th annual Traffic Index, which it bills as the most accurate barometer of traffic congestion in over 200 cities worldwide. As I looked at it, I noticed two things.

1) How interesting that a GPS manufacturer would go to such lengths to create interactive data visualizations around traffic conditions. I suppose traffic monitoring is an important feature to get right in a GPS device, especially with free alternatives such as Google Maps for mobile and Waze on the market. You might say the whole project reinforces TomTom's USP – using research and data to get you where you need to go. At least, that's how I see it.

2) Our lovely hometown, Toronto, ranks #47 in traffic congestion worldwide, and second in Canada after Vancouver at #20. The gap in actual congestion levels isn't that big, either.



After TomTom released this year's Index, Mayor John Tory unveiled a plan he had been hatching to tackle the city's traffic problems. He also referred to Toronto's position on TomTom's Index, saying he was "embarrassed" by our standing.

But what I really want to talk about is the most interesting part: Tory's plan to fix these problems is largely based on open data. That's the spirit! I have always been a big proponent of open data; freely sharing research is imperative to the overall improvement of our world. This goes hand-in-hand with free healthcare and open-source application development. With complete and total access to all the research that everyone is doing, we can save time and effort by working together to solve big, pesky problems like traffic congestion. Even simple fixes, like coordinating traffic lights to turn at the same time, or easing traffic flow by shortening or lengthening green lights in specific directions, can only be done by analyzing trends in traffic flow as they stand today. And that can only happen if the data is openly available.
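To make that green-light idea concrete, here is a minimal sketch (with entirely hypothetical counts and a made-up helper name) of how open traffic-count data could feed signal timing: split a fixed signal cycle between directions in proportion to observed demand, while guaranteeing every direction a minimum green phase.

```python
# A minimal sketch using hypothetical data. The counts, cycle length,
# and the green_splits() helper are illustrative assumptions, not any
# real traffic-engineering method or city API.

def green_splits(counts, cycle_s=90, min_green_s=15):
    """Allocate green time per direction proportional to vehicle counts,
    guaranteeing each direction a minimum green phase."""
    total = sum(counts.values())
    directions = list(counts)
    # Reserve the minimum green for every direction first,
    # then divide the remaining cycle time proportionally to demand.
    remainder = cycle_s - min_green_s * len(directions)
    splits = {}
    for d in directions:
        share = counts[d] / total if total else 1 / len(directions)
        splits[d] = round(min_green_s + remainder * share)
    return splits

# Hypothetical hourly counts at one downtown intersection:
counts = {"north-south": 620, "east-west": 280}
print(green_splits(counts))  # e.g. {'north-south': 56, 'east-west': 34}
```

The point isn't this particular formula; it's that even a toy allocation like this is impossible without the underlying counts, which is exactly what open data provides.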

The plan, which you can find here, outlines several actions that will be taken to attempt to solve the problems on our streets. The first of these will include:

  • partnering with McMaster University to analyze historical travel data on city expressways and streets 
  • working with the TTC to closely analyze surface transit data to identify operational improvements to further improve streetcar service 
  • releasing a report from the Cycling Unit of Transportation Services evaluating cycling travel patterns based on data collected from its cycling tracking app -- showing the impacts of Cycletracks 
  • developing a Big Travel Data strategy for Transportation Services to determine ways to make this type of information available, and 
  • vetting products and services that might be useful in assisting the city in better decision making and investments

In addition to those items, the city is looking to hire a data and transportation specialist as its team lead, and is seeking proposals for products that can monitor traffic patterns. Vendors of these products will meet with city representatives at a showcase event on April 14 and 15.

What's most exciting to me is that the city is also planning an open data hackathon for September. I would love the chance to enter another hackathon, especially one relating to something so close to home. Everyone has their own opinion about what could improve traffic in the city, and with access to all of the data collected, this really sounds like something special.

But for now, I'll just have to wait and see what develops.
