Call for Participation – 2014 TRB Annual Meeting Workshop Comparison of Surrogate Measures of Safety Extracted from Video Data
With the advent of powerful computer vision techniques, video data can be automatically analyzed for an increasing number of transportation applications, including road safety diagnosis based on surrogate measures of safety that do not require waiting for accidents to happen.
While several methods exist for different purposes and settings, few direct comparisons have been made, and no guidelines exist for choosing and adjusting existing methods for a given application. An important reason is the lack of public datasets and of comparisons of state-of-the-art methods on tasks relevant to transportation applications (benchmarking). Public datasets and benchmarking are common in several scientific fields, most notably in computer vision (such as the series of IEEE International Workshops on Performance Evaluation of Tracking and Surveillance (PETS), with 15 workshops from 2000 to 2013), but not in transportation. There is in fact a lack of public data relevant to transportation applications: some general computer vision datasets can be useful, but they cover few real-life transportation applications.
A group of researchers from the University of Lund, McGill University and Polytechnique Montréal has decided to create such a public video dataset, in particular for road user behaviour analysis and road safety diagnosis. They invite researchers and practitioners to present the current state of their methods for video analysis, behaviour analysis and safety diagnosis, whether previously published or not.
The workshop will take place at the next TRB Annual Meeting on Sunday afternoon, January 12th, 2014 (1:30 pm to 4:30 pm in the Marriott, Madison A). Everyone is welcome to attend and participate in the discussions.
The information about the workshop, including the links to the datasets, is available on the following webpage: http://nicolas.saunier.confins.net/trb14workshop.html
For more information please contact Nicolas Saunier at email@example.com or Aliaksei Laureshyn at firstname.lastname@example.org.
This project has received funding from the European Union’s Seventh Framework Programme for research, technological development and demonstration under grant agreement no 610453.