Feasibility of Using V2I Sensing Probe Data for Real-Time Monitoring of Multi-Class Vehicular Traffic Volumes in Unmeasured Road Locations
DOI: https://doi.org/10.7307/ptt.v34i5.4057

Keywords: V2I communication, V2I probe volume, online monitoring, multiple vehicle classes, motorway traffic volume

Abstract
Vehicle-to-infrastructure (V2I) communication systems can accurately monitor a portion of the dynamic traffic volume for multiple vehicle classes without vehicle detectors. This suggests the feasibility of online monitoring of total multi-class traffic volumes without any dedicated vehicle detectors. To evaluate this prospect, this article presents a method for monitoring dynamic multi-class vehicular traffic volumes at a road location where road-side equipment (RSE) for V2I communication is in operation. The proposed method estimates dynamic total traffic volumes for multiple vehicle classes from the V2I sensing probe volume (i.e. the partial vehicular traffic volume) collected through the RSE. An experimental study was conducted using real-world V2I sensing probe volume data. The results showed that traffic volumes for vehicle types I and II (cars and heavy vehicles, respectively) can be monitored effectively, with average errors of 6.69% and 10.89%, when the penetration rates of the in-vehicle V2I device for the two vehicle types average 0.384 and 0.537, respectively. The detection error of the method is comparable to that of widely used vehicle detectors. Therefore, V2I sensing probe data for multiple vehicle classes can complement the functions of vehicle detectors, given that the penetration rate of in-vehicle V2I devices is already high.
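To illustrate the volume-expansion idea described in the abstract, the minimal sketch below scales the observed V2I probe count for each vehicle class by an estimated device penetration rate to obtain a total-volume estimate. The function name, interval structure, and the use of a simple ratio expansion are assumptions introduced here for illustration; the paper's actual estimator may differ.

```python
# Minimal sketch (assumption): expand per-class V2I probe counts into
# total-volume estimates by dividing by an estimated penetration rate.
# Interval length, class labels, and example numbers are illustrative only.

def estimate_total_volume(probe_count: int, penetration_rate: float) -> float:
    """Estimate the total traffic volume of one vehicle class in one interval.

    probe_count      -- vehicles of this class observed via V2I (RSE) in the interval
    penetration_rate -- estimated share of this class equipped with an in-vehicle V2I device
    """
    if not 0.0 < penetration_rate <= 1.0:
        raise ValueError("penetration rate must be in (0, 1]")
    return probe_count / penetration_rate


if __name__ == "__main__":
    # Hypothetical 5-minute interval; the penetration rates echo the averages
    # reported in the abstract (0.384 for type I, 0.537 for type II).
    observations = {
        "type I (cars)": (96, 0.384),
        "type II (heavy vehicles)": (43, 0.537),
    }
    for vehicle_class, (probe_count, rate) in observations.items():
        total = estimate_total_volume(probe_count, rate)
        print(f"{vehicle_class}: probe={probe_count}, estimated total ~ {total:.0f} veh")
```

Under this simple ratio expansion, estimation error is driven mainly by how well the per-class penetration rate is known for each interval, which is consistent with the abstract's emphasis on the observed penetration rates.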
License
Copyright (c) 2022 Hyunho Chang, Seunghoon Cheon
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.