Spatial Positioning Method for Vehicles in Cross-Camera Traffic Scenes
Abstract: To solve the problem of cross-camera, full-scene vehicle spatial positioning in traffic applications, a method combining full-scene perspective stitching with 3D vehicle detection is proposed, using only monocular cameras. First, an automatic camera calibration method for traffic scenes is proposed: a camera calibration space model is built, from which the calibration parameters are obtained and optimized automatically. Then, feature points in the overlapping area are used to transform and unify the spatial coordinate systems of the multiple cameras, and, to represent the physical space of the whole scene intuitively, each scene is perspective-transformed into a unified pixel-physical coordinate system. Finally, to address the difficulty of 3D vehicle positioning under monocular vision, the geometric constraints of the vehicle's projection are used to build an optimization model that constructs a refined 3D bounding box; the centroid of this 3D bounding box serves as the vehicle's spatial positioning point and can be mapped into the pixel-physical coordinate system to reflect the vehicle's dynamic information. Spatial positioning experiments on multiple vehicle types across scenes are carried out both in an experimental environment with artificial feature points and in a practical application environment without them. The results show that the method solves the cross-camera, large-scene vehicle spatial positioning problem in traffic video surveillance; in the selected experimental scenes, the overall positioning accuracy reaches the centimeter level with good real-time performance.
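To make the coordinate-unification step above concrete, the following is a minimal sketch (not the paper's implementation) of mapping image points into a unified pixel-physical coordinate system through a homography estimated from shared feature points, using OpenCV; the point values, the camera label and the to_unified helper are hypothetical.

    import numpy as np
    import cv2

    # Shared feature points observed in camera A (image pixels) and their known
    # coordinates in the unified pixel-physical frame (e.g. centimeters on the road plane).
    pts_cam_a = np.float32([[412, 305], [980, 310], [955, 690], [430, 700]])
    pts_unified = np.float32([[0, 0], [350, 0], [350, 800], [0, 800]])

    # Homography from camera A's image plane to the unified frame
    # (four correspondences give an exact solution; more points would allow RANSAC).
    H_a, _ = cv2.findHomography(pts_cam_a, pts_unified)

    def to_unified(pixel_xy, H):
        """Map one image point into the unified pixel-physical coordinate system."""
        p = np.float32([[pixel_xy]])      # shape (1, 1, 2), as perspectiveTransform expects
        return cv2.perspectiveTransform(p, H)[0, 0]

    # Example: ground-plane projection of a vehicle's 3D-bounding-box centroid,
    # as produced by the monocular 3D detection stage (value is a placeholder).
    centroid_px = (700.0, 520.0)
    print(to_unified(centroid_px, H_a))   # vehicle position in the unified frame

Repeating the same estimation for each camera, with the shared feature points of the overlapping areas as correspondences, places all views and all detected vehicle positions in one pixel-physical frame, which is the role this step plays in the proposed pipeline.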