Journal of Zhejiang University (Engineering Science)
Computer Technology, Information Electronics
Calibration and 3D Reconstruction with an Omnidirectional Optic-Flow Ranging Camera
MA Zi'ang, XIANG Zhiyu
Department of Information Science and Electronic Engineering, Zhejiang University, Hangzhou 310027, China; Zhejiang Provincial Key Laboratory of Information Network Technology, Hangzhou 310027, China
Calibration and 3D reconstruction with omnidirectional ranging by optic flow camera
MA Zi'ang, XIANG Zhiyu
Department of Information Science and Electronic Engineering, Zhejiang University, Hangzhou 310027, China; Zhejiang Provincial Key Laboratory of Information Network Technology, Hangzhou 310027, China
Full text: PDF (2036 KB)   HTML
Abstract:

Navigation systems for unmanned aerial vehicles that rely on active sensors tend to be bulky and expensive in practice. To address this, an omnidirectional optic-flow ranging camera is introduced, composed of a conventional perspective camera and a reflective mirror with an equidistance-preserving profile. This property establishes a simple correspondence between the optic-flow value at an image point and the radial distance from the corresponding space point to the camera's optical axis, so the camera-mirror system can range the scene directly. Simulation experiments show that the ranging accuracy is mainly affected by angular misalignment between the mirror and the camera, and a new parameter calibration method is proposed to improve it. When the camera is applied to real scenes, good 3D reconstruction of the environment is achieved, demonstrating the effectiveness of the proposed calibration and reconstruction methods.
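As a purely illustrative sketch of the ranging principle the abstract describes (the paper's actual model and calibration equations are not reproduced on this page), the "simple correspondence" between optic flow and axial distance can be written down under one stated assumption: for the equidistance-preserving mirror, the optic-flow magnitude at an image point is inversely proportional to the radial distance R from the scene point to the optical axis when the camera translates along that axis. The gain `k_px` below is a hypothetical lumped mirror/camera constant of the kind the proposed calibration would estimate.

```python
# Hypothetical sketch, assuming the simplified model flow = k * v / R,
# where v is the camera's translation speed along its optical axis,
# R is the radial distance from the scene point to that axis, and
# k (pixels) is a lumped mirror/camera gain found by calibration.

def range_from_flow(flow_px_per_s: float, v_m_per_s: float, k_px: float) -> float:
    """Invert the simplified model flow = k * v / R to recover R (metres)."""
    if flow_px_per_s <= 0.0:
        raise ValueError("flow must be positive for a point off the optical axis")
    return k_px * v_m_per_s / flow_px_per_s
```

For example, with an assumed gain k = 200 px and a forward speed of 1 m/s, a measured flow of 100 px/s would correspond to an axial distance of 2 m; halving the observed flow doubles the recovered distance, which is what makes angular misalignment (and hence flow error) the dominant source of ranging error.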

Published: 2015-10-15
CLC number: TP 242.6
Fund program:

Supported by the National Natural Science Foundation of China (NSFC 61071219)

Corresponding author: XIANG Zhiyu, male, associate professor. ORCID: 0000-0002-3329-7037. E-mail: xiangzy@zju.edu.cn
About the author: MA Zi'ang (1991-), male, Ph.D. candidate, engaged in research on camera calibration and optic-flow ranging. ORCID: 0000-0001-8241-5303. E-mail: bean81bryant@163.com

Cite this article:

MA Zi'ang, XIANG Zhiyu. Calibration and 3D reconstruction with omnidirectional ranging by optic flow camera [J]. Journal of Zhejiang University (Engineering Science), 2015, 49(9): 1651. DOI: 10.3785/j.issn.1008-973X.2015.09.006.

Link to this article:

http://www.zjujournals.com/eng/CN/10.3785/j.issn.1008-973X.2015.09.006        http://www.zjujournals.com/eng/CN/Y2015/V49/I9/1651

[1] SRINIVASAN M V. A new class of mirrors for wide angle imaging [C] ∥ Computer Vision and Pattern Recognition Workshop. Madison: IEEE, 2003, 7: 85.
[2] FERNANDO C G, MUNASINGHE R, CHITTOORU J. Catadioptric vision systems: survey [C] ∥ System Theory. Piscataway: IEEE, 2005: 443-446.
[3] COLOMBO A, MATTEUCCI M, SORRENTI D G. On the calibration of non-single viewpoint catadioptric sensors [M] ∥ RoboCup 2006: Robot Soccer World Cup X. Berlin: Springer, 2007: 194-205.
[4] MASHITA T, YACHIDA M. Calibration method for misaligned catadioptric camera [J]. IEICE Transactions on Information and Systems, 2006, 89(7): 1984-1993.
[5] FABRIZIO J, TAREL J P, BENOSMAN R. Calibration of panoramic catadioptric sensors made easier [C] ∥ Omnidirectional Vision. Copenhagen: IEEE, 2002: 45-52.
[6] MOREL O, FOFI D. Calibration of catadioptric sensors by polarization imaging [C] ∥ Robotics and Automation. Beijing: IEEE, 2007: 3939-3944.
[7] DERRIEN S, KONOLIGE K. Approximating a single viewpoint in panoramic imaging devices [C] ∥ Robotics and Automation. San Francisco: IEEE, 2000:3931-3938.
[8] BRASSART E, DELAHOCHE L, CAUCHOIS C, et al. Experimental results got with the omnidirectional vision sensor: SYCLOP [C] ∥ Omnidirectional Vision. South Carolina: IEEE, 2000: 145-152.
[9] CHAHL J S, SRINIVASAN M V. Reflective surfaces for panoramic imaging [J]. Applied Optics, 1997, 36(31): 8275-8285.
[10] SRINIVASAN M V, THURROWGOOD S, SOCCOL D. An optical system for guidance of terrain following in UAVs [C] ∥ Video and Signal Based Surveillance. Sydney: IEEE, 2006: 51.
[11] SOCCOL D, THURROWGOOD S, SRINIVASAN M V. A vision system for optic flow based guidance of UAVs [C] ∥ Proceedings of the Australasian Conference on Robotics and Automation. Brisbane: ACRA, 2007.
[12] INGARD K U. Fundamentals of waves and oscillations [M]. Cambridge: Cambridge University Press, 1988.
[13] LUCAS B D, KANADE T. An iterative image registration technique with an application to stereo vision [C] ∥ International Joint Conference on Artificial Intelligence. Vancouver: IJCAI, 1981, 81: 674-679.
[14] BOUGUET J Y. Pyramidal implementation of the affine Lucas-Kanade feature tracker: description of the algorithm [R]. Intel Corporation, 2001, 5: 1-10.
[15] CANNY J. A computational approach to edge detection [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1986, (6): 679-698.
[16] FISCHLER M A, BOLLES R C. Random sample consensus: a paradigm for model fitting with applications to image analysis and automated cartography [J]. Communications of the ACM, 1981, 24(6): 381-395.
