Calibration and 3D reconstruction with omnidirectional ranging by optic flow camera
MA Zi-ang, XIANG Zhi-yu
Department of Information Science and Electronic Engineering, Zhejiang University, Hangzhou 310027, China; Zhejiang Provincial Key Laboratory of Information Network Technology, Hangzhou 310027, China
Active sensing has limitations when used in guidance systems for UAVs, since the sensors tend to be bulky and expensive. An omnidirectional ranging camera based on optic flow was introduced, composed of a traditional perspective camera and a reflective mirror with a special imaging characteristic. This characteristic establishes a simple relationship between the distance from a space point to the optical axis of the camera and the magnitude of its optic flow, so that 3D reconstruction of the test scene can be realized directly. Simulation experiments showed that the ranging accuracy of the system is mainly influenced by the angular installation error; therefore, a new parameter calibration method was proposed to improve the ranging accuracy. Results show that 3D reconstruction of a stereo scene can be achieved with the designed camera. The experimental results demonstrate the effectiveness of the proposed calibration and reconstruction method.
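To make the ranging principle concrete, the following Python sketch (not the authors' implementation) shows how the stated relationship could be exploited: sparse features are tracked with pyramidal Lucas-Kanade optic flow between two frames captured while the camera translates along its optical axis, and each flow magnitude is converted into the perpendicular distance D of the corresponding point from the axis under the assumed proportional relation D = k·V/|flow|. The calibration constant k, the translation speed V, and the image coordinates (cx, cy) of the mirror axis are hypothetical placeholders that would come from the calibration step.

```python
# Minimal sketch, not the paper's implementation. Assumes the mirror's imaging
# property reduces to D = k * V / |flow|, where D is the perpendicular distance
# of a scene point from the optical axis, V is the translation speed along the
# axis, |flow| is the tracked image-flow magnitude, and k is a calibration
# constant. k, V, cx, cy and the input frames are hypothetical placeholders.
import numpy as np
import cv2

def range_by_optic_flow(frame0, frame1, cx, cy, k, V):
    """Return (distance-from-axis, azimuth) pairs for tracked feature points."""
    g0 = cv2.cvtColor(frame0, cv2.COLOR_BGR2GRAY)
    g1 = cv2.cvtColor(frame1, cv2.COLOR_BGR2GRAY)

    # Sparse corners in the first frame, tracked with pyramidal Lucas-Kanade.
    p0 = cv2.goodFeaturesToTrack(g0, maxCorners=500, qualityLevel=0.01, minDistance=7)
    if p0 is None:
        return np.empty((0, 2))
    p1, status, _err = cv2.calcOpticalFlowPyrLK(g0, g1, p0, None,
                                                winSize=(21, 21), maxLevel=3)

    results = []
    for a, b, ok in zip(p0.reshape(-1, 2), p1.reshape(-1, 2), status.ravel()):
        if not ok:
            continue
        flow = float(np.linalg.norm(b - a))
        if flow < 1e-3:                     # flow too small for a reliable range
            continue
        D = k * V / flow                    # assumed distance-from-axis relation
        theta = np.arctan2(a[1] - cy, a[0] - cx)  # azimuth about the optical axis
        results.append((D, theta))
    return np.array(results)
```

In a full reconstruction, each (D, theta) pair would still need to be combined with the elevation information encoded by the mirror to place the point in 3D; that step, like the constant k, depends on the calibrated installation parameters discussed in the paper.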
MA Zi-ang, XIANG Zhi-yu. Calibration and 3D reconstruction with omnidirectional ranging by optic flow camera. JOURNAL OF ZHEJIANG UNIVERSITY (ENGINEERING SCIENCE), 2015, 49(9): 1651-1657.