J4  2010, Vol. 44 Issue (6): 1049-1056    DOI: 10.3785/j.issn.1008-973X.2010.06.001
Automation Technology, Computer Technology
Data processing method of time-of-flight 3D imaging camera
PAN Hua-dong1, WANG Qi-cong2, XIE Bin1, XU Shi-fang1, LIU Ji-lin1
1. Institute of Information and Communication Engineering, Zhejiang Provincial Key Laboratory of Information Network Technology, Zhejiang University, Hangzhou 310027, China; 2. Department of Computer Science, Xiamen University, Xiamen 361005, China
Abstract:

Measurement results of a time-of-flight 3D imaging camera suffer from center-point offset, range ambiguity and mixed pixels, and are easily affected by the exposure time and by active light sources. To improve the validity and accuracy of the measurement results, the measurement data were processed as follows. The camera was calibrated to reduce the error introduced in the transformation from spherical distances to Cartesian coordinates. Range ambiguity was eliminated according to the difference between two measurements taken alternately with two different light-source modulation frequencies. Under over-exposure the measured amplitude decreases as the exposure time increases; accordingly, a fast auto-exposure control method based on a region of interest was proposed. Boundary mixed pixels appear as single points or single lines; accordingly, whether a pixel is a mixed pixel was determined from the position distribution of the pixels in its neighborhood. The noise level was judged from the measured amplitude, and the active light source was identified from the characteristic that, when the active light source is imaged, the measured amplitude is very large and the offset is very small. Experimental results show that these methods effectively improve the reliability and accuracy of the data.

Key words: 

Published: 2010-07-16
CLC number: TP 242.6
Foundation items:

Supported by the Key Program of the National Natural Science Foundation of China (60534070) and the National Natural Science Foundation of China (60302013).

Corresponding author: LIU Ji-lin, male, professor, doctoral supervisor. E-mail: liujl@zju.edu.cn
About the first author: PAN Hua-dong (1980—), male, from Jinhua, Zhejiang, Ph.D. candidate, engaged in research on 3D imaging and computer vision. E-mail: phd_zju@yahoo.com.cn

Cite this article:

PAN Hua-dong, WANG Qi-cong, XIE Bin, XU Shi-fang, LIU Ji-lin. Data processing method of time-of-flight 3D imaging camera[J]. J4, 2010, 44(6): 1049-1056.

Link to this article:

http://www.zjujournals.com/eng/CN/10.3785/j.issn.1008-973X.2010.06.001        http://www.zjujournals.com/eng/CN/Y2010/V44/I6/1049

[1] KAHMANN T, REMONDINO F, GUILLAUME S. Range imaging technology: new developments and applications for people identification and tracking [C]∥ Proceedings of Videometrics IX, SPIE-IS&T Electronic Imaging. San Jose: SPIE, 2007: 64910C.1-64910C.12.

[2] PREMEBIDA C, MONTEIRO G, NUNES U, et al. A Lidar and vision-based approach for pedestrian and vehicle detection and tracking [C]∥ Proceedings of the 2007 IEEE Intelligent Transportation Systems Conference. Seattle: IEEE, 2007: 1044-1049.

[3] BUTTGEN B, OGGIER T, LEHMANN M. High-speed and high-sensitive demodulation pixel for 3D-Imaging [C]∥ Proceedings of SPIE-IS&T Electronic Imaging. San Jose: SPIE, 2006: 22-33.

[4] MESA Imaging AG. SR3000 data sheet [EB/OL]. [2009-02-20]. http://www.mesa-imaging.ch/pdf/SR3K_Flyer_Feb09.pdf.

[5] KAHMANN T, INGENSAND H. Calibration and improvements of the high-resolution range-imaging camera SwissRanger [C]∥ Proceedings of SPIE 2005. San Jose: SPIE, 2005: 144-155.

[6] LANGE R, SEITZ P. Solid-state time-of-flight range camera [J]. IEEE Journal of Quantum Electronics, 2001, 37(3): 390-397.

[7] MA Song-de, ZHANG Zheng-you. Computer vision: computational theory and algorithmic foundations [M]. Beijing: Science Press, 2003.

[8] ZHANG Z. A flexible new technique for camera calibration [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2000, 22(11): 13301334.

[9] ZHANG Qi, LIU Ji-lin, GUO Xiao-jun, et al. 3D measurements from imaging laser sensor: problems and strategies [J]. Journal of Zhejiang University: Engineering Science, 1998, 32(8): 732-738.

[10] LI Yun-yan, HU Chuan-rong. Experiment design and data processing [M]. Beijing: Chemical Industry Press, 2008.
