J4  2010, Vol. 44 Issue (6): 1049-1056    DOI: 10.3785/j.issn.1008-973X.2010.06.001
    
Data processing method of time-of-flight 3D imaging camera
PAN Hua-dong1, WANG Qi-cong2, XIE Bin1, XU Shi-fang1, LIU Ji-lin1
1. Institute of Information and Communication Engineering, Zhejiang University, Zhejiang Provincial Key Laboratory of Information Network Technology, Hangzhou 310027, China; 2. Department of Computer Science, Xiamen University, Xiamen 361005, China

Abstract  

Measurement results of timeofflight 3D imaging camera have problems of center offset, range ambiguity and mixedpixel, and they are sensitive to exposure time and vulnerable to active light source. The measurement data were processed to improve the effectiveness and accuracy of measurement results. The camera was calibrated so that the error was decreased thanks to the transformation from spherical distances to Cartesian coordinates. The ambiguity was eliminated according to the difference between two measurements taken with two different light modulation frequencies alternately. The amplitude decreases with increasing exposure time due to excessive exposure, accordingly, a rapid autoexposure control method based on region of interest was proposed. Boundary mixed pixels are in form of a single point or a single line, accordingly, the location distribution of each pixel in its neighborhood was a used to determine whether the pixel was a mixed pixel. Noise was determined according to the amplitude, meanwhile, the active light source was identified according to the characteristics that the amplitude was very small and the offset was very big when the active light source was measured. The experimental results showed that the data reliability and accuracy were improved with the above methods.

Published: 16 July 2010
CLC:  TP 242.6  
Cite this article:

PAN Hua-dong, WANG Qi-cong, XIE Bin, XU Shi-fang, LIU Ji-lin. Data processing method of time-of-flight 3D imaging camera. J4, 2010, 44(6): 1049-1056.

URL:

http://www.zjujournals.com/eng/10.3785/j.issn.1008-973X.2010.06.001     OR     http://www.zjujournals.com/eng/Y2010/V44/I6/1049



