Front. Inform. Technol. Electron. Eng.  2014, Vol. 15 Issue (3): 174-186    DOI: 10.1631/jzus.C1300194
    
K-nearest neighborhood based integration of time-of-flight cameras and passive stereo for high-accuracy depth maps
Li-wei Liu, Yang Li, Ming Zhang, Liang-hao Wang, Dong-xiao Li
Institute of Information and Communication Engineering, Zhejiang University, Hangzhou 310027, China; Zhejiang Provincial Key Laboratory of Information Network Technology, Hangzhou 310027, China
Abstract (Chinese): Research objective: Scene depth acquisition is one of the most critical technologies in current 3D display research. Two mainstream approaches dominate: passive stereo matching based on binocular image pairs, and ToF depth camera systems based on active-light ranging. Each yields depth with its own strengths and weaknesses; this paper analyzes both and fuses their results to produce higher-quality scene depth.
Key innovations: The reliable regions of the ToF depth camera are used to guide the stereo matching process, improving the stereo matching result; in addition, a new cost-optimization depth fusion algorithm is proposed that fuses the ToF camera measurements and the stereo-matching depth into a higher-accuracy depth map.
Methods: The approach consists of two stages (the workflow is shown in Fig. 1). First, an energy function is constructed from the depth measurement map and the corresponding intensity-amplitude map provided by the ToF camera; combined with a K-nearest neighbor algorithm, this energy function guides the initial stereo matching (a code sketch of this stage follows this abstract). Then, the refined stereo matching result is combined with the ToF depth map to build a cost function, and the optimal depth solution is selected as the final fused depth.
Conclusions: Experimental results show that the depth maps obtained by the proposed algorithm are superior to those produced by either the active or the passive method alone, and also to a class of global-optimization depth fusion algorithms.
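The abstract does not give the exact form of the ToF-guided energy function or of the K-nearest-neighborhood step, so the following Python sketch only illustrates the general idea under simple assumptions: the ToF amplitude image is mapped to a per-pixel confidence, reliable ToF disparities are spread into low-confidence regions by a K-nearest-neighbor fill, and the resulting prior biases a basic SAD cost volume toward the ToF prediction. All function names, weights, and thresholds (tof_confidence, lam, conf_thresh, etc.) are illustrative, not taken from the paper.

# Minimal sketch (not the paper's actual formulation): ToF-guided stereo matching.
# Assumptions: grayscale rectified stereo pair; ToF depth and amplitude already
# registered to the left view; SAD data term plus a confidence-weighted prior
# that pulls each pixel's disparity toward the ToF-predicted value.
import numpy as np
from scipy.spatial import cKDTree

def tof_confidence(amplitude, amp_min=50.0, amp_max=800.0):
    """Map the ToF amplitude image to a [0, 1] confidence (assumed linear ramp)."""
    return np.clip((amplitude - amp_min) / (amp_max - amp_min), 0.0, 1.0)

def tof_depth_to_disparity(depth_m, focal_px, baseline_m, eps=1e-6):
    """Convert ToF depth (meters) to stereo disparity (pixels): d = f * B / Z."""
    return focal_px * baseline_m / np.maximum(depth_m, eps)

def knn_fill_prior(tof_disp, conf, k=4, conf_thresh=0.5):
    """Replace low-confidence ToF disparities by an inverse-distance-weighted
    average of the k nearest reliable pixels (a simplified KNN stand-in)."""
    ys, xs = np.nonzero(conf >= conf_thresh)   # reliable pixels
    qy, qx = np.nonzero(conf < conf_thresh)    # pixels to fill
    if len(ys) == 0 or len(qy) == 0:
        return tof_disp
    k = min(k, len(ys))
    tree = cKDTree(np.column_stack([ys, xs]))
    dist, idx = tree.query(np.column_stack([qy, qx]), k=k)
    dist = dist.reshape(len(qy), k)
    idx = idx.reshape(len(qy), k)
    w = 1.0 / np.maximum(dist, 1e-6)
    vals = tof_disp[ys[idx], xs[idx]]
    filled = tof_disp.copy()
    filled[qy, qx] = np.sum(w * vals, axis=1) / np.sum(w, axis=1)
    return filled

def guided_cost_volume(left, right, tof_disp, conf, max_disp, lam=0.1):
    """Cost volume C(y, x, d) = |I_L(x) - I_R(x - d)| + lam * conf * |d - d_ToF|."""
    left = np.asarray(left, dtype=np.float64)
    right = np.asarray(right, dtype=np.float64)
    h, w = left.shape
    cost = np.empty((h, w, max_disp))
    for d in range(max_disp):
        shifted = np.full_like(right, 255.0)   # out-of-view pixels get a high penalty
        shifted[:, d:] = right[:, :w - d]
        sad = np.abs(left - shifted)           # photometric (data) term
        prior = conf * np.abs(d - tof_disp)    # ToF prior term, confidence-weighted
        cost[:, :, d] = sad + lam * prior
    return cost

def winner_take_all(cost):
    """Initial disparity map: per-pixel minimum of the guided cost."""
    return np.argmin(cost, axis=2)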
Abstract: Both time-of-flight (ToF) cameras and passive stereo can provide depth information for captured real scenes, but each has innate limitations. ToF cameras and passive stereo are intrinsically complementary for certain tasks, so it is desirable to appropriately leverage all the information they provide. Although some fusion methods have been presented recently, they fail to consider ToF reliability detection and ToF-based improvement of passive stereo. This study therefore proposes an approach to integrating ToF cameras and passive stereo to obtain high-accuracy depth maps. The main contributions are: (1) an energy cost function is devised to use data from ToF cameras to boost the stereo matching of passive stereo; (2) a fusion method is used to combine the depth information from both ToF cameras and passive stereo to obtain high-accuracy depth maps. Experiments show that the proposed approach achieves improved results with high accuracy and robustness.
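As with the matching stage, the exact fusion cost is not specified here; the sketch below assumes a simple confidence-weighted quadratic cost evaluated over per-pixel candidate depths spanning the two hypotheses (ToF and refined stereo), keeping the minimizer as the fused depth. stereo_confidence is a common best/second-best heuristic and only an assumed stand-in for the reliability measure used in the paper.

# Minimal sketch of a cost-based fusion step under assumed weights and cost
# form; the paper's actual cost function is not given in this abstract.
import numpy as np

def stereo_confidence(cost_volume, eps=1e-6):
    """Assumed stereo reliability: 1 - (best / second-best) matching cost ratio."""
    part = np.partition(cost_volume, 1, axis=2)
    best, second = part[:, :, 0], part[:, :, 1]
    return 1.0 - best / np.maximum(second, eps)

def fuse_depths(d_tof, d_stereo, c_tof, c_stereo, n_cand=32):
    """Per pixel, minimize C(d) = c_tof*(d - d_tof)^2 + c_stereo*(d - d_stereo)^2
    over candidate depths spanning the two hypotheses; keep the minimizer."""
    lo = np.minimum(d_tof, d_stereo)
    hi = np.maximum(d_tof, d_stereo)
    best_d = d_tof.astype(np.float64)
    best_c = np.full(d_tof.shape, np.inf)
    for t in np.linspace(0.0, 1.0, n_cand):
        d = lo + t * (hi - lo)                                        # candidate depth
        c = c_tof * (d - d_tof) ** 2 + c_stereo * (d - d_stereo) ** 2  # fusion cost
        better = c < best_c
        best_d = np.where(better, d, best_d)
        best_c = np.where(better, c, best_c)
    return best_d

With quadratic penalties the per-pixel minimizer tends toward the confidence-weighted average of the two measurements; the explicit candidate search is kept only to mirror the "select the optimal depth solution" description.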
Key words: Depth map; Passive stereo; Time-of-flight camera; Fusion
Received: 2013-07-15    Published: 2014-03-05
CLC:  TP317.4  
Cite this article:

Li-wei Liu, Yang Li, Ming Zhang, Liang-hao Wang, Dong-xiao Li. K-nearest neighborhood based integration of time-of-flight cameras and passive stereo for high-accuracy depth maps. Front. Inform. Technol. Electron. Eng., 2014, 15(3): 174-186.

Link to this article:

http://www.zjujournals.com/xueshu/fitee/CN/10.1631/jzus.C1300194        http://www.zjujournals.com/xueshu/fitee/CN/Y2014/V15/I3/174

[1] Da-fang Zhang, Dan Chen, Yan-biao Li, Kun Xie, Tong Shen. 虚拟化路由器中基于融合再拆分的多表压缩及快速重构机制[J]. Front. Inform. Technol. Electron. Eng., 2016, 17(12): 1266-1274.
[2] Qian-shan Li, Rong Xiong, Shoudong Huang, Yi-ming Huang. 一种利用半稠密点云及RGB图像构建稠密表面模型地图的方法[J]. Front. Inform. Technol. Electron. Eng., 2015, 16(7): 594-606.
[3] Qi-rong Mao, Xin-yu Pan, Yong-zhao Zhan, Xiang-jun Shen. 基于Kinect的实时面部情感识别[J]. Front. Inform. Technol. Electron. Eng., 2015, 16(4): 272-282.
[4] Yang Chen, Zheng Qin. 基于梯度的压缩感知图像融合[J]. Front. Inform. Technol. Electron. Eng., 2015, 16(3): 227-237.
[5] Jie Chen, Can-jun Yang, Jens Hofschulte, Wan-li Jiang, Cha Zhang. 基于光学摄像系统和惯性传感器数据融合的机器人运动跟踪系统[J]. Front. Inform. Technol. Electron. Eng., 2014, 15(7): 574-583.