
Research And Implementation Of Event-Based Filter And Dense Depth Estimation

Posted on: 2023-04-07    Degree: Master    Type: Thesis
Country: China    Candidate: L X Jing    Full Text: PDF
GTID: 2558307169483124    Subject: Software engineering
Abstract/Summary:
Event cameras are bio-inspired dynamic vision sensors that asynchronously measure per-pixel intensity changes instead of capturing intensity images at a fixed rate. The emerging event camera shows great potential for robotics and AR/VR applications thanks to its high temporal resolution, high dynamic range, low power consumption, and absence of motion blur. Event-based depth estimation aims to provide a reasonable interpretation of the observed scene, which is a crucial step in high-level robotic tasks, and it is well suited to scenarios with poor lighting, limited battery capacity, or high-speed response requirements. However, event cameras are very sensitive to environmental changes and generate a large amount of noise, which degrades performance in downstream computer vision tasks; specialized denoising algorithms are therefore needed. In addition, the sparsity of the event stream makes it challenging to estimate depth for every pixel. This thesis studies event-based filtering and dense depth estimation, and proposes an effective two-stage local spatio-temporal event filter as well as an improved event-based dense depth estimation method with optical flow compensation. A prototype system comprising the event filter and the depth estimator is implemented, and the research results are experimentally verified on this basis. The main contributions are threefold:

(1) For complex scenarios with heavy noise, a two-stage local spatio-temporal event filter based on adaptive thresholds is proposed. Exploiting the spatio-temporal constraints of the generated event stream, a local sliding window is used in both the noise-candidate selection stage and the noise-filtering stage, and an adaptive thresholding mechanism is introduced to improve generalization. Experimental evaluations on real-world datasets show that the proposed filter successfully denoises the event stream while effectively preserving useful scene information, achieving an 11.9% improvement in denoising effectiveness over the baseline methods.

(2) To address the drawback of existing learning-based depth estimation methods, which sacrifice temporal information by converting events into image-like representations, a monocular event-based dense depth estimation architecture incorporating optical flow is proposed. It predicts inter-grid optical flow to compensate for the lost temporal correlation; contextual depth maps are then estimated and fused into a robust depth map corresponding to the events of interest. A local smoothing loss is designed to encourage reasonable depth estimates at pixels where no events occur, and squeeze-and-excitation blocks are introduced into the residual blocks of the network to enhance its representation by selecting informative feature channels. Extensive experiments on both real-world and synthetic datasets show that the proposed architecture accurately estimates scene depth from asynchronous event streams and generalizes well across datasets. Compared with existing event-based depth estimation methods, it improves the average absolute error by an average of 11.8% in day scenes and 19.0% in the more challenging night scenes.

(3) Building on the above research, the event-based noise filter is integrated with the dense depth estimation to design and implement a prototype system based on the DAVIS346 event camera. Experiments on public event datasets verify the functionality of the implemented prototype system.
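As a rough illustration of the filtering idea in contribution (1), the sketch below implements a two-stage spatio-temporal consistency check with an adaptive support threshold. It is a minimal reconstruction under stated assumptions, not the thesis' exact algorithm: the neighbourhood size, temporal window, and adaptive-threshold update rule are choices made for demonstration only.

```python
"""Minimal sketch of a two-stage local spatio-temporal event filter.

Illustrative reconstruction only: the stage-1/stage-2 decision rules and the
adaptive threshold below are assumptions, not the thesis' published method.
"""
import numpy as np


class TwoStageEventFilter:
    def __init__(self, width, height, radius=1, time_window=5000,
                 base_support=2, adapt_rate=0.01):
        self.radius = radius              # spatial half-size of the local sliding window
        self.time_window = time_window    # temporal window (microseconds assumed)
        self.base_support = base_support  # fixed support level for candidate selection
        self.adapt_rate = adapt_rate      # smoothing factor for the adaptive threshold
        self.density = 0.0                # running estimate of local event support
        # timestamp surface: last event time seen at each pixel
        self.last_ts = np.full((height, width), -np.inf)

    def _neighbour_support(self, x, y, t):
        """Count pixels in the local window that fired within the temporal window."""
        y0, y1 = max(0, y - self.radius), min(self.last_ts.shape[0], y + self.radius + 1)
        x0, x1 = max(0, x - self.radius), min(self.last_ts.shape[1], x + self.radius + 1)
        patch = self.last_ts[y0:y1, x0:x1]
        return int(np.count_nonzero(t - patch <= self.time_window))

    def process(self, x, y, t, p):
        """Return True to keep the event as signal, False to discard it as noise.
        Polarity p is not used in this simplified check."""
        support = self._neighbour_support(x, y, t)

        # Stage 1: candidate selection -- well-supported events pass immediately.
        is_candidate = support < self.base_support

        # Adaptive threshold: track average support so busier scenes demand more evidence.
        self.density = (1 - self.adapt_rate) * self.density + self.adapt_rate * support
        adaptive_support = max(1, int(round(0.5 * self.density)))

        # Stage 2: noise filtering -- candidates survive only if they still meet
        # the scene-dependent adaptive support level.
        keep = (not is_candidate) or (support >= adaptive_support)

        self.last_ts[y, x] = t  # update the timestamp surface regardless of the decision
        return keep
```

Events would be streamed through `process` one by one (for a DAVIS346, `width=346, height=260`). Updating the timestamp surface even for rejected events is one possible design choice: a discarded event can still lend support to later events at the same location.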
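Contribution (2) mentions introducing squeeze-and-excitation (SE) blocks into the residual blocks of the depth network. The sketch below shows what such an SE-enhanced residual block typically looks like in PyTorch; the channel count, reduction ratio, and block layout are illustrative assumptions, and the thesis' actual architecture may differ.

```python
"""Sketch of a squeeze-and-excitation residual block (channel attention)."""
import torch
import torch.nn as nn


class SEBlock(nn.Module):
    """Squeeze-and-excitation: global pooling -> bottleneck MLP -> channel gates."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        w = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * w  # re-weight (select) feature channels


class SEResidualBlock(nn.Module):
    """Residual block with an SE module applied before the skip connection."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.se = SEBlock(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        out = self.se(out)          # emphasize informative feature channels
        return self.relu(out + x)   # residual connection
```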
Keywords/Search Tags:Depth Estimation, Event Camera, Deep Learning, Event Denoise