
Please use this identifier to cite or link to this item: http://hdl.handle.net/1942/23118

Title: Low-Latency Lossless Video Compression Methods for Multi-camera Systems
Authors: Zhang, Wanqiu
Stukken, Bart
Chen, Caikou
Claesen, Luc
Ouyang, Wenhan
Issue Date: 2016
Citation: ICT.OPEN 2016: The interface for Dutch ICT-Research, Amersfoort, The Netherlands, 22-23/03/2016
Abstract: A multi-camera system combines multiple cameras to visualize a common scene. Applications include omnidirectional video, stereo-, tri-, and multi-ocular vision, 3D modeling, and view interpolation. Standard lossy video compression algorithms are effective and produce pleasing results for human viewers, but they often remove or alter visual cues that are essential in multi-camera applications, where detailed information from two or more cameras must be matched accurately (e.g. disparity calculation, plane sweeping). Lossless compression is therefore preferred in multi-camera systems; even lossless compression reduces the required communication bandwidth and storage space significantly. For many applications the compression must not add excessive latency between image capture and processing, and most multi-camera video applications require real-time image processing. This research therefore focuses on predictive-corrective coding filters combined with entropy coding (i.e. Huffman coding) as a lossless method, operating on raw sensor data with a color filter array (i.e. Bayer pattern), with the aim of an efficient System-on-Chip implementation. Constructing a Huffman code is an inherently sequential process that takes considerable time to compute, adding frame delay and latency. This paper presents a method that addresses this problem by describing the prediction-error histogram with a suitable probability density function, so that only a few distribution parameters need to be transmitted instead of the whole Huffman table; it also compares several probability density functions against the measured values. The resulting compression ratios show that assuming a Gaussian or a Laplace probability density function comes very close to the actual values, with the Laplace distribution being the better model.
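The core idea of the abstract — modeling the prediction-error histogram with a parametric density so that only a few parameters need to be sent, then comparing candidate densities by the code length they imply — can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the synthetic Laplacian-shaped residuals, the simple moment-based parameter estimates, and all function names are assumptions made for the example.

```python
import numpy as np

def laplace_pmf(values, mu, b):
    # Discretized Laplace density, renormalized over the residual range.
    p = np.exp(-np.abs(values - mu) / b) / (2.0 * b)
    return p / p.sum()

def gaussian_pmf(values, mu, sigma):
    # Discretized Gaussian density, renormalized over the residual range.
    p = np.exp(-0.5 * ((values - mu) / sigma) ** 2)
    return p / p.sum()

def model_bits_per_symbol(residuals, pmf):
    # Average code length (bits/symbol) if the code is derived from the
    # model probabilities instead of the measured histogram: the cross
    # entropy between the empirical distribution and the model.
    values = np.arange(residuals.min(), residuals.max() + 1)
    counts = np.bincount(residuals - residuals.min(), minlength=len(values))
    p = np.maximum(pmf(values), 1e-12)  # guard against log(0)
    return float(-(counts / counts.sum() * np.log2(p)).sum())

# Stand-in for prediction errors of a spatial predictor on image data;
# such residuals are typically sharply peaked around zero.
rng = np.random.default_rng(0)
residuals = rng.laplace(0.0, 4.0, size=100_000).round().astype(np.int64)

mu = residuals.mean()
b = np.abs(residuals - mu).mean()   # moment estimate of the Laplace scale
sigma = residuals.std()             # moment estimate of the Gaussian scale

lap = model_bits_per_symbol(residuals, lambda v: laplace_pmf(v, mu, b))
gau = model_bits_per_symbol(residuals, lambda v: gaussian_pmf(v, mu, sigma))
print(f"Laplace model: {lap:.3f} bits/symbol, Gaussian model: {gau:.3f} bits/symbol")
```

Only the two fitted parameters (location and scale) would need to be transmitted per frame, from which the decoder can rebuild the same code table — in contrast to sending a full Huffman table for all residual values.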
URI: http://hdl.handle.net/1942/23118
Category: C2
Type: Conference Material
Appears in Collections: Research publications

Files in This Item:

Description  Size       Format
Poster       277.78 kB  Adobe PDF

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.