Multiple description coding (MDC) is a powerful source coding technique that involves encoding a media stream into r independently decodable substreams. With every successful reception of a substream, the decoded signal quality improves. We consider the problem of placing a set of servers in the network such that a desired quality of service can be provided to a community of clients. We formulate the...
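The polyphase split below is a minimal sketch of the MDC idea (interleaved sample assignment, not the paper's server-placement formulation): each of r descriptions carries every r-th sample, any received subset is decodable, and missing samples are concealed crudely from received neighbours.

```python
def mdc_split(samples, r):
    """Polyphase MDC sketch: description j gets every r-th sample
    starting at offset j, so each description is decodable on its own."""
    return [samples[j::r] for j in range(r)]

def mdc_merge(descriptions, received, n):
    """Reconstruct from whichever descriptions arrived; missing samples
    are filled with the nearest previously received value (a crude
    concealment stand-in for the quality/loss tradeoff)."""
    r = len(descriptions)
    out = [None] * n
    for j in received:
        out[j::r] = descriptions[j]
    last = next(v for v in out if v is not None)
    for i, v in enumerate(out):
        if v is None:
            out[i] = last   # conceal the lost sample
        else:
            last = v
    return out
```

Receiving both descriptions recovers the signal exactly; receiving one still yields a coarse, usable version.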
Let S be an n-character string terminated with a unique smallest sentinel; its suffix array SA(S) is an array of pointers to all the suffixes of S, sorted in ascending lexicographic order. In particular, the Burrows-Wheeler transform used in efficient compression solutions can be computed quickly by fast suffix sorting based on suffix array construction algorithms (SACAs). The existing well-known...
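Both definitions can be sketched naively (this sort-based construction is O(n^2 log n) and for illustration only; practical SACAs such as SA-IS achieve linear time):

```python
def suffix_array(s):
    """Naive suffix array: append a unique smallest sentinel, then sort
    the suffix start positions lexicographically."""
    s += "\x00"  # sentinel smaller than any real character
    return sorted(range(len(s)), key=lambda i: s[i:])

def bwt(s):
    """Burrows-Wheeler transform via the suffix array: for each suffix in
    sorted order, output the character that precedes it (wrapping)."""
    sa = suffix_array(s)
    t = s + "\x00"
    return "".join(t[i - 1] for i in sa)
```

For "banana" this yields the familiar BWT output "annb" with the sentinel and two grouped "a"s, which is what makes the transform so compressible.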
Summary form only given. The ability of the human visual system to perceive colors as approximately constant irrespective of the illuminant is called color constancy. The output of a sensor varies with the type of illuminant used and is therefore not color constant. Color constancy algorithms try to compute a color constant descriptor from the measured data (M. Ebner, 2007). We show how color constancy...
Recent publications advocate the use of various variable-length codes in which each codeword consists of an integral number of bytes, for compression applications using large alphabets. This paper shows that another tradeoff with similar properties can be obtained by Fibonacci codes. These are fixed codeword sets, using binary representations of integers based on Fibonacci numbers of order m ≥ 2...
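For order m = 2 this is the classical Fibonacci (Zeckendorf) code; a small sketch, assuming the standard convention that digits are written least-significant first and a terminating '1' is appended, so every codeword ends in "11":

```python
def fib_encode(n):
    """Order-2 Fibonacci code for n >= 1: greedy Zeckendorf digits over
    the Fibonacci numbers 1, 2, 3, 5, 8, ..., least-significant first,
    followed by a terminating '1'."""
    fibs = [1, 2]
    while fibs[-1] + fibs[-2] <= n:
        fibs.append(fibs[-1] + fibs[-2])
    while fibs[-1] > n:          # drop Fibonacci numbers larger than n
        fibs.pop()
    bits = ["0"] * len(fibs)
    for i in range(len(fibs) - 1, -1, -1):
        if fibs[i] <= n:         # greedy: take the largest fitting term
            bits[i] = "1"
            n -= fibs[i]
    return "".join(bits) + "1"
```

Because Zeckendorf representations never use two consecutive Fibonacci numbers, "11" occurs only at the end of a codeword, which makes the code instantaneously decodable and robust to bit errors.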
We characterize the rate-distortion function for zero-mean stationary Gaussian sources under the MSE fidelity criterion, subject to the additional constraint that the distortion is uncorrelated with the input. The solution is given by two equations coupled through a single scalar parameter. This has a structure similar to the well-known water-filling solution obtained without the uncorrelated-distortion...
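For reference, the classical reverse water-filling solution without the uncorrelated-distortion constraint can be sketched numerically; theta below plays the role of the single scalar (water-level) parameter:

```python
import math

def reverse_waterfill(variances, theta):
    """Classical reverse water-filling for independent Gaussian
    components: a component of variance s2 takes distortion
    min(theta, s2) and rate 0.5 * log2(s2 / theta) when s2 > theta,
    else zero rate."""
    D = sum(min(theta, s2) for s2 in variances)
    R = sum(0.5 * math.log2(s2 / theta) for s2 in variances if s2 > theta)
    return R, D
```

Sweeping theta traces out the rate-distortion curve: components whose variance falls below the water level are simply not coded.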
DCA (Data Compression using Antidictionaries) is a novel lossless data compression method for bit streams, introduced by Crochemore et al. DCA takes advantage of words that do not occur as factors in the text, i.e. that are forbidden. Due to these forbidden words (antiwords), some symbols in the text can be predicted. We build the antidictionary using a suffix array in O(k * N log N) time, where...
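The notion of a forbidden word (minimal absent word) can be illustrated by brute force on a binary text; the paper's suffix-array construction is far more efficient, and max_len here is just an illustrative bound:

```python
from itertools import product

def minimal_forbidden_words(text, max_len):
    """Brute-force minimal absent words of a binary text up to max_len:
    a word w is minimal forbidden if w never occurs as a factor of text
    while both its longest proper prefix and its longest proper suffix
    do occur."""
    factors = {text[i:j] for i in range(len(text))
               for j in range(i + 1, min(i + max_len, len(text)) + 1)}
    factors.add("")  # the empty word occurs everywhere
    mfw = []
    for length in range(1, max_len + 1):
        for w in map("".join, product("01", repeat=length)):
            if w not in factors and w[:-1] in factors and w[1:] in factors:
                mfw.append(w)
    return mfw
```

For the text "0101" the antiwords "00" and "11" mean that after reading a "0" the next bit is forced to be "1" (and vice versa), which is exactly the prediction DCA exploits.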
We present a theoretical analysis of a perceptual coding approach, the so-called Weber quantizer. Extensive studies performed by experimental psychologists and physiologists have unveiled one major conclusion: human perception often follows Weber's law. Ernst Weber was an experimental physiologist who in 1834 first discovered the following relation, ΔI = kI, where ΔI is the so-called difference...
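Weber's law motivates quantizers whose step size grows with the signal magnitude; the sketch below (an illustrative construction, not the paper's exact quantizer) uses geometric bin edges so that each step is a fixed fraction k of the intensity:

```python
import math

def weber_quantize(x, k):
    """Weber-law-inspired quantizer sketch: bin edges form the geometric
    sequence (1 + k)^n, so the step width is roughly k times the
    intensity (Delta I = k * I) and the relative error is bounded."""
    if x <= 0:
        raise ValueError("this sketch quantizes positive intensities only")
    n = math.floor(math.log(x) / math.log(1 + k))   # bin index of x
    # reconstruct at the geometric midpoint of [(1+k)^n, (1+k)^(n+1))
    return (1 + k) ** n * math.sqrt(1 + k)
```

All intensities within one multiplicative step of each other collapse to the same reconstruction level, mirroring the perceptual indistinguishability that Weber's law describes.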
Delivering high-quality audio over error-prone wireless networks has to address many challenges. These include reducing the packetization overhead of small audio frames, as well as protecting audio content against both bit errors introduced by wireless channels and packet erasures introduced by network buffering. We propose a statistical optimization framework for transmitting...
Let A = {a1, a2, ..., a26} be the set of the 26 English letters with probabilities P = {p1, p2, ..., p26}, and let C be the corresponding Huffman code with 26 codewords {c1, c2, ..., c26} and average codeword length 4.15573. The respective lengths of the codewords are given by {ℓ1, ℓ2, ..., ℓ26}. According to P, a concatenation of 1000 symbols randomly generated from A is Huffman-encoded into a bitstream...
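The codeword lengths ℓi of such a Huffman code follow from the standard two-smallest-merge construction; the sketch below uses a toy 4-symbol distribution in place of the 26-letter English one:

```python
import heapq

def huffman_lengths(probs):
    """Codeword lengths of a Huffman code: repeatedly merge the two
    least probable subtrees; every symbol inside a merged subtree gains
    one bit of codeword length."""
    heap = [(p, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, s1 = heapq.heappop(heap)
        p2, s2 = heapq.heappop(heap)
        for i in s1 + s2:           # one level deeper in the code tree
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, s1 + s2))
    return lengths
```

The average codeword length is then sum(pi * ℓi); run on the actual English letter probabilities, this sum is the 4.15573 bits/symbol quoted above.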
We intuitively derive DC and AC constraints to model local block complexities using the ordered Hadamard transform, starting from a pixel-based gradient method. For lossless motion estimation (ME), using (1), we obtain the optimized search order in the matching-error calculation by sorting the local block constraints, the sum of these two constraints, in descending order: LBC(k) = AC_Sum(k) + |DC_MB - DC_LB(k)|. (1)
In this work we study simple joint source-channel coding (JSCC) schemes for transmitting k independent Gaussian source samples with different variances over k additive white Gaussian noise (AWGN) channels. The channels are subject to an average power constraint, and the distortion measure is the mean squared error. This problem arises in several practical cases, e.g. in transform coding where the output...
Interest in distributed source coding (DSC) has increased in recent years due to the development of wireless networks. In this paper we propose a solution based on a new rateless class of codes, the Raptor codes. In real applications (where the data source length and the correlation between the sources may vary), rateless codes can be naturally adapted by generating just a single codeword with suitable...
Summary form only given. Recently, a technique called bit recycling (BR) was introduced to help reduce the redundancy caused by the multiplicity of encodings. It has been used to improve LZ77 compression, which is especially prone to admit numerous different compressed files for a given original file F; this multiplicity of encodings causes redundancy. Instead of trying to eliminate...
One major task in multiple description video coding is to prevent drift over packet-loss channels, where transmission errors occur in each description. We propose a distributed multiple description video coding (DMDVC) scheme that excludes any prediction loops, so the new codec suffers from no drift problem. In the two-channel mode with symmetric side information (SI), one side decoder can use the SI of the...
List update algorithms have been widely used as subroutines in compression schemes, most notably as part of Burrows-Wheeler compression. We performed an experimental comparison of various list update algorithms, both as stand-alone compression mechanisms and as a second stage of BWT-based compression. We considered the following list update algorithms: move-to-front (MTF), sort-by-rank (SBR), frequency-count...
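MTF, the simplest of these, can be sketched in a few lines; on BWT output, where equal symbols cluster, it produces runs of small integers that a final entropy coder compresses well:

```python
def mtf_encode(data, alphabet):
    """Move-to-front: emit each symbol's current position in the list,
    then move that symbol to the front."""
    lst = list(alphabet)
    out = []
    for c in data:
        i = lst.index(c)
        out.append(i)
        lst.insert(0, lst.pop(i))   # accessed symbol moves to the front
    return out
```

The other strategies (SBR, frequency-count, timestamp) differ only in the rule used to reorder the list after each access.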
This paper proposes an audio-visual speech recognition system using SVMs (support vector machines) for European and American Portuguese. The main objective of this work is to find a model that can be used in both languages. Furthermore, two new methods to extract the mouth region (ROI, region of interest) and the lip contour are presented. Two audio and four video features are used in the experiments...
A novel object-based fractal monocular and stereo video compression scheme with quadtree-based motion and disparity compensation is proposed in this paper. Fractal coding is adopted, and each object is encoded independently using a prior image-segmentation alpha plane, defined exactly as in MPEG-4. The first n frames of the right video sequence are encoded by using the circular prediction mapping...
We propose a system for audio coding using the modulated complex lapped transform (MCLT). In general, it is difficult to encode signals using overcomplete representations without incurring a penalty in rate-distortion performance. We show that the penalty can be significantly reduced for MCLT-based representations, without the need for iterative methods of sparsity reduction. We achieve that via a...
In this paper we propose a chunkless structure for multimedia distribution over peer-to-peer networks. With the proposed approach, the original multimedia packets are partitioned into several reduced packets whose transmission requires a fraction 1/R of the bandwidth required by the original data. The generation of the reduced packets depends on a random seed that can be chosen independently by each node...
This paper proposes IPzip, a comprehensive suite of algorithms for compressing IP network packet headers and payloads. We propose an online algorithm for compressing packets in real-time for efficient transfer and an offline algorithm for efficient storage of the network data. In contrast to related approaches, IPzip achieves better compression by exploiting the correlations exhibited by (i) packets...