
Compression Research Based On Statistical Coding

Posted on: 2017-03-22
Degree: Master
Type: Thesis
Country: China
Candidate: S S He
Full Text: PDF
GTID: 2347330491450968
Subject: Statistics

Abstract/Summary:
With the development of very large-scale integrated circuits (VLSI), high-performance, low-cost chips have become the mainstream. On one hand, as integration density increases, reused IP (Intellectual Property) cores grow ever more numerous, while the bandwidth, channel count, and clock frequency of external automatic test equipment (ATE) can no longer meet the requirements. Upgrading the ATE hardware would solve the problem, but the cost is too high. On the other hand, as circuits grow, the number of test vectors that must be applied to a chip keeps increasing, which poses further challenges to testing. Test compression is one of the important ways to address these problems: compressing the test data not only removes redundant information but also raises the effective ATE transfer speed, thereby reducing both transmission time and storage capacity. This is significant for data processing in the era of big data, and it forms the background of this research.

First, this thesis introduces the faults that may arise during testing, including soft faults and hard faults, and then presents test-generation techniques intended to reduce the probability of test failure. Second, it analyzes traditional compression methods, chiefly Golomb coding, FDR (frequency-directed run-length) coding, and alternating run-length coding, as well as statistics-based compression methods such as Huffman coding and nine-valued coding. By compressing the data stream with a fixed-block frequency statistics method, a high compression ratio and simple decompression can both be achieved. Traditional methods divide the data stream into runs of 0s or runs of 1s and can therefore handle only one type of run-length code. In contrast, this thesis proposes an exponential division of the data stream that handles 0-run and 1-run encoding as well as alternating (jump) codes within the same division. The division yields distinct continuous and alternating blocks, which can then be compressed with a single prefix form.

Experimental results show that the proposed algorithm achieves an average compression ratio of 62.23%, higher than comparison methods such as Golomb coding and binary encoding. Moreover, whereas traditional codes must locate special marker positions during decompression, the encoding proposed here reconstructs the original data blocks directly from the prefix counts, simplifying restoration and reducing decompression time.
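The division-and-prefix idea above can be illustrated with a small sketch. This is a simplified illustration, not the thesis's actual algorithm: the greedy segmentation rule and the toy prefix format (one type bit, an Elias-gamma length code, and the block's first bit) are assumptions chosen for clarity.

```python
# Illustrative sketch only: a simplified run-length scheme in the spirit of
# the coding described above. It splits a test-data bitstring into
# "continuous" blocks (all 0s or all 1s) and "alternating" blocks
# (0101.../1010...), then emits a toy prefix code per block. The block rules
# and prefix format are assumptions for demonstration, not the thesis's
# actual encoding tables.

def segment(bits):
    """Greedily split a bitstring into continuous ("C") or alternating ("A") blocks."""
    blocks, i = [], 0
    while i < len(bits):
        j = i                       # extend the longest continuous run from i
        while j + 1 < len(bits) and bits[j + 1] == bits[i]:
            j += 1
        k = i                       # extend the longest alternating run from i
        while k + 1 < len(bits) and bits[k + 1] != bits[k]:
            k += 1
        cont_len, alt_len = j - i + 1, k - i + 1
        if cont_len >= alt_len:
            blocks.append(("C", bits[i:i + cont_len]))
            i += cont_len
        else:
            blocks.append(("A", bits[i:i + alt_len]))
            i += alt_len
    return blocks

def gamma(n):
    """Elias-gamma code: (bit-length - 1) zeros followed by n in binary."""
    b = bin(n)[2:]
    return "0" * (len(b) - 1) + b

def encode(blocks):
    """Toy prefix code: 1 type bit + gamma(length) + the block's first bit."""
    return "".join(("0" if kind == "C" else "1") + gamma(len(run)) + run[0]
                   for kind, run in blocks)

bits = "0" * 20 + "1" * 12 + "01" * 8
blocks = segment(bits)   # [("C", 20 zeros), ("C", 12 ones), ("A", 16 alternating bits)]
code = encode(blocks)
# 48 input bits -> 31 code bits for this input (long runs compress well;
# very short runs can expand, as in any run-length scheme)
```

Because each block's prefix is self-delimiting, giving both its type and its length, a decoder can reconstruct the blocks directly from the prefixes without scanning for special marker positions, which mirrors the decompression advantage claimed above.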
Keywords/Search Tags: Power, Continuous, Alternating, Chip test, Coding compression