Module 4: Arithmetic Coding
Prof. Hung-Ta Pai
Data Compression, LISA, NTPU

Reals in Binary
Any real number x in the interval [0, 1) can be represented in binary as .b1b2..., where each bi is a bit.
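For example, 5/8 = 1/2 + 1/8, so 5/8 is written in binary as .101.
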
First Conversion
L := 0; R := 1; i := 1
while x > L do
    if x < (L+R)/2 then bi := 0; R := (L+R)/2
    else bi := 1; L := (L+R)/2
    i := i + 1
end while
bj := 0 for all j ≥ i
Invariant: x is always in the interval [L, R)
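A runnable Python version of this conversion (a sketch; the name to_binary_bits and the max_bits cutoff are illustrative, not from the slides):

def to_binary_bits(x, max_bits=32):
    # Convert a real x in [0, 1) to its binary expansion .b1b2...
    # max_bits is a safety cutoff for expansions that do not terminate.
    assert 0.0 <= x < 1.0
    L, R = 0.0, 1.0
    bits = []
    while x > L and len(bits) < max_bits:
        mid = (L + R) / 2
        if x < mid:          # x lies in the lower half: next bit is 0
            bits.append(0)
            R = mid
        else:                # x lies in the upper half: next bit is 1
            bits.append(1)
            L = mid
        # Invariant: L <= x < R
    return bits

print(to_binary_bits(0.625))   # [1, 0, 1], i.e. 5/8 = .101
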
Basic Ideas
Represent each string x of length n by a unique interval [L, R) in [0, 1).
The width R - L of the interval represents the probability of x occurring.
The interval can itself be represented by any number within it, called a tag.
The k significant bits of the tag .t1t2t3... form the code of x.
That is, .t1t2t3...tk000... is in the interval [L, R).
Example
1. The tag must be in the half-open interval [L, R).
2. The tag can be chosen to be (L+R)/2.
3. The code is the significant bits of the tag.
Better Tag
Example of Codes
P(a) = 1/3, P(b) = 2/3
Code Generation from Tag
If the binary tag is .t1t2t3... = (L+R)/2 in [L, R), we want to choose k to form the code t1t2...tk.
Short code: choose k as small as possible so that L ≤ .t1t2...tk000... < R.
Guaranteed code: choose k = ⌈log2(1/(R-L))⌉ + 1; then L ≤ .t1t2...tkb1b2b3... < R for any bits b1b2b3...
For fixed-length strings the guaranteed code provides a good prefix code.
Example: [.000000000..., .000010010...), tag = .000001001...
Short code: 0
Guaranteed code: 000001
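A sketch of both rules in Python, using exact fractions; the helper names tag_bits, short_code, and guaranteed_code are illustrative. It reproduces the example above:

from fractions import Fraction

def tag_bits(t, k):
    # First k binary digits of a fraction t in [0, 1)
    bits = ""
    for _ in range(k):
        t *= 2
        if t >= 1:
            bits += "1"
            t -= 1
        else:
            bits += "0"
    return bits

def guaranteed_code(L, R):
    # k = ceil(log2(1/(R-L))) + 1 bits of the midpoint tag
    t = (L + R) / 2
    m = 0
    while 2 ** m < 1 / (R - L):   # m = ceil(log2(1/(R-L)))
        m += 1
    return tag_bits(t, m + 1)

def short_code(L, R):
    # Fewest tag bits t1...tk with L <= .t1...tk000... < R
    t = (L + R) / 2
    k = 1
    while True:
        prefix = tag_bits(t, k)
        if L <= Fraction(int(prefix, 2), 2 ** k) < R:
            return prefix
        k += 1

L, R = Fraction(0), Fraction(9, 256)     # [.000000000..., .000010010...)
print(short_code(L, R))                  # 0
print(guaranteed_code(L, R))             # 000001
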
Guaranteed Code Example
P(a) = 1/3, P(b) = 2/3
Guaranteed code -> prefix code
Coding Algorithm
Given P(a1), P(a2), ..., P(am), let C(ai) = P(a1) + P(a2) + ... + P(ai-1).
To encode x1x2...xn:
Initialize L := 0; R := 1
for i = 1 to n do
    W := R - L
    L := L + W * C(xi)
    R := L + W * P(xi)
end
t := (L+R)/2; choose the code for the tag
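A minimal Python sketch of this loop, using exact fractions so the interval arithmetic is exact; arith_encode and the table format are illustrative choices:

from fractions import Fraction

def arith_encode(s, P, C):
    # Return the final interval [L, R) and the midpoint tag for string s.
    # P maps each symbol to its probability, C to its cumulative probability.
    L, R = Fraction(0), Fraction(1)
    for x in s:
        W = R - L
        L = L + W * C[x]          # move to the sub-interval for x
        R = L + W * P[x]          # the width shrinks by a factor P[x]
    return L, R, (L + R) / 2      # tag t = (L+R)/2

P = {'a': Fraction(1, 4), 'b': Fraction(1, 2), 'c': Fraction(1, 4)}
C = {'a': Fraction(0), 'b': Fraction(1, 4), 'c': Fraction(3, 4)}
print(arith_encode("abca", P, C))   # (Fraction(5, 32), Fraction(21, 128), Fraction(41, 256))

The code itself is then taken from the tag by the short-code or guaranteed-code rule sketched earlier.
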
Coding Example
P(a) = 1/4, P(b) = 1/2, P(c) = 1/4
C(a) = 0, C(b) = 1/4, C(c) = 3/4
Encode abca.
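A worked trace of this example, recomputed from the algorithm above rather than copied from the slide figure:
Start with [L, R) = [0, 1).
After a: [0, 1/4). After b: [1/16, 3/16). After c: [5/32, 3/16). After the final a: [5/32, 21/128).
The width 21/128 - 5/32 = 1/128 equals P(a)P(b)P(c)P(a).
Tag t = (L+R)/2 = 41/256 = .00101001
Short code: 00101; guaranteed code (k = 7 + 1 = 8): 00101001
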
Coding Exercise
P(a) = 1/4, P(b) = 1/2, P(c) = 1/4
C(a) = 0, C(b) = 1/4, C(c) = 3/4
Encode bbbb.
Decoding (1/3 - 3/3)
Assume the length is known to be 3.
The code 0001 converts to the tag .0001000...
Decoding Algorithm
Given P(a1), P(a2), ..., P(am), let C(ai) = P(a1) + P(a2) + ... + P(ai-1).
To decode b1b2...bm when the number of symbols is known to be n:
Initialize L := 0; R := 1
t := .b1b2...bm000...
for i = 1 to n do
    W := R - L
    find j such that L + W * C(aj) ≤ t < L + W * (C(aj) + P(aj))
    output aj
    L := L + W * C(aj)
    R := L + W * P(aj)
end
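A Python sketch of the decoder, matching the encoder sketch above; arith_decode is an illustrative name:

from fractions import Fraction

def arith_decode(bits, n, P, C):
    # Decode n symbols from the code bit-string bits (e.g. "00101").
    # P and C are the same probability tables used by the encoder.
    t = Fraction(int(bits, 2), 2 ** len(bits))   # t = .b1b2...bm000...
    L, R = Fraction(0), Fraction(1)
    out = []
    for _ in range(n):
        W = R - L
        for a in P:                    # find the symbol whose sub-interval contains t
            lo = L + W * C[a]
            hi = lo + W * P[a]
            if lo <= t < hi:
                out.append(a)
                L, R = lo, hi
                break
    return "".join(out)

P = {'a': Fraction(1, 4), 'b': Fraction(1, 2), 'c': Fraction(1, 4)}
C = {'a': Fraction(0), 'b': Fraction(1, 4), 'c': Fraction(3, 4)}
print(arith_decode("00101", 4, P, C))   # abca
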
Decoding Example
P(a) = 1/4, P(b) = 1/2, P(c) = 1/4
C(a) = 0, C(b) = 1/4, C(c) = 3/4
Decode 00101.
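A worked trace, taking the number of symbols to be n = 4 (an assumption here, chosen to match the coding example): the tag is .00101000... = 5/32. It lies in [0, 1/4) -> a; then in [1/16, 3/16) -> b; then in [5/32, 6/32) -> c; then in [5/32, 21/128) -> a. The decoded string is abca, the same string encoded in the coding example.
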
Decoding Issues
There are two ways for the decoder to know when to stop decoding:
1. Transmit the length of the string.
2. Transmit a unique end-of-string symbol.
Practical Arithmetic Coding
Scaling: by rescaling, L and R can be kept in a reasonable range of values so that W = R - L does not underflow.
The code can then be produced progressively, not only at the end, which complicates decoding somewhat.
Integer arithmetic coding avoids floating point altogether.
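A sketch of the rescaling idea, assuming the standard E1/E2/E3 rules (emit a bit and double the interval whenever it lies entirely in one half of [0, 1); defer a pending bit when it straddles the midpoint). This is one common way to realize the scaling described above, not necessarily the slides' implementation:

def arith_encode_scaled(s, P, C):
    # Encoder that emits bits progressively and rescales [L, R)
    # so that W = R - L never underflows (floating point for clarity).
    L, R = 0.0, 1.0
    bits, pending = [], 0                # pending: deferred bits from E3 rescalings

    def emit(b):
        nonlocal pending
        bits.append(b)
        bits.extend([1 - b] * pending)   # flush deferred opposite bits
        pending = 0

    for x in s:
        W = R - L
        L = L + W * C[x]
        R = L + W * P[x]
        while True:
            if R <= 0.5:                     # E1: interval in [0, 0.5), leading bit is 0
                emit(0); L, R = 2 * L, 2 * R
            elif L >= 0.5:                   # E2: interval in [0.5, 1), leading bit is 1
                emit(1); L, R = 2 * L - 1, 2 * R - 1
            elif 0.25 <= L and R <= 0.75:    # E3: straddles 0.5, bit not yet known
                pending += 1; L, R = 2 * L - 0.5, 2 * R - 0.5
            else:
                break
    pending += 1                             # finish: two more bits pin a tag in [L, R)
    emit(0 if L < 0.25 else 1)
    return bits

P = {'a': 0.25, 'b': 0.5, 'c': 0.25}
C = {'a': 0.0, 'b': 0.25, 'c': 0.75}
print(arith_encode_scaled("abca", P, C))   # [0, 0, 1, 0, 1, 0, 0, 0, 1]

Note that the leading bits of the code appear as soon as the interval commits to one half of [0, 1), rather than only after the whole string has been processed.
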
Adaptation
Simple solution: the equally probable model.
Initially all symbols have frequency 1.
After symbol x is coded, increment its frequency by 1.
Use the new model for coding the next symbol.
Example in the alphabet a, b, c, d.
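A minimal sketch of this adaptive model for the alphabet a, b, c, d (the class and method names are illustrative):

from fractions import Fraction

class AdaptiveModel:
    # Start with every symbol at frequency 1; after a symbol is coded,
    # bump its count so the next symbol is coded with the updated model.
    def __init__(self, alphabet):
        self.freq = {a: 1 for a in alphabet}

    def probabilities(self):
        total = sum(self.freq.values())
        return {a: Fraction(f, total) for a, f in self.freq.items()}

    def update(self, symbol):
        self.freq[symbol] += 1

model = AdaptiveModel("abcd")
print(model.probabilities()['a'])   # 1/4 before any symbol is seen
model.update('a')                   # an 'a' has just been coded
print(model.probabilities()['a'])   # 2/5 for the next symbol
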
Zero Frequency Problem
How do we weight symbols that have not occurred yet?
Equal weight? Not so good when there are many symbols.
An escape symbol? But what should its weight be?
When a new symbol is encountered, send <esc>, followed by the symbol coded in the equally probable model (both encoded arithmetically).
End of File Problem
Similar to the zero frequency problem.
Reasonable solution: add EOF to the post-ESC equally probable model.
When done compressing, first send ESC, then send EOF.
What is the cost of this approach?
Arithmetic vs. Huffman
Compression: both compress very well.
m-symbol grouping: Huffman is within 1/m of entropy; arithmetic is within 2/m of entropy.
Context: Huffman needs a tree for every context; arithmetic needs only a small table of frequencies for every context.
Adaptation: Huffman requires an elaborate adaptive algorithm; arithmetic has a simple adaptive mechanism.
