WO2008108557A1 - Entropy encoding and decoding apparatus and method based on tree structure - Google Patents

Entropy encoding and decoding apparatus and method based on tree structure Download PDF

Info

Publication number
WO2008108557A1
Authority
WO
WIPO (PCT)
Prior art keywords
codeword
symbols
grouped
detected
tree structure
Prior art date
Application number
PCT/KR2008/001187
Other languages
French (fr)
Inventor
Jung-Hoe Kim
Anton Porov
Eun-Mi Oh
Original Assignee
Samsung Electronics Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Samsung Electronics Co., Ltd. filed Critical Samsung Electronics Co., Ltd.
Priority to EP08723225A priority Critical patent/EP2127385A4/en
Priority to JP2009552581A priority patent/JP4865872B2/en
Publication of WO2008108557A1 publication Critical patent/WO2008108557A1/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/90Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using coding techniques not provided for in groups H04N19/10-H04N19/85, e.g. fractals
    • H04N19/96Tree coding, e.g. quad-tree coding
    • HELECTRICITY
    • H03ELECTRONIC CIRCUITRY
    • H03MCODING; DECODING; CODE CONVERSION IN GENERAL
    • H03M7/00Conversion of a code where information is represented by a given sequence or number of digits to a code where the same, similar or subset of information is represented by a different sequence or number of digits
    • H03M7/30Compression; Expansion; Suppression of unnecessary data, e.g. redundancy reduction
    • H03M7/40Conversion to or from variable length codes, e.g. Shannon-Fano code, Huffman code, Morse code
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/102Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
    • H04N19/13Adaptive entropy coding, e.g. adaptive variable length coding [AVLC] or context adaptive binary arithmetic coding [CABAC]

Definitions

  • the present general inventive concept relates to encoding and decoding of an audio signal or a video signal, and more particularly, to entropy encoding and decoding.
  • an input signal is processed in a predetermined method, quantized, and entropy-encoded.
  • a bitstream generated by an encoder is processed in a predetermined method, entropy-decoded, and dequantized.
  • the present general inventive concept provides an apparatus and method to perform entropy encoding and decoding based on a tree structure.
  • an entropy encoding apparatus including a codeword detector to detect a codeword corresponding to a specific symbol from among stored codewords, each stored codeword corresponding to predetermined grouped symbols, a probability detector to detect a probability value corresponding to each code of the detected codeword based on a tree structure in which an existence probability of each code is assigned to each node, a difference calculator to calculate a difference between a representative value indicated by the detected codeword and a value indicated by the specific symbol and to calculate a probability of detecting the specific symbol from among symbols contained in the detected codeword, and an arithmetic encoder to arithmetically encode the detected codeword and the calculated difference using the calculated probabilities.
  • an entropy encoding apparatus including a codeword detector to detect a codeword corresponding to a specific symbol from among stored codewords, each stored codeword corresponding to predetermined grouped symbols, a selector to select a predetermined tree structure from among a plurality of tree structures in which an existence probability of each code is differently assigned to each node, based on a context of previous symbols, a probability detector to detect a probability value corresponding to each code of the detected codeword based on the selected tree structure, a difference calculator to calculate a difference between a representative value indicated by the detected codeword and a value indicated by the specific symbol and to calculate a probability of detecting the specific symbol from among symbols contained in the detected codeword, and an arithmetic encoder to arithmetically encode the detected codeword and the calculated difference using the calculated probabilities.
  • an entropy encoding apparatus including a codeword detector to detect a codeword corresponding to a specific symbol from among stored codewords, each stored codeword corresponding to predetermined grouped symbols, a selector to select a predetermined tree structure from among a plurality of different tree structures in which an existence probability of each code is assigned to each node, based on a context of previous symbols, a probability detector to detect a probability value corresponding to each code of the detected codeword based on the selected tree structure, a difference calculator to calculate a difference between a representative value indicated by the detected codeword and a value indicated by the specific symbol and to calculate a probability of detecting the specific symbol from among symbols contained in the detected codeword, and an arithmetic encoder to arithmetically encode the detected codeword and the calculated difference using the calculated probabilities.
  • an entropy decoding apparatus including a codeword detector to detect a codeword by performing arithmetic-decoding based on a tree structure in which an existence probability of each code is assigned to each node, a difference detector to detect a difference between a representative value indicated by the detected codeword and a value indicated by a symbol specified in an encoding end by performing arithmetic-decoding, and a symbol detector to detect the specified symbol from among symbols grouped corresponding to the detected codeword using the detected difference.
  • an entropy decoding apparatus including a tree structure determiner to determine a tree structure used in an encoding end from among a plurality of tree structures in which an existence probability of each code is differently assigned to each node, a codeword detector to detect a codeword by arithmetic-decoding a bitstream based on the determined tree structure, a difference detector to detect a difference between a representative value indicated by the detected codeword and a value indicated by a symbol specified in an encoding end by performing arithmetic-decoding, and a symbol detector to detect the specified symbol from among symbols grouped corresponding to the detected codeword using the detected difference.
  • an entropy decoding apparatus including a tree structure determiner to determine a tree structure used in an encoding end from among a plurality of different tree structures in which an existence probability of each code is assigned to each node, a codeword detector to detect a codeword by performing arithmetic-decoding based on the determined tree structure, a difference detector to detect a difference between a representative value indicated by the detected codeword and a value indicated by a symbol specified in an encoding end by performing arithmetic-decoding, and a symbol detector to detect the specified symbol from among symbols grouped corresponding to the detected codeword using the detected difference.
  • an entropy encoding method including detecting a codeword corresponding to a specific symbol from among stored codewords, each stored codeword corresponding to predetermined grouped symbols, detecting a probability value corresponding to each code of the detected codeword based on a tree structure in which an existence probability of each code is assigned to each node, calculating a difference between a representative value indicated by the detected codeword and a value indicated by the specific symbol and calculating a probability of detecting the specific symbol from among symbols contained in the detected codeword, and arithmetically encoding the detected codeword and the calculated difference using the calculated probabilities.
  • an entropy encoding method including detecting a codeword corresponding to a specific symbol from among stored codewords, each stored codeword corresponding to predetermined grouped symbols, selecting a predetermined tree structure from among a plurality of tree structures in which an existence probability of each code is differently assigned to each node, based on a context of previous symbols, detecting a probability value corresponding to each code of the detected codeword based on the selected tree structure, calculating a difference between a representative value indicated by the detected codeword and a value indicated by the specific symbol and calculating a probability of detecting the specific symbol from among symbols contained in the detected codeword, and arithmetically encoding the detected codeword and the calculated difference using the calculated probabilities.
  • an entropy encoding method including detecting a codeword corresponding to a specific symbol from among stored codewords, each stored codeword corresponding to predetermined grouped symbols, selecting a predetermined tree structure from among a plurality of different tree structures in which an existence probability of each code is assigned to each node, based on a context of previous symbols, detecting a probability value corresponding to each code of the detected codeword based on the selected tree structure, calculating a difference between a representative value indicated by the detected codeword and a value indicated by the specific symbol and calculating a probability of detecting the specific symbol from among symbols contained in the detected codeword, and arithmetically encoding the detected codeword and the calculated difference using the calculated probabilities.
  • an entropy decoding method including detecting a codeword by performing arithmetic-decoding based on a tree structure in which an existence probability of each code is assigned to each node, detecting a difference between a representative value indicated by the detected codeword and a value indicated by a symbol specified in an encoding end by performing arithmetic-decoding, and detecting the specified symbol from among symbols grouped corresponding to the detected codeword using the detected difference.
  • an entropy decoding method including determining a tree structure used in an encoding end from among a plurality of tree structures in which an existence probability of each code is differently assigned to each node, detecting a codeword by arithmetic-decoding a bitstream based on the determined tree structure, detecting a difference between a representative value indicated by the detected codeword and a value indicated by a symbol specified in an encoding end by performing arithmetic-decoding, and detecting the specified symbol from among symbols grouped corresponding to the detected codeword using the detected difference.
  • an entropy decoding method including determining a tree structure used in an encoding end from among a plurality of different tree structures in which an existence probability of each code is assigned to each node, detecting a codeword by arithmetic-decoding a bitstream based on the determined tree structure, detecting a difference between a representative value indicated by the detected codeword and a value indicated by a symbol specified in an encoding end by performing arithmetic-decoding, and detecting the specified symbol from among symbols grouped corresponding to the detected codeword using the detected difference.
  • a computer-readable recording medium having embodied thereon a computer program to execute a method, wherein the method includes detecting a codeword corresponding to a specific symbol from among stored codewords, each stored codeword corresponding to predetermined grouped symbols, detecting a probability value corresponding to each code of the detected codeword based on a tree structure in which an existence probability of each code is assigned to each node, calculating a difference between a representative value indicated by the detected codeword and a value indicated by the specific symbol and calculating a probability of detecting the specific symbol from among symbols contained in the detected codeword, and arithmetically encoding the detected codeword and the calculated difference using the calculated probabilities.
  • a computer-readable recording medium having embodied thereon a computer program to execute a method, wherein the method includes detecting a codeword corresponding to a specific symbol from among stored codewords, each stored codeword corresponding to predetermined grouped symbols, selecting a predetermined tree structure from among a plurality of tree structures in which an existence probability of each code is differently assigned to each node, based on a context of previous symbols, detecting a probability value corresponding to each code of the detected codeword based on the selected tree structure, calculating a difference between a representative value indicated by the detected codeword and a value indicated by the specific symbol and calculating a probability of detecting the specific symbol from among symbols contained in the detected codeword, and arithmetically encoding the detected codeword and the calculated difference using the calculated probabilities.
  • a computer-readable recording medium having embodied thereon a computer program to execute a method, wherein the method includes detecting a codeword corresponding to a specific symbol from among stored codewords, each stored codeword corresponding to predetermined grouped symbols, selecting a predetermined tree structure from among a plurality of different tree structures in which an existence probability of each code is assigned to each node, based on a context of previous symbols, detecting a probability value corresponding to each code of the detected codeword based on the selected tree structure, calculating a difference between a representative value indicated by the detected codeword and a value indicated by the specific symbol and calculating a probability of detecting the specific symbol from among symbols contained in the detected codeword, and arithmetically encoding the detected codeword and the calculated difference using the calculated probabilities.
  • a computer-readable recording medium having embodied thereon a computer program to execute a method, wherein the method includes detecting a codeword by performing arithmetic-decoding based on a tree structure in which an existence probability of each code is assigned to each node, detecting a difference between a representative value indicated by the detected codeword and a value indicated by a symbol specified in an encoding end by performing arithmetic-decoding, and detecting the specified symbol from among symbols grouped corresponding to the detected codeword using the detected difference.
  • a computer-readable recording medium having embodied thereon a computer program to execute a method, wherein the method includes determining a tree structure used in an encoding end from among a plurality of tree structures in which an existence probability of each code is differently assigned to each node, detecting a codeword by arithmetic-decoding a bitstream based on the determined tree structure, detecting a difference between a representative value indicated by the detected codeword and a value indicated by a symbol specified in an encoding end by performing arithmetic-decoding, and detecting the specified symbol from among symbols grouped corresponding to the detected codeword using the detected difference.
  • a computer-readable recording medium having embodied thereon a computer program to execute a method, wherein the method includes determining a tree structure used in an encoding end from among a plurality of different tree structures in which an existence probability of each code is assigned to each node, detecting a codeword by arithmetic-decoding a bitstream based on the determined tree structure, detecting a difference between a representative value indicated by the detected codeword and a value indicated by a symbol specified in an encoding end by performing arithmetic-decoding, and detecting the specified symbol from among symbols grouped corresponding to the detected codeword using the detected difference.
  • an entropy encoding apparatus including a codeword detector to detect a codeword corresponding to a respective symbol from one or more stored codewords, each stored codeword corresponding to predetermined grouped symbols, and a probability detector to detect a probability value corresponding to each code of the detected codeword based on at least one predetermined tree structure in which an existence probability of each code is assigned to each node.
  • an entropy decoding apparatus including a tree structure determiner to determine a tree structure used to encode a signal from one or more tree structures in which an existence probability of each code is differently assigned to each node, and a codeword detector to detect a codeword by arithmetic-decoding a bitstream based on the determined tree structure.
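The entropy decoding flow summarized above (determine the tree structure, arithmetic-decode the codes of a codeword, then recover the symbol) can be sketched as follows. This is a minimal illustration, not the disclosed implementation: the node-probability table encodes a hypothetical tree (probability of code '0' per node, with code '1' as the complement), and all names are assumptions.

```python
# Hypothetical per-node probabilities of extracting code '0', keyed by the
# path of codes already decoded; the probability of code '1' is the
# complement. Values chosen to match the FIG. 6 example.
P_ZERO = {'': 0.6, '1': 0.5, '11': 0.5}

def arithmetic_decode(x, p_zero_by_path, n):
    """Recover n codes from the arithmetic-encoded value x by re-tracing
    the interval subdivision performed at the encoding end."""
    low, high, path = 0.0, 1.0, ''
    for _ in range(n):
        # Split the current interval according to the node's '0'-probability.
        split = low + (high - low) * p_zero_by_path[path]
        if x < split:
            bit, high = '0', split  # upper branch: code '0'
        else:
            bit, low = '1', split   # lower branch: code '1'
        path += bit
    return path
```

With these illustrative probabilities, an encoded value such as 0.85 traces back to the codeword '110'; the decoded difference would then be added to the codeword's representative value to detect the specified symbol.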
  • FIG. 1 is a block diagram illustrating an entropy encoding apparatus based on a tree structure according to an embodiment of the present general inventive concept;
  • FIG. 2 is a block diagram illustrating an entropy encoding apparatus based on a tree structure according to another embodiment of the present general inventive concept;
  • FIG. 3 is a block diagram illustrating an entropy encoding apparatus based on a tree structure according to another embodiment of the present general inventive concept;
  • FIG. 4 illustrates that symbols are grouped with a uniform spacing;
  • FIG. 5 illustrates that symbols are grouped with a non-uniform spacing;
  • FIG. 6 illustrates a conceptual diagram of a tree structure;
  • FIG. 7 illustrates a conceptual diagram of a plurality of tree structures;
  • FIG. 8A illustrates a conceptual diagram of selecting a tree structure based on a context of a previous frame and a context of a current frame;
  • FIG. 8B illustrates a conceptual diagram of selecting a tree structure based on a context of a current frame;
  • FIG. 9 is a block diagram illustrating an entropy decoding apparatus based on a tree structure according to an embodiment of the present general inventive concept;
  • FIG. 10 is a block diagram illustrating an entropy decoding apparatus based on a tree structure according to another embodiment of the present general inventive concept;
  • FIG. 11 is a block diagram illustrating an entropy decoding apparatus based on a tree structure according to another embodiment of the present general inventive concept;
  • FIG. 12 is a flowchart illustrating an entropy encoding method based on a tree structure according to an embodiment of the present general inventive concept;
  • FIG. 13 is a flowchart illustrating an entropy encoding method based on a tree structure according to another embodiment of the present general inventive concept;
  • FIG. 14 is a flowchart illustrating an entropy encoding method based on a tree structure according to another embodiment of the present general inventive concept;
  • FIG. 15 is a flowchart illustrating an entropy decoding method based on a tree structure according to an embodiment of the present general inventive concept;
  • FIG. 16 is a flowchart illustrating an entropy decoding method based on a tree structure according to another embodiment of the present general inventive concept;
  • FIG. 17 is a flowchart illustrating an entropy decoding method based on a tree structure according to another embodiment of the present general inventive concept.
  • FIG. 1 is a block diagram illustrating an entropy encoding apparatus based on a tree structure according to an embodiment of the present general inventive concept.
  • the entropy encoding apparatus includes a codeword storage unit 100, a codeword detector 105, a probability detector 110, a difference calculator 120, and an arithmetic encoder 130.
  • the codeword storage unit 100 groups predetermined symbols and stores a predetermined codeword corresponding to each group.
  • the codeword storage unit 100 can group the symbols with a uniform spacing.
  • FIG. 4 illustrates symbols grouped with the uniform spacing. In FIG. 4, symbols are grouped corresponding to each codeword {2, 0, 1, 3, 15, 7, 31, 63, 23, 127, 55, 255} with the uniform spacing of 4.
  • the codeword storage unit 100 can group the symbols with a non-uniform spacing.
  • FIG. 5 illustrates symbols grouped with the non-uniform spacing.
  • symbols are grouped corresponding to each codeword {2, 0, 1, 3, 15, 7, 31, 63, 23, 127, 55, 255} with the non-uniform spacing of {1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 32, 1<<31}.
  • a length of each codeword is {2, 2, 2, 3, 5, 5, 6, 7, 6, 8, 6, 8}
  • a starting symbol of symbols contained in a symbol group of each codeword is {0, 1, 2, 3, 4, 5, 6, 7, 8, 10, 12, 44}
  • a probability assigned to each codeword is {4973, 5684, 2580, 1243, 675, 387, 236, 158, 183, 99, 162, 3}.
  • the spacing of 1<<31 represents a spacing of 2^31, so that the last group covers all remaining symbols.
  • when the codeword storage unit 100 groups symbols with the non-uniform spacing, symbols contained in a duration in which probabilities significantly vary are grouped with a narrow spacing, and symbols contained in a duration in which probabilities insignificantly vary are grouped with a wide spacing. For example, symbols of which a slope of a Probability Density Function (PDF) is within a pre-set range can be grouped together.
  • Referring to FIG. 5, while the spacing is set to 1 for each of durations 0 through 8 in which probabilities significantly vary, the spacing is set to 2 for each of durations 8 through 12 in which probabilities less significantly vary, and the spacing is set to 32 and 1<<31 for durations 12 and more in which probabilities much less significantly vary.
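As a sketch of the grouping just described, the mapping from a quantized symbol to its group's codeword can be implemented as a binary search over the groups' starting symbols. The tables copy the FIG. 5 values; the function name is illustrative, not taken from the disclosure.

```python
import bisect

# Tables from FIG. 5 (non-uniform spacing): group i begins at symbol
# STARTS[i] and is represented by CODEWORDS[i].
CODEWORDS = [2, 0, 1, 3, 15, 7, 31, 63, 23, 127, 55, 255]
STARTS = [0, 1, 2, 3, 4, 5, 6, 7, 8, 10, 12, 44]

def detect_codeword(symbol):
    """Return the codeword of the group whose symbol range contains symbol."""
    group = bisect.bisect_right(STARTS, symbol) - 1
    return CODEWORDS[group]
```

For example, the quantized symbol 10 falls in the group starting at 10 (spacing 2) and maps to the codeword 127, consistent with the FIG. 5 tables.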
  • the codeword detector 105 receives a quantized symbol via an input terminal IN, searches for a codeword corresponding to the quantized symbol in the codeword storage unit 100, and outputs the found codeword. In the example of FIG. 5, when a quantized symbol '10' is received via the input terminal IN, the codeword detector 105 detects the codeword '127' corresponding to the quantized symbol '10' from the codeword storage unit 100.
  • the probability detector 110 detects a probability of extracting each code of the codeword detected by the codeword detector 105 based on a tree structure in which an existence probability of each code is assigned to each node.
  • the tree structure has a plurality of nodes, each node having a pre-set probability of extracting code '0' and a pre-set probability of extracting code '1'. For example, as illustrated in FIG. 6, in each node, a probability of extracting code '0' is assigned in the upper direction, and a probability of extracting code '1' is assigned in the lower direction.
  • the probability detector 110 can calculate the probability of extracting each code of the detected codeword using the probability of extracting a code assigned to each node in the tree structure, as represented by Equations 1 and 2: the probability of each code is the probability assigned, at the node reached by the preceding codes, to the branch corresponding to that code.
  • For example, for the detected codeword '110', the probability of extracting the first code '1' is the probability assigned to code '1' at the root node, the probability of extracting the second code '1' existing next to the first code is calculated using Equation 3, and the probability of extracting the third code '0' existing next to the codes '11' is calculated using Equation 4. By use of the tree structure illustrated in FIG. 6, the detected probabilities of the codeword '110' are {0.4, 0.5, 0.5}.
  • the difference calculator 120 calculates a difference between a representative value indicated by the codeword detected by the codeword detector 105 and a value indicated by the quantized symbol input via the input terminal IN.
  • the representative value indicated by the codeword is a value pre-set to the codeword to be representative of symbols contained in the codeword according to a predetermined condition, such as the smallest value or a mean value of the symbols contained in the codeword.
  • the difference calculator 120 also calculates a probability of detecting the quantized symbol input via the input terminal IN from among the symbols contained in the codeword detected by the codeword detector 105.
  • if the symbols are grouped with the uniform spacing, the probability of detecting the quantized symbol is a value calculated with the length of the uniform spacing set to 1.
  • if the symbols are grouped with the non-uniform spacing, the probability of detecting the quantized symbol is a value calculated with the length of the spacing of the group to which the quantized symbol belongs set to 1.
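A minimal sketch of the difference calculator, under two assumptions the text names as options: the group's representative value is its smallest symbol, and symbols within a group are treated as equally likely (so the within-group probability is the reciprocal of the group's spacing). The tables copy FIG. 5; the function name is illustrative.

```python
# FIG. 5 tables: each group's starting symbol and spacing (symbol count).
STARTS = [0, 1, 2, 3, 4, 5, 6, 7, 8, 10, 12, 44]
SPACINGS = [1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 32, 1 << 31]

def difference_and_probability(symbol, group):
    """Difference from the representative value (here the group's smallest
    symbol) and the probability of picking this symbol within the group,
    assuming the group's symbols are equally likely."""
    diff = symbol - STARTS[group]
    prob = 1.0 / SPACINGS[group]
    return diff, prob
```

For the quantized symbol 11 in the group starting at 10 (spacing 2), the difference is 1 and the within-group probability is 0.5.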
  • the arithmetic encoder 130 generates a bitstream by arithmetic-encoding the codeword detected by the codeword detector 105 and the difference between the representative value indicated by the detected codeword and the value indicated by the quantized symbol, which was calculated by the difference calculator 120, using the probability of extracting each code of the detected codeword, which was detected by the probability detector 110, and the probability of detecting the quantized symbol in a relevant duration, which was calculated by the difference calculator 120, and outputs the generated bitstream via an output terminal OUT.
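The interval narrowing performed by an arithmetic encoder can be sketched as follows. This toy version returns a single number inside the final interval rather than a renormalized bitstream, and the probability list fed to it is a hypothetical per-node '0'-probability sequence matching the FIG. 6 example; the calculated difference would be encoded with the same interval-narrowing step using its own probability.

```python
def arithmetic_encode(bits, probs_zero):
    """Toy arithmetic encoder: narrow [low, high) once per code using each
    node's probability of code '0'; any value in the final interval
    identifies the encoded sequence."""
    low, high = 0.0, 1.0
    for bit, p0 in zip(bits, probs_zero):
        split = low + (high - low) * p0
        if bit == '0':
            high = split  # keep the sub-interval for code '0'
        else:
            low = split   # keep the sub-interval for code '1'
    return (low + high) / 2
```

Encoding '110' with per-node '0'-probabilities [0.6, 0.5, 0.5] narrows the interval to [0.8, 0.9), so the returned value is near 0.85.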
  • FIG. 2 is a block diagram illustrating an entropy encoding apparatus based on a tree structure according to another embodiment of the present general inventive concept.
  • the entropy encoding apparatus includes a codeword storage unit 200, a codeword detector 205, a probability storage unit 210, a selector 220, a probability detector 230, a difference calculator 240, and an arithmetic encoder 250.
  • the codeword storage unit 200 groups predetermined symbols and stores a predetermined codeword corresponding to each group.
  • the codeword storage unit 200 can group the symbols with a uniform spacing.
  • FIG. 4 illustrates symbols grouped with the uniform spacing. In FIG. 4, symbols are grouped corresponding to each codeword {2, 0, 1, 3, 15, 7, 31, 63, 23, 127, 55, 255} with the uniform spacing of 4.
  • the codeword storage unit 200 can group the symbols with a non-uniform spacing.
  • FIG. 5 illustrates symbols grouped with the non-uniform spacing.
  • symbols are grouped corresponding to each codeword {2, 0, 1, 3, 15, 7, 31, 63, 23, 127, 55, 255} with non-uniform spacing of {1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 32, 1<<31}.
  • a length of each codeword is {2, 2, 2, 3, 5, 5, 6, 7, 6, 8, 6, 8}
  • a starting symbol of symbols contained in a symbol group of each codeword is {0, 1, 2, 3, 4, 5, 6, 7, 8, 10, 12, 44}
  • a probability assigned to each codeword is {4973, 5684, 2580, 1243, 675, 387, 236, 158, 183, 99, 162, 3}.
  • when the codeword storage unit 200 groups symbols with the non-uniform spacing, symbols contained in a duration in which probabilities significantly vary are grouped with a narrow spacing, and symbols contained in a duration in which probabilities insignificantly vary are grouped with a wide spacing. For example, symbols of which a slope of a PDF is within a pre-set range can be grouped together.
  • the spacing is set to 1 for each of durations 0 through 8 in which probabilities significantly vary
  • the spacing is set to 2 for each of durations 8 through 12 in which probabilities less significantly vary
  • the spacing is set to 32 and 1<<31 for durations 12 and more in which probabilities much less significantly vary.
  • the codeword detector 205 receives a quantized symbol via an input terminal IN, searches for a codeword corresponding to the quantized symbol in the codeword storage unit 200, and outputs the found codeword. In the example of FIG. 5, when a quantized symbol '10' is received via the input terminal IN, the codeword detector 205 detects the codeword '127' corresponding to the quantized symbol '10' from the codeword storage unit 200.
  • the probability storage unit 210 stores a plurality of tree structures having the same structure, wherein an existence probability of each code is differently assigned to each node according to the plurality of tree structures. Accordingly, the plurality of tree structures stored in the probability storage unit 210 have the same structure in which only the existence probability of each code is differently assigned to each node.
  • the tree structure has a plurality of nodes, each node having a pre-set probability of extracting code '0' and a pre-set probability of extracting code '1'. For example, as illustrated in FIG. 7, in each node, a probability of extracting code '0' is assigned in the upper direction, and a probability of extracting code '1' is assigned in the lower direction. In FIG. 7, PDF1 and PDF2, which have the same tree structure but differently assign an existence probability of each code to each node, are stored.
  • the selector 220 selects a predetermined tree structure from among the plurality of tree structures stored in the probability storage unit 210 by analyzing a context of previously quantized symbols.
  • the predetermined tree structure can be selected by analyzing the context of quantized symbols of a previous frame and then analyzing a context of quantized symbols of a current frame. If data of the previous frame does not exist, the predetermined tree structure can be selected by analyzing only the context of the quantized symbols of the current frame.
  • FIG. 8A illustrates a conceptual diagram of selecting a tree structure by analyzing a context of a previous frame and a context of a current frame in the selector 220.
  • PC denotes a value corresponding to the context of the previous frame
  • CC denotes a value corresponding to the context of the current frame
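A sketch of the selection rule of FIGS. 8A and 8B follows. The thresholding on PC and CC is purely illustrative (the disclosure does not fix a specific mapping from context values to tree structures), as are the function name and the default threshold.

```python
def select_tree(pc, cc, threshold=4):
    """Select PDF1 or PDF2 from context values: use both the previous
    frame's context PC and the current frame's context CC (FIG. 8A), or CC
    alone when no previous-frame data exists (FIG. 8B). The threshold is a
    hypothetical stand-in for the real selection table."""
    context = cc if pc is None else (pc + cc) / 2
    return 'PDF1' if context < threshold else 'PDF2'
```

For example, a low-activity context selects the tree structure corresponding to PDF1, while a high-activity context selects the one corresponding to PDF2.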
  • the probability detector 230 detects a probability of extracting each code of the codeword detected by the codeword detector 205 using the tree structure selected by the selector 220.
  • the probability detector 230 can calculate the probability of extracting each code of the detected codeword using a probability of extracting a code assigned to each node in the selected tree structure as represented by Equations 5 and 6.
  • a probability of extracting the third code '0' existing next to the codes '11' is calculated using Equation 8, so that the probability detector 230 obtains the probabilities {0.4, 0.5, 0.5} by use of the tree structure corresponding to PDF1.
  • a probability of extracting the third code '0' existing next to the codes '11' is calculated using Equation 10, so that the probability detector 230 obtains the probabilities {0.87, 0.99, 0.30} by use of the tree structure corresponding to PDF2.
  • accordingly, the probability detector 230 outputs {0.4, 0.5, 0.5} if the selector 220 selects the tree structure corresponding to PDF1, whereas the probability detector 230 outputs {0.87, 0.99, 0.30} if the selector 220 selects the tree structure corresponding to PDF2.
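The per-code probability detection can be sketched as a walk down a binary tree whose nodes each store P('0') and P('1'). The node values below are illustrative assumptions, chosen only so that the codeword '110' reproduces the probabilities {0.4, 0.5, 0.5} quoted for PDF1; they are not the actual FIG. 7 values.

```python
# Sketch of detecting the extraction probability of each code of a
# codeword by walking a probability tree (FIGS. 6-7). Each node stores
# an existence probability for code '0' and code '1'; the probability
# of each code is read off the node visited at that step.

class Node:
    def __init__(self, p0, p1, child0=None, child1=None):
        self.p = {'0': p0, '1': p1}           # existence probability per code
        self.child = {'0': child0, '1': child1}

def code_probabilities(root, codeword):
    """Return the probability of extracting each code of `codeword`."""
    probs, node = [], root
    for code in codeword:
        probs.append(node.p[code])
        node = node.child[code]               # descend along the extracted code
    return probs

# PDF1-like toy tree: root P(1)=0.4, then P(1)=0.5, then P(0)=0.5.
leaf = Node(0.5, 0.5)
level2 = Node(0.5, 0.5, child0=leaf, child1=leaf)
root = Node(0.6, 0.4, child0=level2, child1=level2)
print(code_probabilities(root, '110'))   # [0.4, 0.5, 0.5]
```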
  • the difference calculator 240 calculates a difference between a representative value indicated by the codeword detected by the codeword detector 205 and a value indicated by the quantized symbol input via the input terminal IN.
  • the representative value indicated by the codeword is a value pre-set to the codeword to be representative of symbols contained in the codeword according to a predetermined condition, such as the smallest value or a mean value of the symbols contained in the codeword.
  • the difference calculator 240 also calculates a probability of detecting the quantized symbol input via the input terminal IN from among the symbols contained in the codeword detected by the codeword detector 205.
  • when the symbols are grouped with the uniform spacing, the probability of detecting the quantized symbol is a value calculated by setting the length of the uniform spacing to 1.
  • when the symbols are grouped with the non-uniform spacing, the probability of detecting the quantized symbol is a value calculated by setting the length of a spacing of a group to which the quantized symbol belongs to 1.
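The difference calculation can be sketched as follows. Taking the representative value to be the smallest symbol of the group (one of the options the text mentions) and treating the symbol as uniformly distributed over the group's spacing are assumptions of this sketch.

```python
# Sketch of the difference calculator: the representative value of a
# codeword is taken as the smallest symbol of its group, the difference
# is symbol - representative, and the probability of picking the symbol
# inside its group is uniform over the group's width (spacing set to 1).

def difference_and_prob(symbol, group_start, group_width):
    """Return (difference, within-group probability) for a quantized symbol."""
    diff = symbol - group_start          # offset inside the group
    prob = 1.0 / group_width             # uniform over the group's spacing
    return diff, prob

# FIG. 5 example: symbol 10 falls in the group starting at 10 with width 2.
print(difference_and_prob(10, group_start=10, group_width=2))  # (0, 0.5)
```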
  • the arithmetic encoder 250 generates a bitstream by arithmetic-encoding the codeword detected by the codeword detector 205 and the difference between the representative value indicated by the detected codeword and the value indicated by the quantized symbol, which was calculated by the difference calculator 240, using the probability of extracting each code of the detected codeword, which was detected by the probability detector 230, and the probability of detecting the quantized symbol in a relevant duration, which was calculated by the difference calculator 240, and outputs the generated bitstream via an output terminal OUT.
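The interval-narrowing idea behind the arithmetic encoder can be sketched in floating point. This is only the underlying principle: a production coder works with integer ranges and renormalisation, and the probabilities used here are illustrative, matching the PDF1-style values quoted earlier.

```python
# Floating-point sketch of the interval narrowing performed by the
# arithmetic encoder 250: each code of the detected codeword shrinks
# the unit interval according to its extraction probability.

def encode_bits(bits, p0s):
    """Narrow [0, 1) once per code; p0s[i] is P(code '0') at step i."""
    low, high = 0.0, 1.0
    for bit, p0 in zip(bits, p0s):
        split = low + (high - low) * p0   # boundary between '0' and '1'
        if bit == '0':
            high = split                  # code '0' keeps the lower part
        else:
            low = split                   # code '1' keeps the upper part
    return low, high

# Codeword '110' with per-node P(1) = {0.4, 0.5, 0.5}, i.e. P(0) = {0.6, 0.5, 0.5}.
low, high = encode_bits('110', [0.6, 0.5, 0.5])
print(low, high)   # any value in [low, high) identifies the codeword
```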
  • FIG. 3 is a block diagram illustrating an entropy encoding apparatus based on a tree structure according to another embodiment of the present general inventive concept.
  • the entropy encoding apparatus includes a codeword storage unit 300, a codeword detector 305, a tree storage unit 310, a selector 320, a probability detector 330, a difference calculator 340, and an arithmetic encoder 350.
  • the codeword storage unit 300 groups predetermined symbols and stores a predetermined codeword corresponding to each group.
  • FIG. 4 illustrates symbols grouped with a uniform spacing.
  • symbols are grouped corresponding to each codeword {2, 0, 1, 3, 15, 7, 31, 63, 23, 127, 55, 255} with the uniform spacing of 4.
  • the codeword storage unit 300 can group the symbols with a non-uniform spacing.
  • FIG. 5 illustrates symbols grouped with a non-uniform spacing.
  • symbols are grouped corresponding to each codeword {2, 0, 1, 3, 15, 7, 31, 63, 23, 127, 55, 255} with a non-uniform spacing of {1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 32, 1<<31}.
  • a length of each codeword is {2, 2, 2, 3, 5, 5, 6, 7, 6, 8, 6, 8}
  • a starting symbol of symbols contained in a symbol group of each codeword is {0, 1, 2, 3, 4, 5, 6, 7, 8, 10, 12, 44}
  • a probability assigned to each codeword is {4973, 5684, 2580, 1243, 675, 387, 236, 158, 183, 99, 162, 3}.
  • when the codeword storage unit 300 groups symbols with the non-uniform spacing, symbols contained in a duration in which probabilities significantly vary are grouped with a narrow spacing, and symbols contained in a duration in which probabilities insignificantly vary are grouped with a wide spacing. For example, symbols of which a slope of a PDF is within a pre-set range can be grouped together.
  • the spacing is set to 1 for each of durations 0 through 8, in which probabilities significantly vary
  • the spacing is set to 2 for each of durations 8 through 12, in which probabilities vary less significantly
  • the spacing is set to 32 and 1<<31 for durations 12 and more, in which probabilities vary only insignificantly.
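The grouping table can be sketched by accumulating the spacings: the running sum of the FIG. 5 spacings reproduces the starting symbols listed above. Only the helper itself is an illustration; the numbers come from the text.

```python
# Sketch of building the non-uniform grouping of FIG. 5: accumulating
# the per-group spacings yields the starting symbol of each group.

def group_starts(spacings):
    """Return the first symbol of each group given per-group spacings."""
    starts, s = [], 0
    for width in spacings:
        starts.append(s)
        s += width                      # next group begins after this one
    return starts

spacings = [1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 32, 1 << 31]
print(group_starts(spacings))  # [0, 1, 2, 3, 4, 5, 6, 7, 8, 10, 12, 44]
```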
  • the codeword detector 305 receives a quantized symbol via an input terminal IN, searches for a codeword corresponding to the quantized symbol in the codeword storage unit 300, and outputs the found codeword. In the example of FIG. 5, when a quantized symbol '10' is received via the input terminal IN, the codeword detector 305 detects the codeword '127' corresponding to the quantized symbol '10' from the codeword storage unit 300.
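This lookup can be sketched with the FIG. 5 tables quoted above: find the group whose range contains the quantized symbol and emit its codeword. The binary search over starting symbols is an implementation choice of the sketch.

```python
# Sketch of the codeword detector 305 using the FIG. 5 tables.

import bisect

CODEWORDS = [2, 0, 1, 3, 15, 7, 31, 63, 23, 127, 55, 255]
STARTS    = [0, 1, 2, 3, 4, 5, 6, 7, 8, 10, 12, 44]   # first symbol per group

def detect_codeword(symbol):
    """Return the codeword of the group containing `symbol`."""
    i = bisect.bisect_right(STARTS, symbol) - 1   # last group start <= symbol
    return CODEWORDS[i]

print(detect_codeword(10))   # 127, as in the example above
```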
  • the tree storage unit 310 stores a plurality of different tree structures in which an existence probability of each code is assigned to each node. While the plurality of tree structures stored in the probability storage unit 210 of FIG. 2 have the same structure in which only the existence probability of each code is differently assigned to each node, the plurality of tree structures stored in the tree storage unit 310 are different from each other in their structure.
  • the selector 320 selects a predetermined tree structure from among the plurality of tree structures stored in the tree storage unit 310 by analyzing a context of previously quantized symbols.
  • the predetermined tree structure can be selected by analyzing the context of quantized symbols of a previous frame and then analyzing a context of quantized symbols of a current frame. If data of the previous frame does not exist, the predetermined tree structure can be selected by analyzing only the context of the quantized symbols of the current frame.
  • FIG. 8A is a conceptual diagram illustrating the selection of a tree structure by analyzing a context of a previous frame and a context of a current frame in the selector 320.
  • PC denotes a value corresponding to the context of the previous frame
  • CC denotes a value corresponding to the context of the current frame
  • the probability detector 330 detects a probability of extracting each code of the codeword detected by the codeword detector 305 using the tree structure selected by the selector 320.
  • the probability detector 330 can calculate the probability of extracting each code of the detected codeword using a probability of extracting a code assigned to each node in the selected tree structure as represented by Equations 11 and 12.
  • the difference calculator 340 calculates a difference between a representative value indicated by the codeword detected by the codeword detector 305 and a value indicated by the quantized symbol input via the input terminal IN.
  • the representative value indicated by the codeword is a value pre-set to the codeword to be representative of symbols contained in the codeword according to a predetermined condition, such as the smallest value or a mean value of the symbols contained in the codeword.
  • the difference calculator 340 also calculates a probability of detecting the quantized symbol input via the input terminal IN from among the symbols contained in the codeword detected by the codeword detector 305.
  • when the symbols are grouped with the uniform spacing, the probability of detecting the quantized symbol is a value calculated by setting the length of the uniform spacing to 1.
  • when the symbols are grouped with the non-uniform spacing, the probability of detecting the quantized symbol is a value calculated by setting the length of a spacing of a group to which the quantized symbol belongs to 1.
  • the arithmetic encoder 350 generates a bitstream by arithmetic-encoding the codeword detected by the codeword detector 305 and the difference between the representative value indicated by the detected codeword and the value indicated by the quantized symbol, which was calculated by the difference calculator 340, using the probability of extracting each code of the detected codeword, which was detected by the probability detector 330, and the probability of detecting the quantized symbol in a relevant duration, which was calculated by the difference calculator 340, and outputs the generated bitstream via an output terminal OUT.
  • FIG. 9 is a block diagram illustrating an entropy decoding apparatus based on a tree structure according to an embodiment of the present general inventive concept.
  • the entropy decoding apparatus includes an arithmetic decoder 900, a codeword detector 910, a difference detector 920, and a symbol detector 930.
  • the arithmetic decoder 900 receives a bitstream from an encoding end via an input terminal IN and performs arithmetic decoding of the bitstream.
  • the codeword detector 910 detects a codeword using a probability of detecting each code of the codeword, which is a result of the arithmetic decoding performed by the arithmetic decoder 900, based on a tree structure in which an existence probability of each code is assigned to each node.
  • the difference detector 920 detects a difference between a representative value indicated by the codeword detected by the codeword detector 910 and a value indicated by a symbol encoded in the encoding end using the result of the arithmetic decoding performed by the arithmetic decoder 900.
  • the symbol detector 930 detects a predetermined symbol from among symbols grouped corresponding to the codeword detected by the codeword detector 910 using the difference detected by the difference detector 920 and outputs the detected symbol via an output terminal OUT.
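The symbol detection can be sketched as the inverse of the encoder's grouping. Taking the representative value to be the group's starting symbol is an assumption of this sketch; the tables repeat the FIG. 5 grouping from the encoding side.

```python
# Sketch of the symbol detector 930: once the codeword and the decoded
# difference are known, the symbol is recovered as the group's
# representative value (here, its starting symbol) plus the difference.

CODEWORDS = [2, 0, 1, 3, 15, 7, 31, 63, 23, 127, 55, 255]
STARTS    = [0, 1, 2, 3, 4, 5, 6, 7, 8, 10, 12, 44]

def detect_symbol(codeword, difference):
    """Invert the encoder's grouping: representative + difference."""
    start = STARTS[CODEWORDS.index(codeword)]
    return start + difference

print(detect_symbol(127, 0))   # 10: the symbol encoded in the earlier example
```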
  • FIG. 10 is a block diagram of an entropy decoding apparatus based on a tree structure according to another embodiment of the present general inventive concept.
  • the entropy decoding apparatus includes an arithmetic decoder 1000, a probability storage unit 1005, a tree structure determiner 1010, a codeword detector 1020, a difference detector 1030, and a symbol detector 1040.
  • the arithmetic decoder 1000 receives a bitstream from an encoding end via an input terminal IN and performs arithmetic decoding of the bitstream.
  • the probability storage unit 1005 stores a plurality of tree structures having the same structure, wherein an existence probability of each code is differently assigned to each node according to the plurality of tree structures. Accordingly, the plurality of tree structures stored in the probability storage unit 1005 have the same structure in which only the existence probability of each code is differently assigned to each node.
  • the tree structure determiner 1010 determines a tree structure used in the encoding end from among the plurality of tree structures stored in the probability storage unit 1005. As an example of a method of determining a tree structure in the tree structure determiner 1010, the tree structure determiner 1010 can determine the tree structure used in the encoding end by receiving an index indicating the tree structure used in the encoding end from the encoding end.
  • the codeword detector 1020 detects a codeword corresponding to a probability of detecting each code of the codeword, which is a result of the arithmetic decoding performed by the arithmetic decoder 1000, using the tree structure determined by the tree structure determiner 1010.
  • the difference detector 1030 detects a difference between a representative value indicated by the codeword detected by the codeword detector 1020 and a value indicated by a symbol encoded in the encoding end using the result of the arithmetic decoding performed by the arithmetic decoder 1000.
  • the symbol detector 1040 detects a predetermined symbol from among symbols grouped corresponding to the codeword detected by the codeword detector 1020 using the difference detected by the difference detector 1030 and outputs the detected symbol via an output terminal OUT.
  • FIG. 11 is a block diagram of an entropy decoding apparatus based on a tree structure according to another embodiment of the present general inventive concept.
  • the entropy decoding apparatus includes an arithmetic decoder 1100, a tree storage unit 1105, a tree structure determiner 1110, a codeword detector 1120, a difference detector 1130, and a symbol detector 1140.
  • the arithmetic decoder 1100 receives a bitstream from an encoding end via an input terminal IN and performs arithmetic decoding of the bitstream.
  • the tree storage unit 1105 stores a plurality of different tree structures in which an existence probability of each code is assigned to each node. While the plurality of tree structures stored in the probability storage unit 1005 of FIG. 10 have the same structure in which only the existence probability of each code is differently assigned to each node, the plurality of tree structures stored in the tree storage unit 1105 are different from each other in their structure.
  • the tree structure determiner 1110 determines a tree structure used in the encoding end from among the plurality of tree structures stored in the tree storage unit 1105. As an example of a method of determining a tree structure in the tree structure determiner 1110, the tree structure determiner 1110 can determine the tree structure used in the encoding end by receiving an index indicating the tree structure used in the encoding end from the encoding end.
  • the codeword detector 1120 detects a codeword corresponding to a probability of detecting each code of the codeword, which is a result of the arithmetic decoding performed by the arithmetic decoder 1100, using the tree structure determined by the tree structure determiner 1110.
  • the difference detector 1130 detects a difference between a representative value indicated by the codeword detected by the codeword detector 1120 and a value indicated by a symbol encoded in the encoding end using the result of the arithmetic decoding performed by the arithmetic decoder 1100.
  • the symbol detector 1140 detects a predetermined symbol from among symbols grouped corresponding to the codeword detected by the codeword detector 1120 using the difference detected by the difference detector 1130 and outputs the detected symbol via an output terminal OUT.
  • FIG. 12 is a flowchart illustrating an entropy encoding method based on a tree structure according to an embodiment of the present general inventive concept.
  • a quantized symbol is received, a codeword corresponding to the quantized symbol is searched for, and the found codeword is output in operation 1200.
  • An encoding end groups predetermined symbols and stores predetermined codewords corresponding to respective groups.
  • When symbols are grouped, the symbols can be grouped with a uniform spacing.
  • FIG. 4 illustrates symbols grouped with a uniform spacing.
  • symbols are grouped corresponding to each codeword {2, 0, 1, 3, 15, 7, 31, 63, 23, 127, 55, 255} with the uniform spacing of 4.
  • FIG. 5 illustrates symbols grouped with a non-uniform spacing.
  • symbols are grouped corresponding to each codeword {2, 0, 1, 3, 15, 7, 31, 63, 23, 127, 55, 255} with a non-uniform spacing of {1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 32, 1<<31}.
  • a length of each codeword is {2, 2, 2, 3, 5, 5, 6, 7, 6, 8, 6, 8}
  • a starting symbol of symbols contained in a symbol group of each codeword is {0, 1, 2, 3, 4, 5, 6, 7, 8, 10, 12, 44}
  • a probability assigned to each codeword is {4973, 5684, 2580, 1243, 675, 387, 236, 158, 183, 99, 162, 3}.
  • symbols contained in a duration in which probabilities significantly vary are grouped with a narrow spacing
  • symbols contained in a duration in which probabilities insignificantly vary are grouped with a wide spacing.
  • symbols of which a slope of a PDF is within a pre-set range can be grouped together.
  • the spacing is set to 1 for each of durations 0 through 8, in which probabilities significantly vary
  • the spacing is set to 2 for each of durations 8 through 12, in which probabilities vary less significantly
  • the spacing is set to 32 and 1<<31 for durations 12 and more, in which probabilities vary only insignificantly.
  • a probability of extracting each code of the codeword detected in operation 1200 is detected based on a tree structure in which an existence probability of each code is assigned to each node.
  • the tree structure has a plurality of nodes, each node having a pre-set probability of extracting code '0' and a pre-set probability of extracting code '1'. For example, as illustrated in FIG. 6, in each node, a probability of extracting code '0' is assigned in the upper direction, and a probability of extracting code ' 1 ' is assigned in the lower direction.
  • the probability of extracting each code of the detected codeword can be calculated using a probability of extracting a code assigned to each node in the tree structure as represented by Equations 13 and 14.
  • Pi(x) denotes a probability of extracting code x in an i-th node.
  • a probability of extracting the first code '1' is P1(1) = 0.4 by use of the tree structure illustrated in FIG. 6.
  • a probability of extracting the second code '1' existing next to the first code is calculated using Equation 15.
  • a probability of extracting the third code '0' existing next to the codes '11' is calculated using Equation 16.
  • a difference between a representative value indicated by the codeword detected in operation 1200 and a value indicated by the quantized symbol input in operation 1200 is calculated.
  • the representative value indicated by the codeword is a value pre-set to the codeword to be representative of symbols contained in the codeword according to a predetermined condition, such as the smallest value or a mean value of the symbols contained in the codeword.
  • a probability of detecting the quantized symbol input in operation 1200 from among the symbols contained in the codeword detected in operation 1200 is also calculated.
  • when the symbols are grouped with the uniform spacing, the probability of detecting the quantized symbol is a value calculated by setting the length of the uniform spacing to 1.
  • when the symbols are grouped with the non-uniform spacing, the probability of detecting the quantized symbol is a value calculated by setting the length of a spacing of a group to which the quantized symbol belongs to 1.
  • a bitstream is generated by arithmetic-encoding the codeword detected in operation 1200 and the difference between the representative value indicated by the detected codeword and the value indicated by the quantized symbol, which was calculated in operation 1220, using the probability of extracting each code of the detected codeword, which was detected in operation 1210, and the probability of detecting the quantized symbol in a relevant duration, which was calculated in operation 1220.
  • FIG. 13 is a flowchart illustrating an entropy encoding method based on a tree structure according to another embodiment of the present general inventive concept.
  • a quantized symbol is received, a codeword corresponding to the quantized symbol is searched for, and the found codeword is output in operation 1300.
  • An encoding end groups predetermined symbols and stores predetermined codewords corresponding to respective groups.
  • FIG. 4 illustrates that symbols are grouped with the uniform spacing.
  • symbols are grouped corresponding to each codeword ⁇ 2, 0, 1, 3, 15, 7, 31, 63, 23, 127, 55, 255 ⁇ with the uniform spacing of 4.
  • FIG. 5 illustrates symbols grouped with a non-uniform spacing.
  • symbols are grouped corresponding to each codeword {2, 0, 1, 3, 15, 7, 31, 63, 23, 127, 55, 255} with a non-uniform spacing of {1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 32, 1<<31}.
  • a length of each codeword is {2, 2, 2, 3, 5, 5, 6, 7, 6, 8, 6, 8}
  • a starting symbol of symbols contained in a symbol group of each codeword is {0, 1, 2, 3, 4, 5, 6, 7, 8, 10, 12, 44}
  • a probability assigned to each codeword is {4973, 5684, 2580, 1243, 675, 387, 236, 158, 183, 99, 162, 3}.
  • symbols of which a slope of a PDF is within a pre-set range can be grouped together.
  • the spacing is set to 1 for each of durations 0 through 8, in which probabilities significantly vary
  • the spacing is set to 2 for each of durations 8 through 12, in which probabilities vary less significantly
  • the spacing is set to 32 and 1<<31 for durations 12 and more, in which probabilities vary only insignificantly.
  • a predetermined tree structure is selected from among a plurality of tree structures stored in the encoding end by analyzing a context of previously quantized symbols.
  • the encoding end stores the plurality of tree structures having the same structure, wherein an existence probability of each code is differently assigned to each node according to the plurality of tree structures. Accordingly, the plurality of tree structures stored in the encoding end have the same structure in which only the existence probability of each code is differently assigned to each node.
  • the tree structure has a plurality of nodes, each node having a pre-set probability of extracting code '0' and a pre-set probability of extracting code '1'. For example, as illustrated in FIG. 7, in each node, a probability of extracting code '0' is assigned in the upper direction, and a probability of extracting code '1' is assigned in the lower direction. In FIG. 7, PDF1 and PDF2, which have the same tree structure but in which an existence probability of each code is differently assigned to each node, are stored.
  • the predetermined tree structure can be selected by analyzing the context of quantized symbols of a previous frame and then analyzing a context of quantized symbols of a current frame. If data of the previous frame does not exist, the predetermined tree structure can be selected by analyzing only the context of the quantized symbols of the current frame.
  • FIG. 8A is a conceptual diagram illustrating the selection of a tree structure by analyzing a context of a previous frame and a context of a current frame in operation 1310.
  • PC denotes a value corresponding to the context of the previous frame
  • CC denotes a value corresponding to the context of the current frame
  • the predetermined tree structure is selected in operation 1310 using only CC, which is a value of the context of the current frame, as illustrated in FIG. 8B.
  • the probability of extracting each code of the detected codeword can be calculated using a probability of extracting a code assigned to each node in the selected tree structure as represented by Equations 17 and 18.
  • a probability of extracting the first code '1' is P1(1) = 0.4 by use of the tree structure corresponding to PDF1 illustrated in FIG. 7.
  • a probability of extracting the second code '1' existing next to the first code is calculated using Equation 19.
  • a probability of extracting the third code '0' existing next to the codes '11' is calculated using Equation 20.
  • a probability of extracting the first code '1' is P1(1) = 0.87 by use of the tree structure corresponding to PDF2 illustrated in FIG. 7.
  • a probability of extracting the second code '1' existing next to the first code is calculated using Equation 21.
  • a probability of extracting the third code '0' existing next to the codes '11' is calculated using Equation 22.
  • a difference between a representative value indicated by the codeword detected in operation 1300 and a value indicated by the quantized symbol input in operation 1300 is calculated.
  • the representative value indicated by the codeword is a value pre-set to the codeword to be representative of symbols contained in the codeword according to a predetermined condition, such as the smallest value or a mean value of the symbols contained in the codeword.
  • a probability of detecting the quantized symbol input in operation 1300 from among the symbols contained in the codeword detected in operation 1300 is also calculated.
  • when the symbols are grouped with the uniform spacing, the probability of detecting the quantized symbol is a value calculated by setting the length of the uniform spacing to 1.
  • when the symbols are grouped with the non-uniform spacing, the probability of detecting the quantized symbol is a value calculated by setting the length of a spacing of a group to which the quantized symbol belongs to 1.
  • a bitstream is generated by arithmetic-encoding the codeword detected in operation 1300 and the difference between the representative value indicated by the detected codeword and the value indicated by the quantized symbol, which was calculated in operation 1330, using the probability of extracting each code of the detected codeword, which was detected in operation 1320, and the probability of detecting the quantized symbol in a relevant duration, which was calculated in operation 1330.
  • FIG. 14 is a flowchart illustrating an entropy encoding method based on a tree structure according to another embodiment of the present general inventive concept.
  • a quantized symbol is received, a codeword corresponding to the quantized symbol is searched for, and the found codeword is output in operation 1400.
  • An encoding end groups predetermined symbols and stores predetermined codewords corresponding to respective groups.
  • FIG. 4 illustrates symbols grouped with a uniform spacing.
  • symbols are grouped corresponding to each codeword {2, 0, 1, 3, 15, 7, 31, 63, 23, 127, 55, 255} with the uniform spacing of 4.
  • FIG. 5 illustrates symbols grouped with a non-uniform spacing.
  • symbols are grouped corresponding to each codeword {2, 0, 1, 3, 15, 7, 31, 63, 23, 127, 55, 255} with a non-uniform spacing of {1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 32, 1<<31}.
  • a length of each codeword is {2, 2, 2, 3, 5, 5, 6, 7, 6, 8, 6, 8}
  • a starting symbol of symbols contained in a symbol group of each codeword is {0, 1, 2, 3, 4, 5, 6, 7, 8, 10, 12, 44}
  • a probability assigned to each codeword is {4973, 5684, 2580, 1243, 675, 387, 236, 158, 183, 99, 162, 3}.
  • symbols contained in a duration in which probabilities significantly vary are grouped with a narrow spacing
  • symbols contained in a duration in which probabilities insignificantly vary are grouped with a wide spacing.
  • symbols of which a slope of a PDF is within a pre-set range can be grouped together.
  • the spacing is set to 1 for each of durations 0 through 8, in which probabilities significantly vary
  • the spacing is set to 2 for each of durations 8 through 12, in which probabilities vary less significantly
  • the spacing is set to 32 and 1<<31 for durations 12 and more, in which probabilities vary only insignificantly.
  • a predetermined tree structure is selected from among a plurality of tree structures stored in the encoding end by analyzing a context of previously quantized symbols.
  • the encoding end stores the plurality of different tree structures in which an existence probability of each code is assigned to each node. While the plurality of tree structures stored in the embodiment of FIG. 13 have the same structure in which only the existence probability of each code is differently assigned to each node, the plurality of tree structures stored in the embodiment of FIG. 14 are different from each other in their structure.
  • the predetermined tree structure can be selected by analyzing the context of quantized symbols of a previous frame and then analyzing a context of quantized symbols of a current frame. If data of the previous frame does not exist, the predetermined tree structure can be selected by analyzing only the context of the quantized symbols of the current frame.
  • FIG. 8A is a conceptual diagram illustrating the selection of a tree structure by analyzing a context of a previous frame and a context of a current frame in operation 1410.
  • PC denotes a value corresponding to the context of the previous frame
  • CC denotes a value corresponding to the context of the current frame
  • the probability of extracting each code of the detected codeword can be calculated using a probability of extracting a code assigned to each node in the selected tree structure as represented by Equations 24 and 25.
  • a difference between a representative value indicated by the codeword detected in operation 1400 and a value indicated by the quantized symbol input in operation 1400 is calculated.
  • the representative value indicated by the codeword is a value pre-set to the codeword to be representative of symbols contained in the codeword according to a predetermined condition, such as the smallest value or a mean value of the symbols contained in the codeword.
  • a probability of detecting the quantized symbol input in operation 1400 from among the symbols contained in the codeword detected in operation 1400 is also calculated.
  • when the symbols are grouped with the uniform spacing, the probability of detecting the quantized symbol is a value calculated by setting the length of the uniform spacing to 1.
  • when the symbols are grouped with the non-uniform spacing, the probability of detecting the quantized symbol is a value calculated by setting the length of a spacing of a group to which the quantized symbol belongs to 1.
  • a bitstream is generated by arithmetic-encoding the codeword detected in operation 1400 and the difference between the representative value indicated by the detected codeword and the value indicated by the quantized symbol, which was calculated in operation 1430, using the probability of extracting each code of the detected codeword, which was detected in operation 1420, and the probability of detecting the quantized symbol in a relevant duration, which was calculated in operation 1430.
  • FIG. 15 is a flowchart illustrating an entropy decoding method based on a tree structure according to an embodiment of the present general inventive concept.
  • a bitstream is received from an encoding end and arithmetic-decoded in operation 1500.
  • a codeword is detected in operation 1510 using a probability of detecting each code of the codeword, which is a result of the arithmetic decoding performed in operation 1500, based on a tree structure in which an existence probability of each code is assigned to each node.
  • operation 1520 a difference between a representative value indicated by the codeword detected in operation 1510 and a value indicated by a symbol encoded in the encoding end is detected using the result of the arithmetic decoding performed in operation 1500.
  • a predetermined symbol is detected from among symbols grouped corresponding to the codeword detected in operation 1510 using the difference detected in operation 1520.
  • FIG. 16 is a flowchart illustrating an entropy decoding method based on a tree structure according to another embodiment of the present general inventive concept.
  • Referring to FIG. 16, a bitstream is received from an encoding end and arithmetic-decoded in operation 1600.
  • a tree structure used in the encoding end is determined from among a plurality of tree structures stored in a decoding end.
  • the decoding end stores a plurality of tree structures having the same structure, wherein an existence probability of each code is differently assigned to each node according to the plurality of tree structures. Accordingly, the plurality of tree structures stored in the decoding end have the same structure in which only the existence probability of each code is differently assigned to each node.
  • the tree structure used in the encoding end can be determined by receiving an index indicating the tree structure used in the encoding end from the encoding end.
  • a codeword corresponding to a probability of detecting each code of the codeword, which is a result of the arithmetic decoding performed in operation 1600, is detected using the tree structure determined in operation 1610.
  • in operation 1630, a difference between a representative value indicated by the codeword detected in operation 1620 and a value indicated by a symbol encoded in the encoding end is detected using the result of the arithmetic decoding performed in operation 1600.
  • a predetermined symbol is detected from among symbols grouped corresponding to the codeword detected in operation 1620 using the difference detected in operation 1630.
  • FIG. 17 is a flowchart illustrating an entropy decoding method based on a tree structure according to another embodiment of the present general inventive concept.
  • a bitstream is received from an encoding end and arithmetic-decoded in operation 1700.
  • a tree structure used in the encoding end is determined from among a plurality of tree structures stored in a decoding end.
  • the decoding end stores a plurality of different tree structures in which an existence probability of each code is assigned to each node. While the plurality of tree structures stored in the decoding end in the embodiment of FIG. 16 have the same structure in which only the existence probability of each code is differently assigned to each node, the plurality of tree structures stored in the decoding end in the embodiment of FIG. 17 are different from each other in their structure.
  • the tree structure used in the encoding end can be determined by receiving an index indicating the tree structure used in the encoding end from the encoding end.
  • A codeword is detected in operation 1720 by arithmetic-decoding the bitstream based on the determined tree structure.
  • In operation 1730, a difference between a representative value indicated by the codeword detected in operation 1720 and a value indicated by the symbol specified in the encoding end is detected by performing arithmetic decoding.
  • a predetermined symbol is detected from among symbols grouped corresponding to the codeword detected in operation 1720 using the difference detected in operation 1730.
  • the general inventive concept can also be embodied as computer-readable codes on a computer-readable medium.
  • the computer-readable medium can include a computer-readable recording medium and a computer-readable transmission medium.
  • the computer-readable recording medium is any data storage device that can store data which can be thereafter read by a computer system. Examples of the computer-readable recording medium include read-only memory (ROM), random-access memory (RAM), CD-ROMs, magnetic tapes, floppy disks, and optical data storage devices.
  • the computer-readable recording medium can also be distributed over network coupled computer systems so that the computer-readable code is stored and executed in a distributed fashion.
  • the computer-readable transmission medium can transmit carrier waves or signals (e.g., wired or wireless data transmission through the Internet). Also, functional programs, codes, and code segments to accomplish the present general inventive concept can be easily construed by programmers skilled in the art to which the present general inventive concept pertains.
  • entropy encoding and decoding are performed based on a tree structure. By doing this, coding efficiency is increased, complexity is reduced, and additional information can be reduced. In addition, even when the present general inventive concept is applied to scalable coding, coding efficiency can be prevented from being decreased.
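The decoding flow summarized above for FIGS. 16 and 17 can be sketched as follows. This is a minimal illustration, assuming a hypothetical `decode_symbol` helper, placeholder tree objects, and the group table of FIG. 5; the arithmetic decoder itself is abstracted away as already-decoded values, and the smallest symbol of a group is taken as its representative value.

```python
# Hypothetical sketch of the tree-based entropy decoding flow of
# FIGS. 16 and 17.  The arithmetic decoding of the bitstream is
# abstracted away: `codeword` and `difference` stand for its outputs.

# One stored tree structure per index; the encoder signals the index it used.
TREES = {0: "tree-0", 1: "tree-1"}  # placeholder tree objects

# Starting symbol of the group addressed by each codeword (FIG. 5).
GROUP_START = {2: 0, 0: 1, 1: 2, 3: 3, 15: 4, 7: 5, 31: 6,
               63: 7, 23: 8, 127: 10, 55: 12, 255: 44}

def decode_symbol(tree_index, codeword, difference):
    """Recover the symbol specified at the encoding end from the
    arithmetic-decoded codeword and difference."""
    tree = TREES[tree_index]       # determine the tree used by the encoder
    start = GROUP_START[codeword]  # representative value of the group
    return start + difference      # symbol inside the group

# Codeword 127 addresses the group starting at symbol 10.
print(decode_symbol(0, 127, 0))
```

The symbol is thus recovered purely from table lookups once the codeword and the difference have been arithmetic-decoded, which is where the complexity reduction claimed above comes from.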

Abstract

Encoding and decoding of an audio signal or a video signal. By performing entropy encoding and decoding based on a tree structure, coding efficiency is increased, complexity is reduced, and additional information can be reduced.

Description

ENTROPY ENCODING AND DECODING APPARATUS AND
METHOD BASED ON TREE STRUCTURE
Technical Field
[1] The present general inventive concept relates to encoding and decoding of an audio signal or a video signal, and more particularly, to entropy encoding and decoding. Background Art
[2] When an audio signal or a video signal is encoded, an input signal is processed in a predetermined method, quantized, and entropy-encoded. Alternatively, when an audio signal or a video signal is decoded, a bitstream generated by an encoder is processed in a predetermined method, entropy-decoded, and dequantized.
[3] If the entropy encoding and the entropy decoding are performed according to the prior art, coding efficiency is low and complexity is high. For example, in the case of Advanced Audio Coding (AAC), since additional information, such as "section_data" indicating an entropy coding output range, is required, the amount of additional data in coding increases. Also, in order to encode a signal having the size of 1024, 1024 codebooks are needed, requiring much memory space and considerable complexity. Disclosure of Invention Technical Solution
[4] The present general inventive concept provides an apparatus and method to perform entropy encoding and decoding based on a tree structure.
[5] Additional aspects and utilities of the present general inventive concept will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the general inventive concept.
[6] The foregoing and/or other aspects and utilities of the general inventive concept may be achieved by providing an entropy encoding apparatus including a codeword detector to detect a codeword corresponding to a specific symbol from among stored codewords, each stored codeword corresponding to predetermined grouped symbols, a probability detector to detect a probability value corresponding to each code of the detected codeword based on a tree structure in which an existence probability of each code is assigned to each node, a difference calculator to calculate a difference between a representative value indicated by the detected codeword and a value indicated by the specific symbol and to calculate a probability of detecting the specific symbol from among symbols contained in the detected codeword, and an arithmetic encoder to arithmetically encode the detected codeword and the calculated difference using the calculated probabilities. [7] The foregoing and/or other aspects and utilities of the general inventive concept may also be achieved by providing an entropy encoding apparatus including a codeword detector to detect a codeword corresponding to a specific symbol from among stored codewords, each stored codeword corresponding to predetermined grouped symbols, a selector to select a predetermined tree structure from among a plurality of tree structures in which an existence probability of each code is differently assigned to each node, based on a context of previous symbols, a probability detector to detect a probability value corresponding to each code of the detected codeword based on the selected tree structure, a difference calculator to calculate a difference between a representative value indicated by the detected codeword and a value indicated by the specific symbol and to calculate a probability of detecting the specific symbol from among symbols contained in the detected codeword, and an arithmetic encoder to arithmetically encode the detected 
codeword and the calculated difference using the calculated probabilities.
[8] The foregoing and/or other aspects and utilities of the general inventive concept may also be achieved by providing an entropy encoding apparatus including a codeword detector to detect a codeword corresponding to a specific symbol from among stored codewords, each stored codeword corresponding to predetermined grouped symbols, a selector to select a predetermined tree structure from among a plurality of different tree structures in which an existence probability of each code is assigned to each node, based on a context of previous symbols, a probability detector to detect a probability value corresponding to each code of the detected codeword based on the selected tree structure, a difference calculator to calculate a difference between a representative value indicated by the detected codeword and a value indicated by the specific symbol and to calculate a probability of detecting the specific symbol from among symbols contained in the detected codeword, and an arithmetic encoder to arithmetically encode the detected codeword and the calculated difference using the calculated probabilities.
[9] The foregoing and/or other aspects and utilities of the general inventive concept may also be achieved by providing an entropy decoding apparatus including a codeword detector to detect a codeword by performing arithmetic-decoding based on a tree structure in which an existence probability of each code is assigned to each node, a difference detector to detect a difference between a representative value indicated by the detected codeword and a value indicated by a symbol specified in an encoding end by performing arithmetic-decoding, and a symbol detector to detect the specified symbol from among symbols grouped corresponding to the detected codeword using the detected difference.
[10] The foregoing and/or other aspects and utilities of the general inventive concept may also be achieved by providing an entropy decoding apparatus including a tree structure determiner determining a tree structure used in an encoding end from among a plurality of tree structures in which an existence probability of each code is differently assigned to each node, a codeword detector to detect a codeword by arithmetic- decoding a bitstream based on the determined tree structure, a difference detector to detect a difference between a representative value indicated by the detected codeword and a value indicated by a symbol specified in an encoding end by performing arithmetic-decoding, and a symbol detector to detect the specified symbol from among symbols grouped corresponding to the detected codeword using the detected difference.
[11] The foregoing and/or other aspects and utilities of the general inventive concept may also be achieved by providing an entropy decoding apparatus including a tree structure determiner to determine a tree structure used in an encoding end from among a plurality of different tree structures in which an existence probability of each code is assigned to each node, a codeword detector to detect a codeword by performing arithmetic-decoding based on the determined tree structure, a difference detector to detect a difference between a representative value indicated by the detected codeword and a value indicated by a symbol specified in an encoding end by performing arithmetic-decoding, and a symbol detector to detect the specified symbol from among symbols grouped corresponding to the detected codeword using the detected difference.
[12] The foregoing and/or other aspects and utilities of the general inventive concept may also be achieved by providing an entropy encoding method including detecting a codeword corresponding to a specific symbol from among stored codewords, each stored codeword corresponding to predetermined grouped symbols, detecting a probability value corresponding to each code of the detected codeword based on a tree structure in which an existence probability of each code is assigned to each node, calculating a difference between a representative value indicated by the detected codeword and a value indicated by the specific symbol and calculating a probability of detecting the specific symbol from among symbols contained in the detected codeword, and arithmetically encoding the detected codeword and the calculated difference using the calculated probabilities.
[13] The foregoing and/or other aspects and utilities of the general inventive concept may also be achieved by providing an entropy encoding method including detecting a codeword corresponding to a specific symbol from among stored codewords, each stored codeword corresponding to predetermined grouped symbols, selecting a predetermined tree structure from among a plurality of tree structures in which an existence probability of each code is differently assigned to each node, based on a context of previous symbols, detecting a probability value corresponding to each code of the detected codeword based on the selected tree structure, calculating a difference between a representative value indicated by the detected codeword and a value indicated by the specific symbol and calculating a probability of detecting the specific symbol from among symbols contained in the detected codeword, and arithmetically encoding the detected codeword and the calculated difference using the calculated probabilities.
[14] The foregoing and/or other aspects and utilities of the general inventive concept may also be achieved by providing an entropy encoding method including detecting a codeword corresponding to a specific symbol from among stored codewords, each stored codeword corresponding to predetermined grouped symbols, selecting a predetermined tree structure from among a plurality of different tree structures in which an existence probability of each code is assigned to each node, based on a context of previous symbols, detecting a probability value corresponding to each code of the detected codeword based on the selected tree structure, calculating a difference between a representative value indicated by the detected codeword and a value indicated by the specific symbol and calculating a probability of detecting the specific symbol from among symbols contained in the detected codeword, and arithmetically encoding the detected codeword and the calculated difference using the calculated probabilities.
[15] The foregoing and/or other aspects and utilities of the general inventive concept may also be achieved by providing an entropy decoding method including detecting a codeword by performing arithmetic-decoding based on a tree structure in which an existence probability of each code is assigned to each node, detecting a difference between a representative value indicated by the detected codeword and a value indicated by a symbol specified in an encoding end by performing arithmetic- decoding, and detecting the specified symbol from among symbols grouped corresponding to the detected codeword using the detected difference.
[16] The foregoing and/or other aspects and utilities of the general inventive concept may also be achieved by providing an entropy decoding method including determining a tree structure used in an encoding end from among a plurality of tree structures in which an existence probability of each code is differently assigned to each node, detecting a codeword by arithmetic-decoding a bitstream based on the determined tree structure, detecting a difference between a representative value indicated by the detected codeword and a value indicated by a symbol specified in an encoding end by performing arithmetic-decoding, and detecting the specified symbol from among symbols grouped corresponding to the detected codeword using the detected difference.
[17] The foregoing and/or other aspects and utilities of the general inventive concept may also be achieved by providing an entropy decoding method including determining a tree structure used in an encoding end from among a plurality of different tree structures in which an existence probability of each code is assigned to each node, detecting a codeword by arithmetic-decoding a bitstream based on the determined tree structure, detecting a difference between a representative value indicated by the detected codeword and a value indicated by a symbol specified in an encoding end by performing arithmetic-decoding, and detecting the specified symbol from among symbols grouped corresponding to the detected codeword using the detected difference.
[18] The foregoing and/or other aspects and utilities of the general inventive concept may also be achieved by providing a computer-readable recording medium having embodied thereon a computer program to execute a method, wherein the method includes detecting a codeword corresponding to a specific symbol from among stored codewords, each stored codeword corresponding to predetermined grouped symbols, detecting a probability value corresponding to each code of the detected codeword based on a tree structure in which an existence probability of each code is assigned to each node, calculating a difference between a representative value indicated by the detected codeword and a value indicated by the specific symbol and calculating a probability of detecting the specific symbol from among symbols contained in the detected codeword, and arithmetically encoding the detected codeword and the calculated difference using the calculated probabilities.
[19] The foregoing and/or other aspects and utilities of the general inventive concept may also be achieved by providing a computer-readable recording medium having embodied thereon a computer program to execute a method, wherein the method includes detecting a codeword corresponding to a specific symbol from among stored codewords, each stored codeword corresponding to predetermined grouped symbols, selecting a predetermined tree structure from among a plurality of tree structures in which an existence probability of each code is differently assigned to each node, based on a context of previous symbols, detecting a probability value corresponding to each code of the detected codeword based on the selected tree structure, calculating a difference between a representative value indicated by the detected codeword and a value indicated by the specific symbol and calculating a probability of detecting the specific symbol from among symbols contained in the detected codeword, and arithmetically encoding the detected codeword and the calculated difference using the calculated probabilities.
[20] The foregoing and/or other aspects and utilities of the general inventive concept may also be achieved by providing a computer-readable recording medium having embodied thereon a computer program to execute a method, wherein the method includes detecting a codeword corresponding to a specific symbol from among stored codewords, each stored codeword corresponding to predetermined grouped symbols, selecting a predetermined tree structure from among a plurality of different tree structures in which an existence probability of each code is assigned to each node, based on a context of previous symbols, detecting a probability value corresponding to each code of the detected codeword based on the selected tree structure, calculating a difference between a representative value indicated by the detected codeword and a value indicated by the specific symbol and calculating a probability of detecting the specific symbol from among symbols contained in the detected codeword, and arithmetically encoding the detected codeword and the calculated difference using the calculated probabilities.
[21] The foregoing and/or other aspects and utilities of the general inventive concept may also be achieved by providing a computer-readable recording medium having a computer program to execute a method, wherein the method includes detecting a codeword by performing arithmetic-decoding based on a tree structure in which an existence probability of each code is assigned to each node, detecting a difference between a representative value indicated by the detected codeword and a value indicated by a symbol specified in an encoding end by performing arithmetic- decoding, and detecting the specified symbol from among symbols grouped corresponding to the detected codeword using the detected difference.
[22] The foregoing and/or other aspects and utilities of the general inventive concept may also be achieved by providing a computer-readable recording medium having embodied thereon a computer program to execute a method, wherein the method includes determining a tree structure used in an encoding end from among a plurality of tree structures in which an existence probability of each code is differently assigned to each node, detecting a codeword by arithmetic-decoding a bitstream based on the determined tree structure, detecting a difference between a representative value indicated by the detected codeword and a value indicated by a symbol specified in an encoding end by performing arithmetic-decoding, and detecting the specified symbol from among symbols grouped corresponding to the detected codeword using the detected difference.
[23] The foregoing and/or other aspects and utilities of the general inventive concept may also be achieved by providing a computer-readable recording medium having embodied thereon a computer program to execute a method, wherein the method includes determining a tree structure used in an encoding end from among a plurality of different tree structures in which an existence probability of each code is assigned to each node, detecting a codeword by arithmetic-decoding a bitstream based on the determined tree structure, detecting a difference between a representative value indicated by the detected codeword and a value indicated by a symbol specified in an encoding end by performing arithmetic-decoding, and detecting the specified symbol from among symbols grouped corresponding to the detected codeword using the detected difference.
[24] The foregoing and/or other aspects and utilities of the general inventive concept may also be achieved by providing an entropy encoding apparatus, including a codeword detector to detect a codeword corresponding to a respective symbol from one or more stored codewords, each stored codeword corresponding to predetermined grouped symbols, and a probability detector to detect a probability value corresponding to each code of the detected codeword based on at least one predetermined tree structure in which an existence probability of each code is assigned to each node.
[25] The foregoing and/or other aspects and utilities of the general inventive concept may also be achieved by providing an entropy decoding apparatus, including a tree structure determiner to determine a tree structure used to encode a signal from one or more tree structures in which an existence probability of each code is differently assigned to each node, and a codeword detector to detect a codeword by arithmetic-decoding a bitstream based on the determined tree structure.
[26] The foregoing and/or other aspects and utilities of the general inventive concept may also be achieved by providing a method of encoding a signal, the method including detecting a codeword corresponding to a respective symbol from one or more stored codewords in which each stored codeword corresponds to predetermined grouped symbols, and detecting a probability value corresponding to each code of the detected codeword based on at least one predetermined tree structure in which an existence probability of each code is assigned to each node.
[27] The foregoing and/or other aspects and utilities of the general inventive concept may also be achieved by providing a method of decoding a signal, the method including determining a tree structure used to encode a signal from one or more tree structures in which an existence probability of each code is differently assigned to each node, and detecting a codeword by arithmetic-decoding a bitstream based on the determined tree structure. Description of Drawings
[28] These and/or other aspects and utilities of the present general inventive concept will become apparent and more readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
[29] FIG. 1 is a block diagram illustrating an entropy encoding apparatus based on a tree structure according to an embodiment of the present general inventive concept;
[30] FIG. 2 is a block diagram illustrating an entropy encoding apparatus based on a tree structure according to another embodiment of the present general inventive concept;
[31] FIG. 3 is a block diagram illustrating an entropy encoding apparatus based on a tree structure according to another embodiment of the present general inventive concept;
[32] FIG. 4 illustrates that symbols are grouped with a uniform spacing;
[33] FIG. 5 illustrates that symbols are grouped with a non-uniform spacing;
[34] FIG. 6 illustrates a conceptual diagram of a tree structure;
[35] FIG. 7 illustrates a conceptual diagram of a plurality of tree structures;
[36] FIG. 8A illustrates a conceptual diagram of selecting a tree structure based on a context of a previous frame and a context of a current frame;
[37] FIG. 8B illustrates a conceptual diagram of selecting a tree structure based on a context of a current frame;
[38] FIG. 9 is a block diagram illustrating an entropy decoding apparatus based on a tree structure according to an embodiment of the present general inventive concept;
[39] FIG. 10 is a block diagram illustrating an entropy decoding apparatus based on a tree structure according to another embodiment of the present general inventive concept;
[40] FIG. 11 is a block diagram illustrating an entropy decoding apparatus based on a tree structure according to another embodiment of the present general inventive concept;
[41] FIG. 12 is a flowchart illustrating an entropy encoding method based on a tree structure according to an embodiment of the present general inventive concept;
[42] FIG. 13 is a flowchart illustrating an entropy encoding method based on a tree structure according to another embodiment of the present general inventive concept;
[43] FIG. 14 is a flowchart illustrating an entropy encoding method based on a tree structure according to another embodiment of the present general inventive concept;
[44] FIG. 15 is a flowchart illustrating an entropy decoding method based on a tree structure according to an embodiment of the present general inventive concept;
[45] FIG. 16 is a flowchart illustrating an entropy decoding method based on a tree structure according to another embodiment of the present general inventive concept; and
[46] FIG. 17 is a flowchart illustrating an entropy decoding method based on a tree structure according to another embodiment of the present general inventive concept.
Mode for Invention
[47] Hereinafter, the present general inventive concept will now be described in detail by explaining preferred embodiments of the general inventive concept with reference to the attached drawings.
[48] Reference will now be made in detail to the embodiments of the present general inventive concept, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to the like elements throughout. The embodiments are described below in order to explain the present general inventive concept by referring to the figures.
[49] FIG. 1 is a block diagram illustrating an entropy encoding apparatus based on a tree structure according to an embodiment of the present general inventive concept. Referring to FIG. 1, the entropy encoding apparatus includes a codeword storage unit 100, a codeword detector 105, a probability detector 110, a difference calculator 120, and an arithmetic encoder 130.
[50] The codeword storage unit 100 groups predetermined symbols and stores a predetermined codeword corresponding to each group.
[51] When the codeword storage unit 100 groups symbols, the codeword storage unit 100 can group the symbols with a uniform spacing. FIG. 4 illustrates symbols grouped with the uniform spacing. In FIG. 4, symbols are grouped corresponding to each codeword {2, 0, 1, 3, 15, 7, 31, 63, 23, 127, 55, 255} with the uniform spacing of 4.
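With the uniform spacing of 4 in FIG. 4, the group index and representative value of a symbol follow directly from integer division. A small sketch using the FIG. 4 codeword list; the helper name is illustrative, and the smallest symbol of each group is assumed to be its representative value:

```python
# Uniform grouping with spacing 4, as in FIG. 4: each codeword in the
# list covers four consecutive symbols.
CODEWORDS = [2, 0, 1, 3, 15, 7, 31, 63, 23, 127, 55, 255]
SPACING = 4

def codeword_for(symbol):
    """Return (codeword, representative) for a symbol, taking the
    smallest symbol of the group as its representative value."""
    group = symbol // SPACING
    return CODEWORDS[group], group * SPACING

print(codeword_for(5))   # symbol 5 falls in group 1
```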
[52] Alternatively, when the codeword storage unit 100 groups symbols, the codeword storage unit 100 can group the symbols with a non-uniform spacing. FIG. 5 illustrates symbols grouped with the non-uniform spacing. In FIG. 5, symbols are grouped corresponding to each codeword {2, 0, 1, 3, 15, 7, 31, 63, 23, 127, 55, 255} with the non-uniform spacing of {1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 32, 1<<31}. Accordingly, a length of each codeword is {2, 2, 2, 3, 5, 5, 6, 7, 6, 8, 6, 8}, a starting symbol of symbols contained in a symbol group of each codeword is {0, 1, 2, 3, 4, 5, 6, 7, 8, 10, 12, 44}, and a probability assigned to each codeword is {4973, 5684, 2580, 1243, 675, 387, 236, 158, 183, 99, 162, 3}. Here, '1<<31' denotes a left shift, that is, a spacing of 2^31, so that the last group covers all remaining symbols.
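Given the starting-symbol table of FIG. 5, the group containing a quantized symbol can be located by binary search. A small sketch using the FIG. 5 values; the helper name is illustrative:

```python
import bisect

# Non-uniform grouping of FIG. 5: each codeword covers the symbols from
# its starting symbol up to (but not including) the next starting symbol.
CODEWORDS = [2, 0, 1, 3, 15, 7, 31, 63, 23, 127, 55, 255]
STARTS    = [0, 1, 2, 3, 4, 5, 6, 7, 8, 10, 12, 44]

def codeword_for(symbol):
    """Return (codeword, group_start) for the group containing symbol."""
    g = bisect.bisect_right(STARTS, symbol) - 1
    return CODEWORDS[g], STARTS[g]

# As in the example of FIG. 5: quantized symbol 10 maps to codeword 127.
print(codeword_for(10))
```

This matches the worked example below, where the quantized symbol '10' is detected as codeword '127'.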
[53] When the codeword storage unit 100 groups symbols with the non-uniform spacing, symbols contained in a duration in which probabilities significantly vary are grouped with a narrow spacing, and symbols contained in a duration in which probabilities insignificantly vary are grouped with a wide spacing. For example, symbols of which a slope of a Probability Density Function (PDF) is within a pre-set range can be grouped together. In FIG. 5, while the spacing is set to 1 for each of durations 0 through 8 in which probabilities significantly vary, the spacing is set to 2 for each of durations 8 through 12 in which probabilities vary less significantly, and the spacing is set to 32 and 1<<31 for durations 12 and more in which probabilities vary insignificantly.
[54] The codeword detector 105 receives a quantized symbol via an input terminal IN, searches for a codeword corresponding to the quantized symbol in the codeword storage unit 100, and outputs the found codeword. As an example of FIG. 5, when a quantized symbol '10' is received via the input terminal IN, the codeword detector 105 detects a codeword '127' corresponding to the quantized symbol '10' from the codeword storage unit 100.
[55] The probability detector 110 detects a probability of extracting each code of the codeword detected by the codeword detector 105 based on a tree structure in which an existence probability of each code is assigned to each node. The tree structure has a plurality of nodes, each node having a pre-set probability of extracting code '0' and a pre-set probability of extracting code '1'. For example, as illustrated in FIG. 6, in each node, a probability of extracting code '0' is assigned in the upper direction, and a probability of extracting code '1' is assigned in the lower direction.
[56] The probability detector 110 can calculate the probability of extracting each code of the detected codeword using a probability of extracting a code assigned to each node in the tree structure as represented by Equations 1 and 2.
[57]
    P_i(0) = p_i(0) / (p_i(0) + p_i(1))    (1)
[58] Here, p_i(0) denotes a probability of extracting code '0' in an i-th node.
[59]
    P_i(1) = p_i(1) / (p_i(0) + p_i(1))    (2)
[60] Here, p_i(1) denotes a probability of extracting code '1' in an i-th node.
[61] As an example of FIG. 6, when a codeword '110' is detected by the codeword detector 105, a probability of extracting the first code '1' is detected as P_1(1) = 0.4 by use of the tree structure illustrated in FIG. 6. A probability of extracting the second code '1' existing next to the first code is calculated using Equation 3.
[62]
    P_2(1) = p_2(1) / (p_2(0) + p_2(1)) = 0.5    (3)
[63] A probability of extracting the third code '0' existing next to the codes '11' is calculated using Equation 4.
[64]
    P_3(0) = p_3(0) / (p_3(0) + p_3(1)) = 0.5    (4)
[65] Thus, when the codeword detector 105 detects the codeword '110', the probability detector 110 detects {P_1(1), P_2(1), P_3(0)} = {0.4, 0.5, 0.5}.
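One way to read Equations 1 and 2 is as a normalization of the probabilities assigned to a node's two branches. The sketch below follows that reading; the node probability pairs are assumptions chosen so that the codeword '110' reproduces the {0.4, 0.5, 0.5} example, and are not values read off FIG. 6 itself.

```python
# Sketch: probability of each code of a codeword along a tree path.
# Each entry is the (p0, p1) pair at the node visited for that code;
# these values are illustrative assumptions, not FIG. 6's actual tree.
NODE_PROBS = [(0.6, 0.4), (0.5, 0.5), (0.5, 0.5)]

def path_probabilities(codeword):
    """Normalize (p0, p1) at each visited node and pick the branch
    probability matching the corresponding code of the codeword."""
    probs = []
    for code, (p0, p1) in zip(codeword, NODE_PROBS):
        total = p0 + p1
        probs.append((p1 if code == "1" else p0) / total)
    return probs

print(path_probabilities("110"))
```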
[66] The difference calculator 120 calculates a difference between a representative value indicated by the codeword detected by the codeword detector 105 and a value indicated by the quantized symbol input via the input terminal IN. The representative value indicated by the codeword is a value pre-set to the codeword to be representative of symbols contained in the codeword according to a predetermined condition, such as the smallest value or a mean value of the symbols contained in the codeword.
[67] The difference calculator 120 also calculates a probability of detecting the quantized symbol input via the input terminal IN from among the symbols contained in the codeword detected by the codeword detector 105. When symbols are grouped with the uniform spacing as illustrated in FIG. 4, the probability of detecting the quantized symbol is a value calculated by setting the length of the uniform spacing to 1. Alternatively, when symbols are grouped with the non-uniform spacing as illustrated in FIG. 5, the probability of detecting the quantized symbol is a value calculated by setting the length of a spacing of a group to which the quantized symbol belongs to 1.
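The difference calculator's two outputs can be sketched as below, under two stated assumptions: the smallest symbol of a group serves as its representative value, and "setting the length of the spacing to 1" is read as a uniform probability of 1/width inside the group. Table values follow FIG. 5; the helper name is illustrative.

```python
# Sketch of the difference calculator: offset within the group plus the
# probability of picking one symbol out of that group.  Group starts and
# widths follow the non-uniform grouping of FIG. 5 (1<<31 read as 2**31).
STARTS = [0, 1, 2, 3, 4, 5, 6, 7, 8, 10, 12, 44]
WIDTHS = [1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 32, 2**31]

def difference_and_prob(symbol, group):
    """Difference from the group's representative value (assumed to be
    its smallest symbol) and the within-group probability, assumed
    uniform at 1/width over the group's symbols."""
    diff = symbol - STARTS[group]
    return diff, 1.0 / WIDTHS[group]

print(difference_and_prob(10, 9))   # symbol 10 lies in group 9 (start 10)
```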
[68] The arithmetic encoder 130 generates a bitstream by arithmetic-encoding the codeword detected by the codeword detector 105 and the difference between the representative value indicated by the detected codeword and the value indicated by the quantized symbol, which was calculated by the difference calculator 120, using the probability of extracting each code of the detected codeword, which was detected by the probability detector 110, and the probability of detecting the quantized symbol in a relevant duration, which was calculated by the difference calculator 120, and outputs the generated bitstream via an output terminal OUT.
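To show how the detected probabilities could drive the arithmetic encoder 130, here is a generic binary arithmetic-coding interval-narrowing sketch. It is a textbook illustration under an assumed '0'-region-first convention, not the patented encoder itself, and it omits renormalization and bit output.

```python
# Generic arithmetic-coding sketch: each code of the codeword narrows
# the current interval [low, high) according to its probability.  A real
# coder would also renormalize and emit bits; that is omitted here.
def narrow_interval(code_probs):
    """code_probs: list of (code, p1) pairs, p1 = probability of '1'.
    Returns the final interval after coding every code."""
    low, high = 0.0, 1.0
    for code, p1 in code_probs:
        split = low + (high - low) * (1.0 - p1)   # '0' region comes first
        if code == "1":
            low = split        # take the '1' region of size (high-low)*p1
        else:
            high = split       # take the '0' region
    return low, high

# Codeword '110' with probabilities {0.4, 0.5, 0.5} as in the FIG. 6 example.
print(narrow_interval([("1", 0.4), ("1", 0.5), ("0", 0.5)]))
```

Any value inside the final interval identifies the codeword; the difference from the representative value is coded the same way with its own probability.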
[69] FIG. 2 is a block diagram illustrating an entropy encoding apparatus based on a tree structure according to another embodiment of the present general inventive concept. Referring to FIG. 2, the entropy encoding apparatus includes a codeword storage unit 200, a codeword detector 205, a probability storage unit 210, a selector 220, a probability detector 230, a difference calculator 240, and an arithmetic encoder 250.
[70] The codeword storage unit 200 groups predetermined symbols and stores a predetermined codeword corresponding to each group.
[71] When the codeword storage unit 200 groups symbols, the codeword storage unit 200 can group the symbols with a uniform spacing. FIG. 4 illustrates an example in which symbols are grouped with the uniform spacing. In FIG. 4, symbols are grouped corresponding to each codeword {2, 0, 1, 3, 15, 7, 31, 63, 23, 127, 55, 255} with the uniform spacing of 4.
[72] Alternatively, when the codeword storage unit 200 groups symbols, the codeword storage unit 200 can group the symbols with a non-uniform spacing. FIG. 5 illustrates an example in which symbols are grouped with the non-uniform spacing. In FIG. 5, symbols are grouped corresponding to each codeword {2, 0, 1, 3, 15, 7, 31, 63, 23, 127, 55, 255} with the non-uniform spacing of {1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 32, 1<<31}. Accordingly, a length of each codeword is {2, 2, 2, 3, 5, 5, 6, 7, 6, 8, 6, 8}, a starting symbol of symbols contained in a symbol group of each codeword is {0, 1, 2, 3, 4, 5, 6, 7, 8, 10, 12, 44}, and a probability assigned to each codeword is {4973, 5684, 2580, 1243, 675, 387, 236, 158, 183, 99, 162, 3}.
[73] When the codeword storage unit 200 groups symbols with the non-uniform spacing, symbols contained in a duration in which probabilities significantly vary are grouped with a narrow spacing, and symbols contained in a duration in which probabilities insignificantly vary are grouped with a wide spacing. For example, symbols of which a slope of a PDF is within a pre-set range can be grouped together. In FIG. 5, the spacing is set to 1 for each of durations 0 through 8, in which probabilities significantly vary; the spacing is set to 2 for each of durations 8 through 12, in which probabilities vary less significantly; and the spacing is set to 32 and 1<<31 for durations 12 and above, in which probabilities vary still less significantly.
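As a rough sketch of the grouping tables described above (in Python; the spacing, starting-symbol, and codeword lists are transcribed from the FIG. 5 description, while the function names are illustrative and not part of the patent), the codeword lookup and the within-group probability might be implemented as:

```python
# Tables transcribed from the FIG. 5 description (non-uniform spacing).
SPACINGS = [1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 32, 1 << 31]
START_SYMBOLS = [0, 1, 2, 3, 4, 5, 6, 7, 8, 10, 12, 44]
CODEWORDS = [2, 0, 1, 3, 15, 7, 31, 63, 23, 127, 55, 255]

def find_group(symbol):
    """Return the index of the group whose duration contains `symbol`."""
    for i, start in enumerate(START_SYMBOLS):
        if start <= symbol < start + SPACINGS[i]:
            return i
    raise ValueError("symbol out of range")

def codeword_for(symbol):
    """Codeword detector: map a quantized symbol to its group's codeword."""
    return CODEWORDS[find_group(symbol)]

def within_group_probability(symbol):
    """Probability of detecting the symbol inside its group, with the
    group's spacing normalized to length 1 (uniform within the group)."""
    return 1.0 / SPACINGS[find_group(symbol)]

# The example from the text: quantized symbol 10 falls in the group
# starting at 10 with spacing 2, whose codeword is 127.
print(codeword_for(10))              # -> 127
print(within_group_probability(10))  # -> 0.5
```

The encoder then arithmetic-encodes the codeword and the offset of the symbol inside its group, using this within-group probability for the offset.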
[74] The codeword detector 205 receives a quantized symbol via an input terminal IN, searches for a codeword corresponding to the quantized symbol in the codeword storage unit 200, and outputs the found codeword. As an example of FIG. 5, when a quantized symbol '10' is received via the input terminal IN, the codeword detector 205 detects a codeword '127' corresponding to the quantized symbol '10' from the codeword storage unit 200. [75] The probability storage unit 210 stores a plurality of tree structures having the same structure, wherein an existence probability of each code is differently assigned to each node according to the plurality of tree structures. Accordingly, the plurality of tree structures stored in the probability storage unit 210 have the same structure in which only the existence probability of each code is differently assigned to each node.
[76] The tree structure has a plurality of nodes, each node having a pre-set probability of extracting code '0' and a pre-set probability of extracting code '1'. For example, as illustrated in FIG. 7, in each node, a probability of extracting code '0' is assigned in the upper direction, and a probability of extracting code '1' is assigned in the lower direction. In FIG. 7, PDF1 and PDF2 having the same tree structure in which an existence probability of each code is differently assigned to each node are stored.
[77] The selector 220 selects a predetermined tree structure from among the plurality of tree structures stored in the probability storage unit 210 by analyzing a context of previously quantized symbols.
[78] When the selector 220 analyzes the context of the previously quantized symbols, the predetermined tree structure can be selected by analyzing the context of quantized symbols of a previous frame and then analyzing a context of quantized symbols of a current frame. If data of the previous frame does not exist, the predetermined tree structure can be selected by analyzing only the context of the quantized symbols of the current frame.
[79] FIG. 8A is a conceptual diagram of selecting a tree structure by analyzing a context of a previous frame and a context of a current frame in the selector 220. In FIG. 8A, PC denotes a value corresponding to the context of the previous frame, CC denotes a value corresponding to the context of the current frame, and ID denotes an identification number of each tree structure stored in the probability storage unit 210. If the selector 220 detects that PC=0 and CC=1, the selector 220 selects a tree structure having ID '1' from among the plurality of tree structures stored in the probability storage unit 210. However, if data of the previous frame does not exist, the selector 220 selects a predetermined tree structure from the probability storage unit 210 using only CC, which is a value of the context of the current frame, as illustrated in FIG. 8B.
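The context-based selection of FIGS. 8A and 8B can be sketched as a simple table lookup (in Python; the table entries below are hypothetical, since the actual PC/CC-to-ID mapping is given only in the figures):

```python
# Hypothetical selection tables in the spirit of FIGS. 8A and 8B; the
# real (PC, CC) -> ID entries are in the figures, not in the text.
TABLE_WITH_PREVIOUS = {(0, 0): 0, (0, 1): 1, (1, 0): 2, (1, 1): 3}
TABLE_CURRENT_ONLY = {0: 0, 1: 1}

def select_tree_id(cc, pc=None):
    """Select a tree-structure ID from the context values.

    Uses the (PC, CC) table when previous-frame data exists; otherwise
    falls back to the CC-only table, as described for FIG. 8B."""
    if pc is None:  # no data for the previous frame
        return TABLE_CURRENT_ONLY[cc]
    return TABLE_WITH_PREVIOUS[(pc, cc)]

# The example from the text: PC = 0 and CC = 1 selects the tree with ID 1.
print(select_tree_id(cc=1, pc=0))  # -> 1
```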
[80] The probability detector 230 detects a probability of extracting each code of the codeword detected by the codeword detector 205 using the tree structure selected by the selector 220.
[81] The probability detector 230 can calculate the probability of extracting each code of the detected codeword using a probability of extracting a code assigned to each node in the selected tree structure as represented by Equations 5 and 6.
[82] P_i(0) = P_{i-1}(0) / (P_{i-1}(0) + P_{i-1}(1))    (5)

[83] Here, P_i(0) denotes a probability of extracting '0' in an i-th node.

[84] P_i(1) = P_{i-1}(1) / (P_{i-1}(0) + P_{i-1}(1))    (6)

[85] Here, P_i(1) denotes a probability of extracting '1' in an i-th node.
[86] As an example of FIG. 7, when a codeword '110' is detected by the codeword detector 205, and when the tree structure corresponding to PDF1 is selected by the selector 220, a probability of extracting the first code '1' is detected as P_1(1) = 0.4 by use of the tree structure illustrated in FIG. 7. A probability of extracting the second code '1' existing next to the first code is calculated using Equation 7.

[87] P_2(1) = P_1(1) / (P_1(0) + P_1(1)) = 0.2/0.4 = 0.5    (7)

[88] A probability of extracting the third code '0' existing next to the codes '11' is calculated using Equation 8.

[89] P_3(0) = P_2(0) / (P_2(0) + P_2(1)) = 0.1/0.2 = 0.5    (8)

[90] Thus, when the codeword detector 205 detects the codeword '110', the probability detector 230 detects {P_1(1), P_2(1), P_3(0)} = {0.4, 0.5, 0.5}.
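The normalization of Equations 5 and 6 can be sketched numerically (in Python; the node probabilities below are assumed values chosen so that the PDF1 result {0.4, 0.5, 0.5} for codeword '110' is reproduced, since FIG. 7 itself is not reproduced in the text):

```python
def path_probabilities(codeword_bits, node_probs):
    """For each code in the codeword, normalize the probability of the
    branch taken by the sum of the '0' and '1' probabilities assigned
    at that node, as in Equations 5 and 6."""
    result = []
    for bit, (p0, p1) in zip(codeword_bits, node_probs):
        result.append((p1 if bit == 1 else p0) / (p0 + p1))
    return result

# Node probabilities (p0, p1) assumed so that the PDF1 example of the
# text is reproduced for codeword '110'; not taken verbatim from FIG. 7.
pdf1_nodes = [(0.6, 0.4), (0.2, 0.2), (0.1, 0.1)]
print(path_probabilities([1, 1, 0], pdf1_nodes))  # -> [0.4, 0.5, 0.5]
```

Selecting PDF2 instead would mean passing a different `node_probs` list over the same tree shape, which is why the same codeword yields different probabilities.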
[91] However, when the codeword '110' is detected by the codeword detector 205, and when the tree structure corresponding to PDF2 is selected by the selector 220, a probability of extracting the first code '1' is detected as P_1(1) = 0.87 by use of the tree structure illustrated in FIG. 7. A probability of extracting the second code '1' existing next to the first code is calculated using Equation 9.

[92] P_2(1) = P_1(1) / (P_1(0) + P_1(1)) = 0.99    (9)

[93] A probability of extracting the third code '0' existing next to the codes '11' is calculated using Equation 10.

[94] P_3(0) = P_2(0) / (P_2(0) + P_2(1)) = 0.30    (10)

[95] Thus, when the codeword detector 205 detects the codeword '110', the probability detector 230 detects {P_1(1), P_2(1), P_3(0)} = {0.87, 0.99, 0.30}.
[96] Thus, even when the codeword detector 205 detects the same codeword '110' and the probability storage unit 210 stores the same tree structures, since different probabilities are assigned to each node, the probability detector 230 outputs {P_1(1), P_2(1), P_3(0)} = {0.4, 0.5, 0.5} if the selector 220 selects the tree structure corresponding to PDF1, whereas the probability detector 230 outputs {P_1(1), P_2(1), P_3(0)} = {0.87, 0.99, 0.30} if the selector 220 selects the tree structure corresponding to PDF2.
[97] The difference calculator 240 calculates a difference between a representative value indicated by the codeword detected by the codeword detector 205 and a value indicated by the quantized symbol input via the input terminal IN. The representative value indicated by the codeword is a value pre-set to the codeword to be representative of symbols contained in the codeword according to a predetermined condition, such as the smallest value or a mean value of the symbols contained in the codeword.
[98] The difference calculator 240 also calculates a probability of detecting the quantized symbol input via the input terminal IN from among the symbols contained in the codeword detected by the codeword detector 205. When symbols are grouped with the uniform spacing as illustrated in FIG. 4, the probability of detecting the quantized symbol is a value calculated by setting the length of the uniform spacing to 1. Alternatively, when symbols are grouped with the non-uniform spacing as illustrated in FIG. 5, the probability of detecting the quantized symbol is a value calculated by setting the length of a spacing of a group to which the quantized symbol belongs to 1.
[99] The arithmetic encoder 250 generates a bitstream by arithmetic-encoding the codeword detected by the codeword detector 205 and the difference between the representative value indicated by the detected codeword and the value indicated by the quantized symbol, which was calculated by the difference calculator 240, using the probability of extracting each code of the detected codeword, which was detected by the probability detector 230, and the probability of detecting the quantized symbol in a relevant duration, which was calculated by the difference calculator 240, and outputs the generated bitstream via an output terminal OUT.
[100] FIG. 3 is a block diagram illustrating an entropy encoding apparatus based on a tree structure according to another embodiment of the present general inventive concept. Referring to FIG. 3, the entropy encoding apparatus includes a codeword storage unit 300, a codeword detector 305, a tree storage unit 310, a selector 320, a probability detector 330, a difference calculator 340, and an arithmetic encoder 350.
[101] The codeword storage unit 300 groups predetermined symbols and stores a prede- termined codeword corresponding to each group.
[102] When the codeword storage unit 300 groups symbols, the codeword storage unit 300 can group the symbols with a uniform spacing. FIG. 4 illustrates an example in which symbols are grouped with the uniform spacing. In FIG. 4, symbols are grouped corresponding to each codeword {2, 0, 1, 3, 15, 7, 31, 63, 23, 127, 55, 255} with the uniform spacing of 4.
[103] Alternatively, when the codeword storage unit 300 groups symbols, the codeword storage unit 300 can group the symbols with a non-uniform spacing. FIG. 5 illustrates an example in which symbols are grouped with the non-uniform spacing. In FIG. 5, symbols are grouped corresponding to each codeword {2, 0, 1, 3, 15, 7, 31, 63, 23, 127, 55, 255} with the non-uniform spacing of {1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 32, 1<<31}. Accordingly, the length of each codeword is {2, 2, 2, 3, 5, 5, 6, 7, 6, 8, 6, 8}, a starting symbol of symbols contained in a symbol group of each codeword is {0, 1, 2, 3, 4, 5, 6, 7, 8, 10, 12, 44}, and a probability assigned to each codeword is {4973, 5684, 2580, 1243, 675, 387, 236, 158, 183, 99, 162, 3}.
[104] When the codeword storage unit 300 groups symbols with the non-uniform spacing, symbols contained in a duration in which probabilities significantly vary are grouped with a narrow spacing, and symbols contained in a duration in which probabilities insignificantly vary are grouped with a wide spacing. For example, symbols of which a slope of a PDF is within a pre-set range can be grouped together. In FIG. 5, the spacing is set to 1 for each of durations 0 through 8, in which probabilities significantly vary; the spacing is set to 2 for each of durations 8 through 12, in which probabilities vary less significantly; and the spacing is set to 32 and 1<<31 for durations 12 and above, in which probabilities vary still less significantly.
[105] The codeword detector 305 receives a quantized symbol via an input terminal IN, searches a codeword corresponding to the quantized symbol from the codeword storage unit 300, and outputs the found codeword. As an example of FIG. 5, when a quantized symbol '10' is received via the input terminal IN, the codeword detector 305 detects a codeword '127' corresponding to the quantized symbol '10' from the codeword storage unit 300.
[106] The tree storage unit 310 stores a plurality of different tree structures in which an existence probability of each code is assigned to each node. While the plurality of tree structures stored in the probability storage unit 210 of FIG. 2 have the same structure in which only the existence probability of each code is differently assigned to each node, the plurality of tree structures stored in the tree storage unit 310 are different from each other in their structure.
[107] The selector 320 selects a predetermined tree structure from among the plurality of tree structures stored in the tree storage unit 310 by analyzing a context of previously quantized symbols.
[108] When the selector 320 analyzes the context of the previously quantized symbols, the predetermined tree structure can be selected by analyzing the context of quantized symbols of a previous frame and then analyzing a context of quantized symbols of a current frame. If data of the previous frame does not exist, the predetermined tree structure can be selected by analyzing only the context of the quantized symbols of the current frame.
[109] FIG. 8A is a conceptual diagram illustrating selection of a tree structure by analyzing a context of a previous frame and a context of a current frame in the selector 320. In FIG. 8A, PC denotes a value corresponding to the context of the previous frame, CC denotes a value corresponding to the context of the current frame, and ID denotes an identification number of each tree structure stored in the tree storage unit 310. If the selector 320 detects that PC=0 and CC=1, the selector 320 selects a tree structure having ID '1' from among the plurality of tree structures stored in the tree storage unit 310. However, if data of the previous frame does not exist, the selector 320 selects a predetermined tree structure from the tree storage unit 310 using only CC, which is a value of the context of the current frame, as illustrated in FIG. 8B.
[110] The probability detector 330 detects a probability of extracting each code of the codeword detected by the codeword detector 305 using the tree structure selected by the selector 320.
[111] The probability detector 330 can calculate the probability of extracting each code of the detected codeword using a probability of extracting a code assigned to each node in the selected tree structure as represented by Equations 11 and 12.

[112] P_i(0) = P_{i-1}(0) / (P_{i-1}(0) + P_{i-1}(1))    (11)

[113] Here, P_i(0) denotes a probability of extracting '0' in an i-th node.

[114] P_i(1) = P_{i-1}(1) / (P_{i-1}(0) + P_{i-1}(1))    (12)

[115] Here, P_i(1) denotes a probability of extracting '1' in an i-th node.
[116] The difference calculator 340 calculates a difference between a representative value indicated by the codeword detected by the codeword detector 305 and a value indicated by the quantized symbol input via the input terminal IN. The representative value indicated by the codeword is a value pre-set to the codeword to be representative of symbols contained in the codeword according to a predetermined condition, such as the smallest value or a mean value of the symbols contained in the codeword.
[117] The difference calculator 340 also calculates a probability of detecting the quantized symbol input via the input terminal IN from among the symbols contained in the codeword detected by the codeword detector 305. When symbols are grouped with the uniform spacing as illustrated in FIG. 4, the probability of detecting the quantized symbol is a value calculated by setting the length of the uniform spacing to 1. Alternatively, when symbols are grouped with the non-uniform spacing as illustrated in FIG. 5, the probability of detecting the quantized symbol is a value calculated by setting the length of a spacing of a group to which the quantized symbol belongs to 1.
[118] The arithmetic encoder 350 generates a bitstream by arithmetic-encoding the codeword detected by the codeword detector 305 and the difference between the representative value indicated by the detected codeword and the value indicated by the quantized symbol, which was calculated by the difference calculator 340, using the probability of extracting each code of the detected codeword, which was detected by the probability detector 330, and the probability of detecting the quantized symbol in a relevant duration, which was calculated by the difference calculator 340, and outputs the generated bitstream via an output terminal OUT.
[119] FIG. 9 is a block diagram illustrating an entropy decoding apparatus based on a tree structure according to an embodiment of the present general inventive concept. Referring to FIG. 9, the entropy decoding apparatus includes an arithmetic decoder 900, a codeword detector 910, a difference detector 920, and a symbol detector 930.
[120] The arithmetic decoder 900 receives a bitstream from an encoding end via an input terminal IN and performs arithmetic decoding of the bitstream.
[121] The codeword detector 910 detects a codeword using a probability of detecting each code of the codeword, which is a result of the arithmetic decoding performed by the arithmetic decoder 900, based on a tree structure in which an existence probability of each code is assigned to each node.
[122] The difference detector 920 detects a difference between a representative value indicated by the codeword detected by the codeword detector 910 and a value indicated by a symbol encoded in the encoding end using the result of the arithmetic decoding performed by the arithmetic decoder 900.
[123] The symbol detector 930 detects a predetermined symbol from among symbols grouped corresponding to the codeword detected by the codeword detector 910 using the difference detected by the difference detector 920 and outputs the detected symbol via an output terminal OUT.
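A minimal sketch of this last decoding step (in Python; it assumes the representative value of each group is its smallest symbol, one of the conditions mentioned for the encoder, and reuses the starting-symbol and codeword tables from the FIG. 5 description):

```python
# Tables from the FIG. 5 description; the representative value of a
# group is assumed here to be the group's smallest (starting) symbol.
START_SYMBOLS = [0, 1, 2, 3, 4, 5, 6, 7, 8, 10, 12, 44]
CODEWORDS = [2, 0, 1, 3, 15, 7, 31, 63, 23, 127, 55, 255]

def detect_symbol(codeword, difference):
    """Symbol detector: recover the encoded symbol from the decoded
    codeword and the decoded difference to the representative value."""
    group = CODEWORDS.index(codeword)
    return START_SYMBOLS[group] + difference

# Codeword 127 (group starting at 10) with difference 0 yields symbol 10,
# undoing the encoder example in which symbol '10' mapped to codeword '127'.
print(detect_symbol(127, 0))  # -> 10
```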
[124] FIG. 10 is a block diagram of an entropy decoding apparatus based on a tree structure according to another embodiment of the present general inventive concept. Referring to FIG. 10, the entropy decoding apparatus includes an arithmetic decoder 1000, a probability storage unit 1005, a tree structure determiner 1010, a codeword detector 1020, a difference detector 1030, and a symbol detector 1040.
[125] The arithmetic decoder 1000 receives a bitstream from an encoding end via an input terminal IN and performs arithmetic decoding of the bitstream.
[126] The probability storage unit 1005 stores a plurality of tree structures having the same structure, wherein an existence probability of each code is differently assigned to each node according to the plurality of tree structures. Accordingly, the plurality of tree structures stored in the probability storage unit 1005 have the same structure in which only the existence probability of each code is differently assigned to each node.
[127] The tree structure determiner 1010 determines a tree structure used in the encoding end from among the plurality of tree structures stored in the probability storage unit 1005. As an example of a method of determining a tree structure in the tree structure determiner 1010, the tree structure determiner 1010 can determine the tree structure used in the encoding end by receiving an index indicating the tree structure used in the encoding end from the encoding end.
[128] The codeword detector 1020 detects a codeword corresponding to a probability of detecting each code of the codeword, which is a result of the arithmetic decoding performed by the arithmetic decoder 1000, using the tree structure determined by the tree structure determiner 1010.
[129] The difference detector 1030 detects a difference between a representative value indicated by the codeword detected by the codeword detector 1020 and a value indicated by a symbol encoded in the encoding end using the result of the arithmetic decoding performed by the arithmetic decoder 1000.
[130] The symbol detector 1040 detects a predetermined symbol from among symbols grouped corresponding to the codeword detected by the codeword detector 1020 using the difference detected by the difference detector 1030 and outputs the detected symbol via an output terminal OUT.
[131] FIG. 11 is a block diagram of an entropy decoding apparatus based on a tree structure according to another embodiment of the present general inventive concept. Referring to FIG. 11, the entropy decoding apparatus includes an arithmetic decoder 1100, a tree storage unit 1105, a tree structure determiner 1110, a codeword detector 1120, a difference detector 1130, and a symbol detector 1140.
[132] The arithmetic decoder 1100 receives a bitstream from an encoding end via an input terminal IN and performs arithmetic decoding of the bitstream.
[133] The tree storage unit 1105 stores a plurality of different tree structures in which an existence probability of each code is assigned to each node. While the plurality of tree structures stored in the probability storage unit 1005 of FIG. 10 have the same structure in which only the existence probability of each code is differently assigned to each node, the plurality of tree structures stored in the tree storage unit 1105 are different from each other in their structure.
[134] The tree structure determiner 1110 determines a tree structure used in the encoding end from among the plurality of tree structures stored in the tree storage unit 1105. As an example of a method of determining a tree structure in the tree structure determiner 1110, the tree structure determiner 1110 can determine the tree structure used in the encoding end by receiving an index indicating the tree structure used in the encoding end from the encoding end.
[135] The codeword detector 1120 detects a codeword corresponding to a probability of detecting each code of the codeword, which is a result of the arithmetic decoding performed by the arithmetic decoder 1100, using the tree structure determined by the tree structure determiner 1110.
[136] The difference detector 1130 detects a difference between a representative value indicated by the codeword detected by the codeword detector 1120 and a value indicated by a symbol encoded in the encoding end using the result of the arithmetic decoding performed by the arithmetic decoder 1100.
[137] The symbol detector 1140 detects a predetermined symbol from among symbols grouped corresponding to the codeword detected by the codeword detector 1120 using the difference detected by the difference detector 1130 and outputs the detected symbol via an output terminal OUT.
[138] FIG. 12 is a flowchart illustrating an entropy encoding method based on a tree structure according to an embodiment of the present general inventive concept.
[139] Referring to FIG. 12, a quantized symbol is received, a codeword corresponding to the quantized symbol is searched for, and the found codeword is output in operation 1200. An encoding end groups predetermined symbols and stores predetermined codewords corresponding to respective groups. [140] When symbols are grouped, the symbols can be grouped with a uniform spacing.
FIG. 4 illustrates an example in which symbols are grouped with the uniform spacing. In FIG. 4, symbols are grouped corresponding to each codeword {2, 0, 1, 3, 15, 7, 31, 63, 23, 127, 55, 255} with the uniform spacing of 4.
[141] Alternatively, when symbols are grouped, the symbols can be grouped with a non-uniform spacing. FIG. 5 illustrates an example in which symbols are grouped with the non-uniform spacing. In FIG. 5, symbols are grouped corresponding to each codeword {2, 0, 1, 3, 15, 7, 31, 63, 23, 127, 55, 255} with the non-uniform spacing of {1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 32, 1<<31}. Accordingly, a length of each codeword is {2, 2, 2, 3, 5, 5, 6, 7, 6, 8, 6, 8}, a starting symbol of symbols contained in a symbol group of each codeword is {0, 1, 2, 3, 4, 5, 6, 7, 8, 10, 12, 44}, and a probability assigned to each codeword is {4973, 5684, 2580, 1243, 675, 387, 236, 158, 183, 99, 162, 3}.
[142] When the symbols are grouped with the non-uniform spacing, symbols contained in a duration in which probabilities significantly vary are grouped with a narrow spacing, and symbols contained in a duration in which probabilities insignificantly vary are grouped with a wide spacing. For example, symbols of which a slope of a PDF is within a pre-set range can be grouped together. In FIG. 5, the spacing is set to 1 for each of durations 0 through 8, in which probabilities significantly vary; the spacing is set to 2 for each of durations 8 through 12, in which probabilities vary less significantly; and the spacing is set to 32 and 1<<31 for durations 12 and above, in which probabilities vary still less significantly.
[143] In operation 1210, a probability of extracting each code of the codeword detected in operation 1200 is detected based on a tree structure in which an existence probability of each code is assigned to each node. The tree structure has a plurality of nodes, each node having a pre-set probability of extracting code '0' and a pre-set probability of extracting code '1'. For example, as illustrated in FIG. 6, in each node, a probability of extracting code '0' is assigned in the upper direction, and a probability of extracting code '1' is assigned in the lower direction.
[144] In operation 1210, the probability of extracting each code of the detected codeword can be calculated using a probability of extracting a code assigned to each node in the tree structure as represented by Equations 13 and 14.
[145] P_i(0) = P_{i-1}(0) / (P_{i-1}(0) + P_{i-1}(1))    (13)

[146] Here, P_i(0) denotes a probability of extracting '0' in an i-th node.

[147] P_i(1) = P_{i-1}(1) / (P_{i-1}(0) + P_{i-1}(1))    (14)

[148] Here, P_i(1) denotes a probability of extracting '1' in an i-th node.

[149] As an example of FIG. 6, when a codeword '110' is detected in operation 1200, a probability of extracting the first code '1' is detected as P_1(1) = 0.4 by use of the tree structure illustrated in FIG. 6. A probability of extracting the second code '1' existing next to the first code is calculated using Equation 15.

[150] P_2(1) = P_1(1) / (P_1(0) + P_1(1)) = 0.2/0.4 = 0.5    (15)

[151] A probability of extracting the third code '0' existing next to the codes '11' is calculated using Equation 16.

[152] P_3(0) = P_2(0) / (P_2(0) + P_2(1)) = 0.1/0.2 = 0.5    (16)

[153] Thus, when the codeword '110' is detected in operation 1200, {P_1(1), P_2(1), P_3(0)} = {0.4, 0.5, 0.5} is detected in operation 1210.

[154] In operation 1220, a difference between a representative value indicated by the codeword detected in operation 1200 and a value indicated by the quantized symbol input in operation 1200 is calculated. The representative value indicated by the codeword is a value pre-set to the codeword to be representative of symbols contained in the codeword according to a predetermined condition, such as the smallest value or a mean value of the symbols contained in the codeword.
[155] In operation 1220, a probability of detecting the quantized symbol input in operation 1200 from among the symbols contained in the codeword detected in operation 1200 is also calculated. When symbols are grouped with the uniform spacing as illustrated in FIG. 4, the probability of detecting the quantized symbol is a value calculated by setting the length of the uniform spacing to 1. Alternatively, when symbols are grouped with the non-uniform spacing as illustrated in FIG. 5, the probability of detecting the quantized symbol is a value calculated by setting the length of a spacing of a group to which the quantized symbol belongs to 1.
[156] In operation 1230, a bitstream is generated by arithmetic-encoding the codeword detected in operation 1200 and the difference between the representative value indicated by the detected codeword and the value indicated by the quantized symbol, which was calculated in operation 1220, using the probability of extracting each code of the detected codeword, which was detected in operation 1210, and the probability of detecting the quantized symbol in a relevant duration, which was calculated in operation 1220.
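As a toy illustration of operation 1230 (in Python; this only tracks the width of the arithmetic-coding interval to show the resulting ideal code length, and is not a full arithmetic coder with renormalization or bit output):

```python
import math

def interval_width(code_probs, symbol_prob):
    """Width of the arithmetic-coding interval after coding each code
    of the codeword and then the symbol offset inside its group."""
    width = 1.0
    for p in code_probs:
        width *= p  # each code narrows the interval by its probability
    return width * symbol_prob  # the offset narrows it once more

# Codeword '110' with the probabilities {0.4, 0.5, 0.5} detected in
# operation 1210, followed by an offset inside a group of spacing 2
# (within-group probability 0.5 from operation 1220).
width = interval_width([0.4, 0.5, 0.5], 0.5)
print(round(-math.log2(width), 2))  # ideal code length in bits -> 4.32
```

The final interval width is the product of all the probabilities used, so its negative base-2 logarithm approximates the number of bits the arithmetic encoder emits for this symbol.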
[157] FIG. 13 is a flowchart illustrating an entropy encoding method based on a tree structure according to another embodiment of the present general inventive concept.
[158] Referring to FIG. 13, a quantized symbol is received, a codeword corresponding to the quantized symbol is searched for, and the found codeword is output in operation 1300. An encoding end groups predetermined symbols and stores predetermined codewords corresponding to respective groups.
[159] When symbols are grouped, the symbols can be grouped with a uniform spacing. FIG. 4 illustrates that symbols are grouped with the uniform spacing. In FIG. 4, symbols are grouped corresponding to each codeword {2, 0, 1, 3, 15, 7, 31, 63, 23, 127, 55, 255} with the uniform spacing of 4.
[160] Alternatively, when symbols are grouped, the symbols can be grouped with a non-uniform spacing. FIG. 5 illustrates an example in which symbols are grouped with the non-uniform spacing. In FIG. 5, symbols are grouped corresponding to each codeword {2, 0, 1, 3, 15, 7, 31, 63, 23, 127, 55, 255} with the non-uniform spacing of {1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 32, 1<<31}. Accordingly, a length of each codeword is {2, 2, 2, 3, 5, 5, 6, 7, 6, 8, 6, 8}, a starting symbol of symbols contained in a symbol group of each codeword is {0, 1, 2, 3, 4, 5, 6, 7, 8, 10, 12, 44}, and a probability assigned to each codeword is {4973, 5684, 2580, 1243, 675, 387, 236, 158, 183, 99, 162, 3}. [161] When the symbols are grouped with the non-uniform spacing, symbols contained in a duration in which probabilities significantly vary are grouped with a narrow spacing, and symbols contained in a duration in which probabilities insignificantly vary are grouped with a wide spacing. For example, symbols of which a slope of a PDF is within a pre-set range can be grouped together. In FIG. 5, the spacing is set to 1 for each of durations 0 through 8, in which probabilities significantly vary; the spacing is set to 2 for each of durations 8 through 12, in which probabilities vary less significantly; and the spacing is set to 32 and 1<<31 for durations 12 and above, in which probabilities vary still less significantly.
[162] In operation 1310, a predetermined tree structure is selected from among a plurality of tree structures stored in the encoding end by analyzing a context of previously quantized symbols. The encoding end stores the plurality of tree structures having the same structure, wherein an existence probability of each code is differently assigned to each node according to the plurality of tree structures. Accordingly, the plurality of tree structures stored in the encoding end have the same structure in which only the existence probability of each code is differently assigned to each node.
[163] The tree structure has a plurality of nodes, each node having a pre-set probability of extracting code '0' and a pre-set probability of extracting code '1'. For example, as illustrated in FIG. 7, in each node, a probability of extracting code '0' is assigned in the upper direction, and a probability of extracting code '1' is assigned in the lower direction. In FIG. 7, PDF1 and PDF2 having the same tree structure in which an existence probability of each code is differently assigned to each node are stored.
[164] When the context of the previously quantized symbols is analyzed in operation 1310, the predetermined tree structure can be selected by analyzing the context of quantized symbols of a previous frame and then analyzing a context of quantized symbols of a current frame. If data of the previous frame does not exist, the predetermined tree structure can be selected by analyzing only the context of the quantized symbols of the current frame.
[165] FIG. 8A is a conceptual diagram of selecting a tree structure by analyzing a context of a previous frame and a context of a current frame in operation 1310. In FIG. 8A, PC denotes a value corresponding to the context of the previous frame, CC denotes a value corresponding to the context of the current frame, and ID denotes an identification number of each tree structure stored in the encoding end. If it is detected in operation 1310 that PC=0 and CC=1, a tree structure having ID '1' is selected in operation 1310 from among the plurality of tree structures stored in the encoding end. However, if data of the previous frame does not exist, the predetermined tree structure is selected in operation 1310 using only CC, which is a value of the context of the current frame, as illustrated in FIG. 8B. [166] In operation 1320, a probability of extracting each code of the codeword detected in operation 1300 is detected using the tree structure selected in operation 1310.
[167] In operation 1320, the probability of extracting each code of the detected codeword can be calculated using a probability of extracting a code assigned to each node in the selected tree structure as represented by Equations 17 and 18.
[168] P_i(0) = p_i(0) / (p_i(0) + p_i(1)) (17)
[169] Here, p_i(0) denotes the probability of extracting code '0' assigned to the i-th node.
[170] P_i(1) = p_i(1) / (p_i(0) + p_i(1)) (18)
[171] Here, p_i(1) denotes the probability of extracting code '1' assigned to the i-th node.
[172] As an example of FIG. 7, when a codeword '110' is detected in operation 1300, and when a tree structure corresponding to PDF1 is selected in operation 1310, a probability of extracting the first code '1' is detected as P_1(1) = 0.4 by use of the tree structure illustrated in FIG. 7. A probability of extracting the second code '1' existing next to the first code is calculated using Equation 19.
[173] P_2(1) = p_2(1) / (p_2(0) + p_2(1)) = 0.2 / 0.4 = 0.5 (19)
[174] A probability of extracting the third code '0' existing next to the codes '11' is calculated using Equation 20.
[175] P_3(0) = p_3(0) / (p_3(0) + p_3(1)) = 0.1 / 0.2 = 0.5 (20)
[176] Thus, when the codeword '110' is detected in operation 1300, {P_1(1), P_2(1), P_3(0)} = {0.4, 0.5, 0.5} is detected in operation 1320.
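The calculation of Equations 17 through 20 can be sketched as follows. The per-node branch probabilities below are assumptions chosen only to be consistent with the worked example for codeword '110' under PDF1 (their sums at each depth match the denominators 1.0, 0.4 and 0.2 used above); FIG. 7 itself is not reproduced in this text.

```python
# Sketch of P_i(x) = p_i(x) / (p_i(0) + p_i(1)): the probability of
# extracting each code of a codeword, from the branch probabilities of
# the nodes visited along the codeword's path. Assumed values, chosen to
# reproduce {0.4, 0.5, 0.5} for codeword '110' under PDF1.
pdf1_path = [(0.6, 0.4),   # root: first code '1'  -> P_1(1) = 0.4 / 1.0
             (0.2, 0.2),   # after '1':  code '1'  -> P_2(1) = 0.2 / 0.4
             (0.1, 0.1)]   # after '11': code '0'  -> P_3(0) = 0.1 / 0.2

def extraction_probabilities(codeword, path_probs):
    """P_i(x) = p_i(x) / (p_i(0) + p_i(1)) for each code x of the codeword."""
    probs = []
    for code, (p0, p1) in zip(codeword, path_probs):
        probs.append((p1 if code == '1' else p0) / (p0 + p1))
    return probs

print(extraction_probabilities('110', pdf1_path))  # [0.4, 0.5, 0.5]
```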
[177] However, when the codeword '110' is detected in operation 1300, and when a tree structure corresponding to PDF2 is selected in operation 1310, a probability of extracting the first code '1' is detected as P_1(1) = 0.87 by use of the tree structure illustrated in FIG. 7. A probability of extracting the second code '1' existing next to the first code is calculated using Equation 21.
[178] P_2(1) = p_2(1) / (p_2(0) + p_2(1)) = 0.99 (21)
[179] A probability of extracting the third code '0' existing next to the codes '11' is calculated using Equation 22.
[180] P_3(0) = p_3(0) / (p_3(0) + p_3(1)) = 0.30 (22)
[181] Thus, when the codeword '110' is detected in operation 1300, {P_1(1), P_2(1), P_3(0)} = {0.87, 0.99, 0.30} is detected in operation 1320.
[182] Thus, even when the same codeword '110' is detected in operation 1300 and the encoding end stores the same tree structures, since different probabilities are assigned to each node, {P_1(1), P_2(1), P_3(0)} = {0.4, 0.5, 0.5} is output in operation 1320 if the tree structure corresponding to PDF1 is selected in operation 1310, whereas {P_1(1), P_2(1), P_3(0)} = {0.87, 0.99, 0.30} is output in operation 1320 if the tree structure corresponding to PDF2 is selected in operation 1310.
[183] In operation 1330, a difference between a representative value indicated by the codeword detected in operation 1300 and a value indicated by the quantized symbol input in operation 1300 is calculated. The representative value indicated by the codeword is a value pre-set to the codeword to be representative of symbols contained in the codeword according to a predetermined condition, such as the smallest value or a mean value of the symbols contained in the codeword.
[184] In operation 1330, a probability of detecting the quantized symbol input in operation 1300 from among the symbols contained in the codeword detected in operation 1300 is also calculated. When symbols are grouped with the uniform spacing as illustrated in FIG. 4, the probability of detecting the quantized symbol is a value calculated by setting the length of the uniform spacing to 1. Alternatively, when symbols are grouped with the non-uniform spacing as illustrated in FIG. 5, the probability of detecting the quantized symbol is a value calculated by setting the length of a spacing of a group to which the quantized symbol belongs to 1.
[185] In operation 1340, a bitstream is generated by arithmetic-encoding the codeword detected in operation 1300 and the difference between the representative value indicated by the detected codeword and the value indicated by the quantized symbol, which was calculated in operation 1330, using the probability of extracting each code of the detected codeword, which was detected in operation 1320, and the probability of detecting the quantized symbol in a relevant duration, which was calculated in operation 1330.
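Operation 1340 can be illustrated with a minimal interval-narrowing sketch rather than a full arithmetic coder. Each probability (the per-code probabilities from operation 1320 and the within-group probability from operation 1330) narrows the current interval, so the final interval width is their product and the ideal code length is -log2(width) bits. The cumulative sub-intervals chosen below are illustrative assumptions.

```python
# Sketch (not the full arithmetic coder) of operation 1340: encoding
# codeword '110' under PDF1 ({0.4, 0.5, 0.5}) followed by a within-group
# offset with probability 0.25 (the uniform spacing of 4 from FIG. 4).
import math

def narrow(interval, cum_low, cum_high):
    # Map the chosen cumulative sub-range into the current interval.
    low, high = interval
    span = high - low
    return (low + span * cum_low, low + span * cum_high)

interval = (0.0, 1.0)
# Assumed cumulative sub-intervals of widths 0.4, 0.5, 0.5 and 0.25.
for cum in [(0.6, 1.0), (0.5, 1.0), (0.0, 0.5), (0.25, 0.5)]:
    interval = narrow(interval, *cum)

width = interval[1] - interval[0]
print(width)              # ~0.025 = 0.4 * 0.5 * 0.5 * 0.25
print(-math.log2(width))  # ~5.32 ideal bits
```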
[186] FIG. 14 is a flowchart illustrating an entropy encoding method based on a tree structure according to another embodiment of the present general inventive concept. [187] Referring to FIG. 14, a quantized symbol is received, a codeword corresponding to the quantized symbol is searched for, and the found codeword is output in operation 1400. An encoding end groups predetermined symbols and stores predetermined codewords corresponding to respective groups.
[188] When symbols are grouped, the symbols can be grouped with a uniform spacing. FIG. 4 illustrates symbols grouped with a uniform spacing. In FIG. 4, symbols are grouped corresponding to each codeword {2, 0, 1, 3, 15, 7, 31, 63, 23, 127, 55, 255} with a uniform spacing of 4.
[189] Alternatively, when symbols are grouped, the symbols can be grouped with a non-uniform spacing. FIG. 5 illustrates symbols grouped with a non-uniform spacing. In FIG. 5, symbols are grouped corresponding to each codeword {2, 0, 1, 3, 15, 7, 31, 63, 23, 127, 55, 255} with the non-uniform spacings {1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 32, 1<<31}. Accordingly, the length of each codeword is {2, 2, 2, 3, 5, 5, 6, 7, 6, 8, 6, 8}, a starting symbol of the symbols contained in the symbol group of each codeword is {0, 1, 2, 3, 4, 5, 6, 7, 8, 10, 12, 44}, and a probability assigned to each codeword is {4973, 5684, 2580, 1243, 675, 387, 236, 158, 183, 99, 162, 3}.
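The non-uniform grouping of FIG. 5 can be sketched directly from the start symbols and spacings given above. As one of the options the text names for the representative value, the sketch takes the smallest symbol of the group, so the difference of operation 1430 is the symbol's offset within its group, and the within-group probability follows from normalizing the group width to 1.

```python
# Sketch of the FIG. 5 non-uniform grouping: map a symbol to its group
# (codeword), its offset from the representative value, and the
# within-group probability. Start symbols and spacings come from the
# text; representative value = smallest symbol of the group (assumption).
import bisect

starts = [0, 1, 2, 3, 4, 5, 6, 7, 8, 10, 12, 44]      # first symbol per group
widths = [1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 32, 1 << 31]  # non-uniform spacings

def group_of(symbol):
    """Return (group index, difference from representative, within-group probability)."""
    g = bisect.bisect_right(starts, symbol) - 1
    diff = symbol - starts[g]    # encoded alongside the codeword
    prob = 1.0 / widths[g]       # group width normalized to 1
    return g, diff, prob

print(group_of(11))  # symbol 11 lies in the width-2 group starting at 10
```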
[190] When the symbols are grouped with the non-uniform spacing, symbols contained in a duration in which probabilities significantly vary are grouped with a narrow spacing, and symbols contained in a duration in which probabilities insignificantly vary are grouped with a wide spacing. For example, symbols of which a slope of a PDF is within a pre-set range can be grouped together. In FIG. 5, while the spacing is set to 1 for each of durations 0 through 8 in which probabilities significantly vary, the spacing is set to 2 for each of durations 8 through 12 in which probabilities vary less significantly, and the spacing is set to 32 and 1<<31 for durations 12 and above in which probabilities vary still less significantly.
[191] In operation 1410, a predetermined tree structure is selected from among a plurality of tree structures stored in the encoding end by analyzing a context of previously quantized symbols. The encoding end stores the plurality of different tree structures in which an existence probability of each code is assigned to each node. While the plurality of tree structures stored in the embodiment of FIG. 13 have the same structure in which only the existence probability of each code is differently assigned to each node, the plurality of tree structures stored in the embodiment of FIG. 14 are different from each other in their structure.
[192] When the context of the previously quantized symbols is analyzed in operation 1410, the predetermined tree structure can be selected by analyzing the context of quantized symbols of a previous frame and then analyzing a context of quantized symbols of a current frame. If data of the previous frame does not exist, the predetermined tree structure can be selected by analyzing only the context of the quantized symbols of the current frame.
[193] FIG. 8A illustrates a conceptual diagram of selecting a tree structure by analyzing a context of a previous frame and a context of a current frame in operation 1410. In FIG. 8A, PC denotes a value corresponding to the context of the previous frame, CC denotes a value corresponding to the context of the current frame, and ID denotes an identification number of each tree structure stored in the probability storage unit 210. If it is detected in operation 1410 that PC=0 and CC=1, a tree structure having ID '1' is selected in operation 1410 from among the plurality of tree structures stored in the encoding end. However, if data of the previous frame does not exist, the predetermined tree structure is selected in operation 1410 using only CC, which is the value of the context of the current frame, as illustrated in FIG. 8B.
[194] In operation 1420, a probability of extracting each code of the codeword detected in operation 1400 is calculated using the tree structure selected in operation 1410.
[195] In operation 1420, the probability of extracting each code of the detected codeword can be calculated using a probability of extracting a code assigned to each node in the selected tree structure as represented by Equations 24 and 25.
[196] P_i(0) = p_i(0) / (p_i(0) + p_i(1)) (24)
[197] Here, p_i(0) denotes the probability of extracting code '0' assigned to the i-th node.
[198] P_i(1) = p_i(1) / (p_i(0) + p_i(1)) (25)
[199] Here, p_i(1) denotes the probability of extracting code '1' assigned to the i-th node.
[200] In operation 1430, a difference between a representative value indicated by the codeword detected in operation 1400 and a value indicated by the quantized symbol input in operation 1400 is calculated. The representative value indicated by the codeword is a value pre-set to the codeword to be representative of symbols contained in the codeword according to a predetermined condition, such as the smallest value or a mean value of the symbols contained in the codeword.
[201] In operation 1430, a probability of detecting the quantized symbol input in operation 1400 from among the symbols contained in the codeword detected in operation 1400 is also calculated. When symbols are grouped with the uniform spacing as illustrated in FIG. 4, the probability of detecting the quantized symbol is a value calculated by setting the length of the uniform spacing to 1. Alternatively, when symbols are grouped with the non-uniform spacing as illustrated in FIG. 5, the probability of detecting the quantized symbol is a value calculated by setting the length of a spacing of a group to which the quantized symbol belongs to 1.
[202] In operation 1440, a bitstream is generated by arithmetic-encoding the codeword detected in operation 1400 and the difference between the representative value indicated by the detected codeword and the value indicated by the quantized symbol, which was calculated in operation 1430, using the probability of extracting each code of the detected codeword, which was detected in operation 1420, and the probability of detecting the quantized symbol in a relevant duration, which was calculated in operation 1430.
[203] FIG. 15 is a flowchart illustrating an entropy decoding method based on a tree structure according to an embodiment of the present general inventive concept.
[204] Referring to FIG. 15, a bitstream is received from an encoding end and arithmetic- decoded in operation 1500.
[205] A codeword is detected in operation 1510 using a probability of detecting each code of the codeword, which is a result of the arithmetic decoding performed in operation 1500, based on a tree structure in which an existence probability of each code is assigned to each node.
[206] In operation 1520, a difference between a representative value indicated by the codeword detected in operation 1510 and a value indicated by a symbol encoded in the encoding end is detected using the result of the arithmetic decoding performed in operation 1500.
[207] In operation 1530, a predetermined symbol is detected from among symbols grouped corresponding to the codeword detected in operation 1510 using the difference detected in operation 1520.
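The symbol recovery of operations 1510 through 1530 reduces to adding the decoded difference to the representative value of the decoded group. The sketch below reuses the FIG. 5 start symbols and, as an assumption, takes the representative value of a group to be its smallest symbol, one of the options the text names.

```python
# Hedged sketch of operation 1530: once arithmetic decoding yields the
# codeword (symbol group) and the difference, the symbol is the group's
# representative value plus that difference. Representative value =
# smallest symbol of the group (assumption); starts come from FIG. 5.
starts = [0, 1, 2, 3, 4, 5, 6, 7, 8, 10, 12, 44]

def recover_symbol(group_index, difference):
    # representative value + decoded difference
    return starts[group_index] + difference

print(recover_symbol(9, 1))  # group starting at 10, offset 1 -> symbol 11
```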
[208] FIG. 16 is a flowchart illustrating an entropy decoding method based on a tree structure according to another embodiment of the present general inventive concept. [209] Referring to FIG. 16, a bitstream is received from an encoding end and arithmetic- decoded in operation 1600.
[210] In operation 1610, a tree structure used in the encoding end is determined from among a plurality of tree structures stored in a decoding end. The decoding end stores a plurality of tree structures having the same structure, wherein an existence probability of each code is differently assigned to each node according to the plurality of tree structures. Accordingly, the plurality of tree structures stored in the decoding end have the same structure in which only the existence probability of each code is differently assigned to each node.
[211] As an example of a method of determining a tree structure in operation 1610, the tree structure used in the encoding end can be determined by receiving an index indicating the tree structure used in the encoding end from the encoding end.
[212] In operation 1620, a codeword corresponding to a probability of detecting each code of the codeword, which is a result of the arithmetic decoding performed in operation 1600, is detected using the tree structure determined in operation 1610.
[213] In operation 1630, a difference between a representative value indicated by the codeword detected in operation 1620 and a value indicated by a symbol encoded in the encoding end is detected using the result of the arithmetic decoding performed in operation 1600.
[214] In operation 1640, a predetermined symbol is detected from among symbols grouped corresponding to the codeword detected in operation 1620 using the difference detected in operation 1630.
[215] FIG. 17 is a flowchart illustrating an entropy decoding method based on a tree structure according to another embodiment of the present general inventive concept.
[216] Referring to FIG. 17, a bitstream is received from an encoding end and arithmetic- decoded in operation 1700.
[217] In operation 1710, a tree structure used in the encoding end is determined from among a plurality of tree structures stored in a decoding end. The decoding end stores a plurality of different tree structures in which an existence probability of each code is assigned to each node. While the plurality of tree structures stored in the decoding end in the embodiment of FIG. 16 have the same structure in which only the existence probability of each code is differently assigned to each node, the plurality of tree structures stored in the decoding end in the embodiment of FIG. 17 are different from each other in their structure.
[218] As an example of a method of determining a tree structure in operation 1710, the tree structure used in the encoding end can be determined by receiving an index indicating the tree structure used in the encoding end from the encoding end.
[219] In operation 1720, a codeword corresponding to a probability of detecting each code of the codeword, which is a result of the arithmetic decoding performed in operation 1700, is detected using the tree structure determined in operation 1710.
[220] In operation 1730, a difference between a representative value indicated by the codeword detected in operation 1720 and a value indicated by a symbol encoded in the encoding end is detected using the result of the arithmetic decoding performed in operation 1700.
[221] In operation 1740, a predetermined symbol is detected from among symbols grouped corresponding to the codeword detected in operation 1720 using the difference detected in operation 1730.
[222] The general inventive concept can also be embodied as computer-readable codes on a computer-readable medium. The computer-readable medium can include a computer- readable recording medium and a computer-readable transmission medium. The computer readable recording medium is any data storage device that can store data which can be thereafter read by a computer system. Examples of the computer readable recording medium include read-only memory (ROM), random- access memory (RAM), CD-ROMs, magnetic tapes, floppy disks, and optical data storage devices. The computer-readable recording medium can also be distributed over network coupled computer systems so that the computer-readable code is stored and executed in a distributed fashion. The computer-readable transmission medium can transmit carrier waves or signals (e.g., wired or wireless data transmission through the Internet). Also, functional programs, codes, and code segments to accomplish the present general inventive concept can be easily construed by programmers skilled in the art to which the present general inventive concept pertains.
[223] As described above, according to various embodiments of the present general inventive concept, entropy encoding and decoding are performed based on a tree structure. By doing this, coding efficiency is increased, complexity is reduced, and additional information can be reduced. In addition, even when the present general inventive concept is applied to scalable coding, coding efficiency can be prevented from being decreased.
[224] Although a few embodiments of the present general inventive concept have been illustrated and described, it will be appreciated by those skilled in the art that changes may be made in these embodiments without departing from the principles and spirit of the general inventive concept, the scope of which is defined in the appended claims and their equivalents.

Claims

[1] What is claimed is:
[2] 1. An entropy encoding apparatus, comprising:
a codeword detector to detect a codeword corresponding to a specific symbol from among stored codewords, each stored codeword corresponding to predetermined grouped symbols;
a probability detector to detect a probability value corresponding to each code of the detected codeword based on a tree structure in which an existence probability of each code is assigned to each node;
a difference calculator to calculate a difference between a representative value indicated by the detected codeword and a value indicated by the specific symbol and to calculate a probability of detecting the specific symbol from among symbols contained in the detected codeword; and
an arithmetic encoder to arithmetically encode the detected codeword and the calculated difference using the calculated probabilities.
[3] 2. The entropy encoding apparatus of claim 1, wherein the symbols are grouped with a uniform spacing.
[4] 3. The entropy encoding apparatus of claim 1, wherein the symbols are grouped with a non-uniform spacing.
[5] 4. The entropy encoding apparatus of claim 3, wherein the symbols are grouped with a wide spacing in a duration in which probabilities significantly vary and grouped with a narrow spacing in a duration in which probabilities insignificantly vary.
[6] 5. An entropy encoding apparatus, comprising:
a codeword detector to detect a codeword corresponding to a specific symbol from among stored codewords, each stored codeword corresponding to predetermined grouped symbols;
a selector to select a predetermined tree structure from among a plurality of tree structures in which an existence probability of each code is differently assigned to each node, based on a context of previous symbols;
a probability detector to detect a probability value corresponding to each code of the detected codeword based on the selected tree structure;
a difference calculator to calculate a difference between a representative value indicated by the detected codeword and a value indicated by the specific symbol and to calculate a probability of detecting the specific symbol from among symbols contained in the detected codeword; and
an arithmetic encoder to arithmetically encode the detected codeword and the calculated difference using the calculated probabilities.
[7] 6. The entropy encoding apparatus of claim 5, wherein the selector selects a tree structure by analyzing the context of previous symbols and then analyzing a context of current symbols.
[8] 7. The entropy encoding apparatus of claim 5, wherein the selector selects a tree structure by analyzing a context of current symbols, if data of the previous symbols does not exist.
[9] 8. The entropy encoding apparatus of claim 5, wherein the symbols are grouped with a uniform spacing.
[10] 9. The entropy encoding apparatus of claim 5, wherein the symbols are grouped with a non-uniform spacing.
[11] 10. The entropy encoding apparatus of claim 9, wherein the symbols are grouped with a wide spacing in a duration in which probabilities significantly vary and grouped with a narrow spacing in a duration in which probabilities insignificantly vary.
[12] 11. An entropy encoding apparatus, comprising:
a codeword detector to detect a codeword corresponding to a specific symbol from among stored codewords, each stored codeword corresponding to predetermined grouped symbols;
a selector to select a predetermined tree structure from among a plurality of different tree structures in which an existence probability of each code is assigned to each node, based on a context of previous symbols;
a probability detector to detect a probability value corresponding to each code of the detected codeword based on the selected tree structure;
a difference calculator to calculate a difference between a representative value indicated by the detected codeword and a value indicated by the specific symbol and to calculate a probability of detecting the specific symbol from among symbols contained in the detected codeword; and
an arithmetic encoder to arithmetically encode the detected codeword and the calculated difference using the calculated probabilities.
[13] 12. The entropy encoding apparatus of claim 11, wherein the selector selects a tree structure by analyzing the context of the previous symbols and then analyzing a context of current symbols.
[14] 13. The entropy encoding apparatus of claim 11, wherein the selector selects a tree structure by analyzing a context of current symbols, if data of the previous symbols does not exist.
[15] 14. The entropy encoding apparatus of claim 11, wherein the symbols are grouped with a uniform spacing.
[16] 15. The entropy encoding apparatus of claim 11, wherein the symbols are grouped with a non-uniform spacing.
[17] 16. The entropy encoding apparatus of claim 15, wherein the symbols are grouped with a wide spacing in a duration in which probabilities significantly vary and grouped with a narrow spacing in a duration in which probabilities insignificantly vary.
[18] 17. An entropy decoding apparatus, comprising:
a codeword detector to detect a codeword by performing arithmetic-decoding based on a tree structure in which an existence probability of each code is assigned to each node;
a difference detector to detect a difference between a representative value indicated by the detected codeword and a value indicated by a symbol specified in an encoding end by performing arithmetic-decoding; and
a symbol detector to detect the specified symbol from among symbols grouped corresponding to the detected codeword using the detected difference.
[19] 18. The entropy decoding apparatus of claim 17, wherein the symbols are grouped with a uniform spacing.
[20] 19. The entropy decoding apparatus of claim 17, wherein the symbols are grouped with a non-uniform spacing.
[21] 20. The entropy decoding apparatus of claim 19, wherein the symbols are grouped with a wide spacing in a duration in which probabilities significantly vary and grouped with a narrow spacing in a duration in which probabilities insignificantly vary.
[22] 21. An entropy decoding apparatus, comprising:
a tree structure determiner to determine a tree structure used in an encoding end from among a plurality of tree structures in which an existence probability of each code is differently assigned to each node;
a codeword detector to detect a codeword by arithmetic-decoding a bitstream based on the determined tree structure;
a difference detector to detect a difference between a representative value indicated by the detected codeword and a value indicated by a symbol specified in an encoding end by performing arithmetic-decoding; and
a symbol detector to detect the specified symbol from among symbols grouped corresponding to the detected codeword using the detected difference.
[23] 22. The entropy decoding apparatus of claim 21, wherein the symbols are grouped with a uniform spacing.
[24] 23. The entropy decoding apparatus of claim 21, wherein the symbols are grouped with a non-uniform spacing.
[25] 24. The entropy decoding apparatus of claim 23, wherein the symbols are grouped with a wide spacing in a duration in which probabilities significantly vary and grouped with a narrow spacing in a duration in which probabilities insignificantly vary.
[26] 25. An entropy decoding apparatus, comprising:
a tree structure determiner to determine a tree structure used in an encoding end from among a plurality of different tree structures in which an existence probability of each code is assigned to each node;
a codeword detector to detect a codeword by performing arithmetic-decoding based on the determined tree structure;
a difference detector to detect a difference between a representative value indicated by the detected codeword and a value indicated by a symbol specified in an encoding end by performing arithmetic-decoding; and
a symbol detector to detect the specified symbol from among symbols grouped corresponding to the detected codeword using the detected difference.
[27] 26. The entropy decoding apparatus of claim 25, wherein the symbols are grouped with a uniform spacing.
[28] 27. The entropy decoding apparatus of claim 25, wherein the symbols are grouped with a non-uniform spacing.
[29] 28. The entropy decoding apparatus of claim 27, wherein the symbols are grouped with a wide spacing in a duration in which probabilities significantly vary and grouped with a narrow spacing in a duration in which probabilities insignificantly vary.
[30] 29. An entropy encoding method, comprising:
detecting a codeword corresponding to a specific symbol from among stored codewords, each stored codeword corresponding to predetermined grouped symbols;
detecting a probability value corresponding to each code of the detected codeword based on a tree structure in which an existence probability of each code is assigned to each node;
calculating a difference between a representative value indicated by the detected codeword and a value indicated by the specific symbol and calculating a probability of detecting the specific symbol from among symbols contained in the detected codeword; and
arithmetically encoding the detected codeword and the calculated difference using the calculated probabilities.
[31] 30. The entropy encoding method of claim 29, wherein the symbols are grouped with a uniform spacing.
[32] 31. The entropy encoding method of claim 29, wherein the symbols are grouped with a non-uniform spacing.
[33] 32. The entropy encoding method of claim 31, wherein the symbols are grouped with a wide spacing in a duration in which probabilities significantly vary and grouped with a narrow spacing in a duration in which probabilities insignificantly vary.
[34] 33. An entropy encoding method, comprising:
detecting a codeword corresponding to a specific symbol from among stored codewords, each stored codeword corresponding to predetermined grouped symbols;
selecting a predetermined tree structure from among a plurality of tree structures in which an existence probability of each code is differently assigned to each node, based on a context of previous symbols;
detecting a probability value corresponding to each code of the detected codeword based on the selected tree structure;
calculating a difference between a representative value indicated by the detected codeword and a value indicated by the specific symbol and calculating a probability of detecting the specific symbol from among symbols contained in the detected codeword; and
arithmetically encoding the detected codeword and the calculated difference using the calculated probabilities.
[35] 34. The entropy encoding method of claim 33, wherein the selecting comprises: selecting a tree structure by analyzing the context of previous symbols and then analyzing a context of current symbols.
[36] 35. The entropy encoding method of claim 33, wherein the selecting comprises: selecting a tree structure by analyzing a context of current symbols, if data of the previous symbols does not exist.
[37] 36. The entropy encoding method of claim 33, wherein the symbols are grouped with a uniform spacing.
[38] 37. The entropy encoding method of claim 33, wherein the symbols are grouped with a non-uniform spacing.
[39] 38. The entropy encoding method of claim 37, wherein the symbols are grouped with a wide spacing in a duration in which probabilities significantly vary and grouped with a narrow spacing in a duration in which probabilities insignificantly vary.
[40] 39. An entropy encoding method, comprising:
detecting a codeword corresponding to a specific symbol from among stored codewords, each stored codeword corresponding to predetermined grouped symbols;
selecting a predetermined tree structure from among a plurality of different tree structures in which an existence probability of each code is assigned to each node, based on a context of previous symbols;
detecting a probability value corresponding to each code of the detected codeword based on the selected tree structure;
calculating a difference between a representative value indicated by the detected codeword and a value indicated by the specific symbol and calculating a probability of detecting the specific symbol from among symbols contained in the detected codeword; and
arithmetically encoding the detected codeword and the calculated difference using the calculated probabilities.
[41] 40. The entropy encoding method of claim 39, wherein the selecting comprises: selecting a tree structure by analyzing the context of the previous symbols and then analyzing a context of current symbols.
[42] 41. The entropy encoding method of claim 39, wherein the selecting comprises: selecting a tree structure by analyzing a context of current symbols, if data of the previous symbols does not exist.
[43] 42. The entropy encoding method of claim 39, wherein the symbols are grouped with a uniform spacing.
[44] 43. The entropy encoding method of claim 39, wherein the symbols are grouped with a non-uniform spacing.
[45] 44. The entropy encoding method of claim 43, wherein the symbols are grouped with a wide spacing in a duration in which probabilities significantly vary and grouped with a narrow spacing in a duration in which probabilities insignificantly vary.
[46] 45. An entropy decoding method, comprising:
detecting a codeword by performing arithmetic-decoding based on a tree structure in which an existence probability of each code is assigned to each node;
detecting a difference between a representative value indicated by the detected codeword and a value indicated by a symbol specified in an encoding end by performing arithmetic-decoding; and
detecting the specified symbol from among symbols grouped corresponding to the detected codeword using the detected difference.
[47] 46. The entropy decoding method of claim 45, wherein the symbols are grouped with a uniform spacing.
[48] 47. The entropy decoding method of claim 45, wherein the symbols are grouped with a non-uniform spacing.
[49] 48. The entropy decoding method of claim 47, wherein the symbols are grouped with a wide spacing in a duration in which probabilities significantly vary and grouped with a narrow spacing in a duration in which probabilities insignificantly vary.
[50] 49. An entropy decoding method, comprising:
determining a tree structure used in an encoding end from among a plurality of tree structures in which an existence probability of each code is differently assigned to each node;
detecting a codeword by arithmetic-decoding a bitstream based on the determined tree structure;
detecting a difference between a representative value indicated by the detected codeword and a value indicated by a symbol specified in an encoding end by performing arithmetic-decoding; and
detecting the specified symbol from among symbols grouped corresponding to the detected codeword using the detected difference.
[51] 50. The entropy decoding method of claim 49, wherein the symbols are grouped with a uniform spacing.
[52] 51. The entropy decoding method of claim 49, wherein the symbols are grouped with a non-uniform spacing.
[53] 52. The entropy decoding method of claim 51, wherein the symbols are grouped with a wide spacing in a duration in which probabilities significantly vary and grouped with a narrow spacing in a duration in which probabilities insignificantly vary.
[54] 53. An entropy decoding method, comprising: determining a tree structure used in an encoding end from among a plurality of different tree structures in which an existence probability of each code is assigned to each node; detecting a codeword by arithmetic-decoding a bitstream based on the determined tree structure; detecting a difference between a representative value indicated by the detected codeword and a value indicated by a symbol specified in an encoding end by performing arithmetic-decoding; and detecting the specified symbol from among symbols grouped corresponding to the detected codeword using the detected difference.
[55] 54. The entropy decoding method of claim 53, wherein the symbols are grouped with a uniform spacing.
[56] 55. The entropy decoding method of claim 53, wherein the symbols are grouped with a non-uniform spacing.
[57] 56. The entropy decoding method of claim 55, wherein the symbols are grouped with a wide spacing in a duration in which probabilities significantly vary and grouped with a narrow spacing in a duration in which probabilities insignificantly vary.
[58] 57. A computer-readable recording medium having embodied thereon a computer program to execute a method, wherein the method comprises: detecting a codeword corresponding to a specific symbol from among stored codewords, each stored codeword corresponding to predetermined grouped symbols; detecting a probability value corresponding to each code of the detected codeword based on a tree structure in which an existence probability of each code is assigned to each node; calculating a difference between a representative value indicated by the detected codeword and a value indicated by the specific symbol and calculating a probability of detecting the specific symbol from among symbols contained in the detected codeword; and arithmetically encoding the detected codeword and the calculated difference using the calculated probabilities.
[59] 58. A computer-readable recording medium having embodied thereon a computer program to execute a method, wherein the method comprises: detecting a codeword corresponding to a specific symbol from among stored codewords, each stored codeword corresponding to predetermined grouped symbols; selecting a predetermined tree structure from among a plurality of tree structures in which an existence probability of each code is differently assigned to each node, based on a context of previous symbols; detecting a probability value corresponding to each code of the detected codeword based on the selected tree structure; calculating a difference between a representative value indicated by the detected codeword and a value indicated by the specific symbol and calculating a probability of detecting the specific symbol from among symbols contained in the detected codeword; and arithmetically encoding the detected codeword and the calculated difference using the calculated probabilities.
[60] 59. A computer-readable recording medium having embodied thereon a computer program to execute a method, wherein the method comprises: detecting a codeword corresponding to a specific symbol from among stored codewords, each stored codeword corresponding to predetermined grouped symbols; selecting a predetermined tree structure from among a plurality of different tree structures in which an existence probability of each code is assigned to each node, based on a context of previous symbols; detecting a probability value corresponding to each code of the detected codeword based on the selected tree structure; calculating a difference between a representative value indicated by the detected codeword and a value indicated by the specific symbol and calculating a probability of detecting the specific symbol from among symbols contained in the detected codeword; and arithmetically encoding the detected codeword and the calculated difference using the calculated probabilities.
[61] 60. A computer-readable recording medium having embodied thereon a computer program to execute a method, wherein the method comprises: detecting a codeword by performing arithmetic-decoding based on a tree structure in which an existence probability of each code is assigned to each node; detecting a difference between a representative value indicated by the detected codeword and a value indicated by a symbol specified in an encoding end by performing arithmetic-decoding; and detecting the specified symbol from among symbols grouped corresponding to the detected codeword using the detected difference.
[62] 61. A computer-readable recording medium having embodied thereon a computer program to execute a method, wherein the method comprises: determining a tree structure used in an encoding end from among a plurality of tree structures in which an existence probability of each code is differently assigned to each node; detecting a codeword by arithmetic-decoding a bitstream based on the determined tree structure; detecting a difference between a representative value indicated by the detected codeword and a value indicated by a symbol specified in an encoding end by performing arithmetic-decoding; and detecting the specified symbol from among symbols grouped corresponding to the detected codeword using the detected difference.
[63] 62. A computer-readable recording medium having embodied thereon a computer program to execute a method, wherein the method comprises: determining a tree structure used in an encoding end from among a plurality of different tree structures in which an existence probability of each code is assigned to each node; detecting a codeword by arithmetic-decoding a bitstream based on the determined tree structure; detecting a difference between a representative value indicated by the detected codeword and a value indicated by a symbol specified in an encoding end by performing arithmetic-decoding; and detecting the specified symbol from among symbols grouped corresponding to the detected codeword using the detected difference.
[64] 63. An entropy encoding apparatus, comprising: a codeword detector to detect a codeword corresponding to a respective symbol from one or more stored codewords, each stored codeword corresponding to predetermined grouped symbols; and a probability detector to detect a probability value corresponding to each code of the detected codeword based on at least one predetermined tree structure in which an existence probability of each code is assigned to each node.
[65] 64. The apparatus of claim 63, further comprising: a selector to select the predetermined tree structure from a plurality of predetermined tree structures based on a context of previous symbols.
[66] 65. The apparatus of claim 63, further comprising: a difference calculator to calculate a difference between a representative value indicated by the detected codeword and a value indicated by the respective symbol and to calculate a probability of detecting the respective symbol from one or more symbols contained in the detected codeword; and an arithmetic encoder to arithmetically encode the detected codeword and the calculated difference using the calculated probabilities.
[67] 66. An entropy decoding apparatus, comprising: a tree structure determiner to determine a tree structure used to encode a signal from one or more tree structures in which an existence probability of each code is differently assigned to each node; and a codeword detector to detect a codeword by arithmetic-decoding a bitstream based on the determined tree structure.
[68] 67. The apparatus of claim 66, further comprising: a difference detector to detect a difference between a representative value indicated by the detected codeword and a value indicated by a respective symbol used to encode the signal by performing arithmetic-decoding; and a symbol detector to detect the respective symbol from one or more symbols grouped corresponding to the detected codeword using the detected difference.
[69] 68. A method of encoding a signal, the method comprising: detecting a codeword corresponding to a respective symbol from one or more stored codewords in which each stored codeword corresponds to predetermined grouped symbols; and detecting a probability value corresponding to each code of the detected codeword based on at least one predetermined tree structure in which an existence probability of each code is assigned to each node.
[70] 69. A method of decoding a signal, the method comprising: determining a tree structure used to encode a signal from one or more tree structures in which an existence probability of each code is differently assigned to each node; and detecting a codeword by arithmetic-decoding a bitstream based on the determined tree structure.
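The claims above describe encoding a symbol as a codeword for its group plus the difference from the group's representative value, with the decoder reversing the two steps. The following minimal sketch illustrates only that grouping and round-trip idea; it is not the patented implementation. The arithmetic coding of the codeword and difference (driven by per-node tree probabilities in the claims) is deliberately omitted, and the helper names (`build_groups`, `encode_symbol`, `decode_symbol`) and the choice of the first group member as the representative value are illustrative assumptions.

```python
# Illustrative sketch only: symbols are grouped into codewords; the encoder
# emits (codeword, difference-from-representative) and the decoder recovers
# the symbol from that pair. A real codec would further arithmetic-code both
# parts using probabilities assigned to tree nodes; here they stay as plain
# integers.

def build_groups(symbols, group_size):
    """Partition consecutive symbol values; each group maps to one codeword."""
    return [symbols[i:i + group_size] for i in range(0, len(symbols), group_size)]

def encode_symbol(symbol, groups):
    """Return (codeword index, difference from the group's representative value)."""
    for cw, group in enumerate(groups):
        if symbol in group:
            representative = group[0]  # assumption: first member is the representative
            return cw, symbol - representative
    raise ValueError("symbol not covered by any group")

def decode_symbol(codeword, difference, groups):
    """Recover the specific symbol from the detected codeword and difference."""
    return groups[codeword][0] + difference

groups = build_groups(list(range(16)), 4)  # 16 symbol values, 4 per codeword
cw, diff = encode_symbol(10, groups)       # symbol 10 -> codeword 2, difference 2
assert decode_symbol(cw, diff, groups) == 10
```

Non-uniform spacing (claims 47 and 48) would simply make `build_groups` emit groups of varying sizes; the encode/decode round trip is unchanged.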
PCT/KR2008/001187 2007-03-08 2008-02-29 Entropy encoding and decoding apparatus and method based on tree structure WO2008108557A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP08723225A EP2127385A4 (en) 2007-03-08 2008-02-29 Entropy encoding and decoding apparatus and method based on tree structure
JP2009552581A JP4865872B2 (en) 2007-03-08 2008-02-29 Apparatus and method for entropy encoding and decoding

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2007-0023168 2007-03-08
KR1020070023168A KR101365989B1 (en) 2007-03-08 2007-03-08 Apparatus and method for entropy encoding and decoding based on tree structure

Publications (1)

Publication Number Publication Date
WO2008108557A1 true WO2008108557A1 (en) 2008-09-12

Family

ID=39738402

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2008/001187 WO2008108557A1 (en) 2007-03-08 2008-02-29 Entropy encoding and decoding apparatus and method based on tree structure

Country Status (5)

Country Link
US (1) US7528750B2 (en)
EP (1) EP2127385A4 (en)
JP (1) JP4865872B2 (en)
KR (1) KR101365989B1 (en)
WO (1) WO2008108557A1 (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8406546B2 (en) * 2009-06-09 2013-03-26 Sony Corporation Adaptive entropy coding for images and videos using set partitioning in generalized hierarchical trees
KR101730200B1 (en) * 2009-07-01 2017-04-25 톰슨 라이센싱 Methods for arithmetic coding and decoding
KR20220127367A (en) 2009-07-02 2022-09-19 인터디지털 브이씨 홀딩스 인코포레이티드 Methods and apparatus for video encoding and decoding binary sets using adaptive tree selection
CA2778323C (en) 2009-10-20 2016-09-20 Fraunhofer-Gesellschaft Zur Foerderung Der Angewandten Forschung E.V. Audio encoder, audio decoder, method for encoding an audio information, method for decoding an audio information and computer program using a detection of a group of previously-decoded spectral values
SG182467A1 (en) * 2010-01-12 2012-08-30 Fraunhofer Ges Forschung Audio encoder, audio decoder, method for encoding an audio information, method for decoding an audio information and computer program using a hash table describing both significant state values and interval boundaries
US8063801B2 (en) * 2010-02-26 2011-11-22 Research In Motion Limited Encoding and decoding methods and devices using a secondary codeword indicator
EP2362547B1 (en) * 2010-02-26 2017-10-11 BlackBerry Limited Encoding and decoding methods and devices using a secondary codeword indicator
JP2024515174A (en) * 2021-04-15 2024-04-05 エルジー エレクトロニクス インコーポレイティド Point cloud data transmission method, point cloud data transmission device, point cloud data receiving method and point cloud data receiving device

Citations (3)

Publication number Priority date Publication date Assignee Title
US20020037111A1 (en) * 1996-03-19 2002-03-28 Mitsubishi Denki Kabushiki Kaisha Encoding apparatus, decoding apparatus, encoding method, and decoding method
US20030202710A1 (en) * 2002-04-25 2003-10-30 Ngai-Man Cheung Entropy coding scheme for video coding
US20070046504A1 (en) * 2005-07-21 2007-03-01 Nokia Corporation Adaptive variable length codes for independent variables

Family Cites Families (4)

Publication number Priority date Publication date Assignee Title
KR960015195A (en) * 1994-10-31 1996-05-22 배순훈 Tree structure binary operation coding device
KR100370416B1 (en) * 1996-10-31 2003-04-08 삼성전기주식회사 Encoding/decoding method for recording/reproducing high-density data and system based thereon
US7274671B2 (en) * 2001-02-09 2007-09-25 Boly Media Communications, Inc. Bitwise adaptive encoding using prefix prediction
US7265692B2 (en) * 2004-01-29 2007-09-04 Hewlett-Packard Development Company, L.P. Data compression system based on tree models

Non-Patent Citations (1)

Title
See also references of EP2127385A4 *

Also Published As

Publication number Publication date
JP2010520696A (en) 2010-06-10
KR20080082376A (en) 2008-09-11
EP2127385A4 (en) 2010-09-29
US20080218390A1 (en) 2008-09-11
EP2127385A1 (en) 2009-12-02
KR101365989B1 (en) 2014-02-25
JP4865872B2 (en) 2012-02-01
US7528750B2 (en) 2009-05-05

Similar Documents

Publication Publication Date Title
EP2127385A1 (en) Entropy encoding and decoding apparatus and method based on tree structure
EP1826908A1 (en) Cabac-based encoding and decoding using an improved context model selection
CN102098508B (en) The coding of multimedia signature and decoding
KR101118089B1 (en) Apparatus and system for Variable Length Decoding
EP3550726A1 (en) Methods and devices for reducing sources in binary entropy coding and decoding
US20120093213A1 (en) Coding method, coding apparatus, coding program, and recording medium therefor
US10133551B1 (en) Content-aware compression of data using multiple parallel prediction functions
US20130082850A1 (en) Data encoding apparatus, data decoding apparatus and methods thereof
CN113381768B (en) Huffman correction coding method, system and related components
CN109981108B (en) Data compression method, decompression method, device and equipment
CN100493199C (en) Coding apparatus, coding method, and codebook
CN102547260B (en) Decoding method of adaptive variable length coding based on context and system thereof
CN110491398B (en) Encoding method, encoding device, and recording medium
US8055506B2 (en) Audio encoding and decoding apparatus and method using psychoacoustic frequency
CN116256025B (en) Aeration data monitoring system of ultra-filtration water device
CN104767997A (en) Video-oriented visual feature encoding method and device
RU2003109615A (en) DEVICE AND METHOD FOR DETECTING TURBO DECODER DATA TRANSFER
CN116827351A (en) Intelligent monitoring system for temperature of graphene heating wall surface
CN109417626A (en) The last coefficient of video compress based on adaptive transformation encodes
US20130246076A1 (en) Coding of strings
KR101541869B1 (en) Method for encoding and decoding using variable length coding and system thereof
JP5959474B2 (en) Encoding device, decoding device, method, and program
EP1251434A2 (en) Method and device for learning correlation matrix
US20160323603A1 (en) Method and apparatus for performing an arithmetic coding for data symbols
US20090063161A1 (en) Method and apparatus for encoding and decoding continuation sinusoidal signal of audio signal

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 08723225

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2009552581

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2008723225

Country of ref document: EP