Instantaneous Codes, Optimal Codes, and the Kraft Inequality in Digital Communication

9.19.3.6. Instantaneous Codes
            A uniquely decodable code is called an instantaneous code if the end of any codeword is recognizable without examining subsequent code symbols. Instantaneous codes have the property, mentioned earlier, that no codeword is a prefix of any other codeword. For this reason, prefix-free codes are sometimes known as instantaneous codes.
9.19.3.7. Optimal Codes
            A code is said to be optimal if it is instantaneous and has the minimum average codeword length L for a given source with a given probability assignment for the source symbols.
9.19.4. The Kraft Inequality
            Let X be a DMS with alphabet {xi}, i = 1, 2, …, m. Assume that the length of the binary codeword assigned to xi is ni.
A necessary and sufficient condition for the existence of an instantaneous binary code is
K = \sum_{i=1}^{m} 2^{-n_i} \le 1                …(9.56)
which is known as the Kraft inequality.
NOTE: The Kraft inequality assures us of the existence of an instantaneously decodable code with codeword lengths that satisfy the inequality. However, it does not show us how to obtain these codewords, nor does it say that any code satisfying the inequality is automatically uniquely decodable.
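Since the Kraft sum is trivial to evaluate, candidate codeword lengths can be tested directly. The following minimal Python sketch (an illustration added here, with function names of our own choosing) computes K for a list of binary codeword lengths and checks it against the bound in (9.56).

```python
# Minimal sketch (illustration, not from the text): testing candidate
# binary codeword lengths against the Kraft inequality (9.56).

def kraft_sum(lengths):
    """Return K = sum of 2**(-n_i) over the given binary codeword lengths."""
    return sum(2.0 ** -n for n in lengths)

def satisfies_kraft(lengths):
    """True if an instantaneous binary code with these lengths can exist."""
    return kraft_sum(lengths) <= 1.0

print(satisfies_kraft([1, 2, 3, 3]))  # True:  K = 0.5 + 0.25 + 0.125 + 0.125 = 1
print(satisfies_kraft([1, 1, 2]))     # False: K = 0.5 + 0.5 + 0.25 = 1.25 > 1
```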
EXAMPLE 9.37. Given a DMS X with two symbols x1 and x2, and P(x1) = 0.9, P(x2) = 0.1. Symbols x1 and x2 are encoded as shown in Table 9.3.
Table 9.3.

xi      P(xi)     Code
x1      0.9       0
x2      0.1       1

            Find the efficiency η and the redundancy γ of this code.
Solution: We know that the average code length L per symbol is
L = \sum_{i=1}^{2} P(x_i) n_i = 0.9(1) + 0.1(1) = 1 b/symbol
We also know that the source entropy is
H(X) = -\sum_{i=1}^{2} P(x_i) \log_2 P(x_i) = -0.9 \log_2 0.9 - 0.1 \log_2 0.1 = 0.469 b/symbol
Also, the code efficiency η is
η = H(X)/L = 0.469/1 = 0.469 = 46.9%
And, the code redundancy γ is given by
γ = 1 – η = 0.531 = 53.1%                   Ans.
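These figures are easy to reproduce numerically. The short Python sketch below (an illustration, assuming nothing beyond the data of Table 9.3) recomputes H(X), L, η, and γ.

```python
# Sketch reproducing Example 9.37: entropy, average length,
# efficiency, and redundancy for the code of Table 9.3.
import math

probs   = [0.9, 0.1]   # P(x1), P(x2)
lengths = [1, 1]       # codewords "0" and "1"

H = -sum(p * math.log2(p) for p in probs)        # ~0.469 b/symbol
L = sum(p * n for p, n in zip(probs, lengths))   # 1 b/symbol
eta = H / L                                      # ~0.469
print(f"H(X) = {H:.3f} b/symbol, L = {L:.1f} b/symbol")
print(f"efficiency = {eta:.1%}, redundancy = {1 - eta:.1%}")
```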
EXAMPLE 9.38. The second-order extension of a DMS X, denoted by X², is formed by taking the source symbols two at a time. The coding of this extension is shown in Table 9.4. Find the efficiency η and the redundancy γ of this extension code.
Solution:
Table 9.4.

ai            P(ai)     Code
a1 = x1x1     0.81      0
a2 = x1x2     0.09      10
a3 = x2x1     0.09      110
a4 = x2x2     0.01      111

We have
L = \sum_{i=1}^{4} P(a_i) n_i = 0.81(1) + 0.09(2) + 0.09(3) + 0.01(3) = 1.29 b/symbol
The entropy of the second-order extension of X, H(X²), is given by
H(X^2) = -\sum_{i=1}^{4} P(a_i) \log_2 P(a_i) = -0.81 \log_2 0.81 - 0.09 \log_2 0.09 - 0.09 \log_2 0.09 - 0.01 \log_2 0.01
or         H(X²) = 0.938 b/symbol
Therefore, the code efficiency η is
η = H(X²)/L = 0.938/1.29 = 0.727 = 72.7%
Also, the code redundancy γ will be
γ = 1 – η = 0.273 = 27.3%       Ans.
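The same computation applies to the extension alphabet. The sketch below (again an illustration, using only the data of Table 9.4) reproduces these figures and shows how pairing symbols raises the efficiency from 46.9% to 72.7%.

```python
# Sketch reproducing Example 9.38: efficiency of the second-order
# extension X^2, whose symbols are pairs of source symbols.
import math

probs   = [0.81, 0.09, 0.09, 0.01]  # P(a1)..P(a4)
lengths = [1, 2, 3, 3]              # codes 0, 10, 110, 111

H2 = -sum(p * math.log2(p) for p in probs)        # ~0.938 b/pair
L2 = sum(p * n for p, n in zip(probs, lengths))   # 1.29 b/pair
print(f"H(X^2) = {H2:.3f}, L = {L2:.2f}, efficiency = {H2 / L2:.1%}")
```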
EXAMPLE 9.39. Consider a DMS X with symbols xi, i = 1, 2, 3, 4. Table 9.5 lists four possible binary codes.
Table 9.5.

xi      Code A    Code B    Code C    Code D
x1      00        0         0         0
x2      01        00        10        100
x3      10        11        11        110
x4      11        110       110       111

            (i)         Show that all the codes except code B satisfy the Kraft inequality.
            (ii)        Show that codes A and D are uniquely decodable but codes B and C are not uniquely decodable.
Solution: We have
(i)         For code A,                             n1 = n2 = n3 = n4 = 2
therefore,                                             K = \sum_{i=1}^{4} 2^{-n_i} = 4(2^{-2}) = 1 \le 1
For code B,                                         n1 = 1, n2 = n3 = 2, n4 = 3
therefore,                                             K = 2^{-1} + 2(2^{-2}) + 2^{-3} = 1.125 > 1
For code C,                                         n1 = 1, n2 = 2, n3 = n4 = 3
therefore,                                             K = 2^{-1} + 2^{-2} + 2(2^{-3}) = 1 \le 1
For code D,                                         n1 = 1, n2 = n3 = n4 = 3
therefore,                                             K = 2^{-1} + 3(2^{-3}) = 0.875 \le 1
Thus, all codes except code B satisfy the Kraft inequality.
(ii)        Codes A and D are prefix-free codes; they are therefore uniquely decodable. Code B does not satisfy the Kraft inequality, and it is not uniquely decodable. Although code C satisfies the Kraft inequality, it is not uniquely decodable. This can be seen from the following example:
Consider the binary sequence 0110110. Under code C, this sequence may correspond to the source sequence x1x3x1x4 or to x1x4x4.
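This ambiguity can also be exposed mechanically. The sketch below (an illustration; the helper all_parses is our own) enumerates every way the string 0110110 can be split into codewords of code C.

```python
# Sketch (illustration, not from the text): brute-force search for every
# decomposition of a bit string into codewords, exposing code C's ambiguity.

def all_parses(bits, code):
    """Yield every way `bits` splits into codewords of `code`."""
    if not bits:
        yield []
        return
    for symbol, word in code.items():
        if bits.startswith(word):
            for rest in all_parses(bits[len(word):], code):
                yield [symbol] + rest

code_c = {"x1": "0", "x2": "10", "x3": "11", "x4": "110"}
for parse in all_parses("0110110", code_c):
    print(parse)
# Several distinct parses are printed, including ['x1','x3','x1','x4']
# and ['x1','x4','x4'], so code C is not uniquely decodable.
```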
EXAMPLE 9.40. Verify the following expression:
L \ge H(X)
where L is the average codeword length per symbol and H(X) is the source entropy.                                                                                                  (GATE Examination, 2000)
Solution: We know that
\sum_{i=1}^{m} P_i \log_2 \frac{Q_i}{P_i} \le 0
where the equality holds only if Qi = Pi.
Let                                                       Q_i = \frac{2^{-n_i}}{K}                …(i)
where                                                  K = \sum_{i=1}^{m} 2^{-n_i}                …(ii)
Now, we have
\sum_{i=1}^{m} Q_i = \frac{1}{K} \sum_{i=1}^{m} 2^{-n_i} = 1
and
\sum_{i=1}^{m} P_i \log_2 \frac{Q_i}{P_i} = -\sum_{i=1}^{m} P_i n_i - \log_2 K \sum_{i=1}^{m} P_i - \sum_{i=1}^{m} P_i \log_2 P_i
= H(X) - L - \log_2 K \le 0
From the Kraft inequality (9.56), K \le 1, we have
\log_2 K \le 0
Thus,                                       H(X) - L \le \log_2 K \le 0
or                                             L \ge H(X)
The equality holds when K = 1 and Pi = Qi.                   Hence proved.
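The key inequality in this proof, H(X) − L ≤ log2 K, holds for any probability assignment and any codeword lengths; K ≤ 1 is what then forces L ≥ H(X). A purely illustrative sketch that spot-checks this on randomly drawn examples:

```python
# Sketch checking the key step of Example 9.40 numerically:
# H(X) - L <= log2 K for arbitrary probabilities and lengths.
import math
import random

random.seed(1)
for _ in range(5):
    raw = [random.random() for _ in range(4)]
    P = [r / sum(raw) for r in raw]                # random distribution
    n = [random.randint(1, 4) for _ in range(4)]   # arbitrary lengths
    H = -sum(p * math.log2(p) for p in P)
    L = sum(p * k for p, k in zip(P, n))
    K = sum(2.0 ** -k for k in n)
    assert H - L <= math.log2(K) + 1e-9            # inequality from the proof
print("H(X) - L <= log2 K held in every trial")
```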
EXAMPLE 9.41. Let X be a DMS with symbols xi and corresponding probabilities P(xi) = Pi, i = 1, 2, …, m. Show that for the optimum source encoding we require that
K = \sum_{i=1}^{m} 2^{-n_i} = 1                …(i)
            and                                          n_i = \log_2 \frac{1}{P_i} = I_i                …(ii)
            where ni is the length of the codeword corresponding to xi, and Ii is the information content of xi.
Solution: From the result of the last problem, the optimum source encoding with L = H(X) requires K = 1 and Pi = Qi. Thus, using equations (ii) and (i) of the last problem, we have
K = \sum_{i=1}^{m} 2^{-n_i} = 1
and                                          P_i = Q_i = 2^{-n_i}
Hence                                      n_i = -\log_2 P_i = \log_2 \frac{1}{P_i} = I_i                   Hence proved.
NOTE: Note that equation (ii) implies the following commonsense principle:
Symbols which occur with high probability should be assigned shorter codewords than symbols which occur with low probability.
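When every Pi is a negative integer power of two, condition (ii) can be met exactly with integer lengths. The sketch below (an illustrative example distribution of our own choosing) verifies that ni = −log2 Pi then gives K = 1 and L = H(X).

```python
# Sketch of the optimality condition of Example 9.41: for dyadic
# probabilities, n_i = -log2(P_i) gives K = 1 and L = H(X).
import math

P = [0.5, 0.25, 0.125, 0.125]
n = [int(-math.log2(p)) for p in P]            # [1, 2, 3, 3]
L = sum(p * k for p, k in zip(P, n))           # 1.75
H = -sum(p * math.log2(p) for p in P)          # 1.75
K = sum(2.0 ** -k for k in n)                  # 1.0
print(f"lengths = {n}, L = {L}, H(X) = {H}, K = {K}")
```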
EXAMPLE 9.42. Consider a DMS X with symbols xi and corresponding probabilities P(xi) = Pi, i = 1, 2, …, m. Let ni be the length of the codeword of xi such that
\log_2 \frac{1}{P_i} \le n_i \le \log_2 \frac{1}{P_i} + 1                …(i)
            Show that this relationship satisfies the Kraft inequality (9.56), and find the bound on K in equation (9.56).
Solution: Equation (i) can be rewritten as
-\log_2 P_i \le n_i \le -\log_2 P_i + 1                …(ii)
or                                 \log_2 P_i \ge -n_i \ge \log_2 P_i - 1
Then                            2^{\log_2 P_i} \ge 2^{-n_i} \ge 2^{\log_2 P_i} \cdot 2^{-1}
or                                 P_i \ge 2^{-n_i} \ge \frac{1}{2} P_i
Thus                            \sum_{i=1}^{m} P_i \ge \sum_{i=1}^{m} 2^{-n_i} \ge \frac{1}{2} \sum_{i=1}^{m} P_i
or                                 1 \ge K \ge \frac{1}{2}
which indicates that the Kraft inequality (9.56) is satisfied, and the bound on K will be
\frac{1}{2} \le K \le 1                     Hence proved.
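Choosing each ni as the smallest integer satisfying (i), that is ni = ⌈−log2 Pi⌉, always meets this relation. The sketch below (with an example distribution of our own choosing) confirms that the resulting K falls inside the bound just derived.

```python
# Sketch for Example 9.42: lengths n_i = ceil(-log2 P_i)
# always land K between 1/2 and 1.
import math

P = [0.4, 0.3, 0.2, 0.1]                       # example distribution
n = [math.ceil(-math.log2(p)) for p in P]      # [2, 2, 3, 4]
K = sum(2.0 ** -k for k in n)                  # 0.6875
print(f"lengths = {n}, K = {K}, bound holds: {0.5 <= K <= 1.0}")
```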
EXAMPLE 9.43. Consider a DMS X with symbols xi and corresponding probabilities P(xi) = Pi, i = 1, 2, …, m. Show that a code constructed in agreement with equation (i) in the last problem will satisfy the following relation:
H(X) \le L \le H(X) + 1
            where H(X) is the source entropy and L is the average codeword length.
Solution: Multiplying equation (ii) in the last problem by Pi and summing over i yields
-\sum_{i=1}^{m} P_i \log_2 P_i \le \sum_{i=1}^{m} P_i n_i \le -\sum_{i=1}^{m} P_i \log_2 P_i + \sum_{i=1}^{m} P_i
Now,                                       H(X) = -\sum_{i=1}^{m} P_i \log_2 P_i, \quad L = \sum_{i=1}^{m} P_i n_i, \quad \sum_{i=1}^{m} P_i = 1
Thus, the above inequality reduces to
H(X) \le L \le H(X) + 1              Hence proved.
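The same length assignment illustrates this bound numerically. Reusing the example distribution from the previous sketch, L lands between H(X) and H(X) + 1:

```python
# Sketch for Example 9.43: with n_i = ceil(-log2 P_i), the average
# codeword length satisfies H(X) <= L <= H(X) + 1.
import math

P = [0.4, 0.3, 0.2, 0.1]
n = [math.ceil(-math.log2(p)) for p in P]      # [2, 2, 3, 4]
H = -sum(p * math.log2(p) for p in P)          # ~1.846
L = sum(p * k for p, k in zip(P, n))           # 2.4
print(f"H(X) = {H:.3f} <= L = {L:.1f} <= H(X) + 1 = {H + 1:.3f}")
```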