What is a Discrete Memoryless Channel (DMC)? What do you mean by Information Rate?

SHORT QUESTIONS WITH ANSWERS
Q.1.     What is the meaning of the word ‘Information’?
Ans.    On a conceptual basis, the amount of information received from the knowledge of occurrence of an event may be related to the likelihood or probability of occurrence of that event. The message associated with the least likely event thus contains the maximum information. This amount of information in a message depends only upon the uncertainty of the underlying event rather than its actual content.
Q.2.     What are Information Source? Explain.
Ans.    An information source may be viewed as an object which produces an event, the outcome of which is selected at random according to a probability distribution. A practical source in a communication system is a device which produces messages, and it can be either analog or discrete. In this chapter, we deal mainly with discrete sources, since analog sources can be transformed to discrete sources through the use of sampling and quantization techniques, described in chapter 10. As a matter of fact, a discrete information source is a source which has only a finite set of symbols as possible outputs. The set of source symbols is called the source alphabet, and the elements of the set are called symbols or letters.
Information sources can be classified as having memory or being memoryless. A source with memory is one for which a current symbol depends on the previous symbols. A memoryless source is one for which each symbol produced is independent of the previous symbols.
A discrete memoryless source (DMS) can be characterized by the list of the symbols, the probability assignment to these symbols, and the specification of the rate of generating these symbols by the source.
Q.3.     Define the Information content of a symbol.
Ans.    The information content of a symbol xi, denoted by I(xi), is defined by
I(xi) = log2 [1 / P(xi)] = –log2 P(xi)
where P(xi) is the probability of occurrence of the symbol xi.
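As a quick illustration, this definition can be evaluated with a short Python sketch (not part of the original text; the function name info_content is our own):

import math

def info_content(p):
    # I(xi) = log2(1/P(xi)) = -log2 P(xi), measured in bits
    return -math.log2(p)

print(info_content(0.25))   # a symbol with P(xi) = 1/4 carries 2 bits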
Q.4.     What do you mean by Information Rate? Explain.
Ans.    If the time rate at which source X emits symbols is r (symbols/s), the information rate R of the source is given by
R = rH(X) b/s
where R is the information rate, H(X) is the entropy (average information) of the source, and r is the rate at which the symbols are generated.
The information rate R is thus expressed as the average number of information bits per second:
R = r (symbols/second) × H(X) (bits/symbol) = information bits/second
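For instance, the calculation can be sketched in Python as follows (the source distribution and symbol rate below are assumed example values, not taken from the text):

import math

def entropy(probs):
    # H(X) = -sum of P(xi) * log2 P(xi), in bits per symbol
    return -sum(p * math.log2(p) for p in probs if p > 0)

probs = [0.5, 0.25, 0.125, 0.125]   # assumed source distribution
r = 1000                            # assumed symbol rate, symbols/s
H = entropy(probs)                  # 1.75 b/symbol
R = r * H                           # information rate R = rH(X)
print(R)                            # 1750.0 b/s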
Q.5.     What is a Discrete Memoryless Channel (DMC)? Explain.
Ans.    A communication channel may be defined as the path or medium through which the symbols flow to the receiver end. A discrete memoryless channel (DMC) is a statistical model with an input X and an output Y, as shown in figure 9.19. During each unit of time (signaling interval), the channel accepts an input symbol from X, and in response it generates an output symbol from Y. The channel is said to be “discrete” when the alphabets of X and Y are both finite. Also, it is said to be “memoryless” when the current output depends only on the current input and not on any of the previous inputs.
[Figure 9.19: Discrete memoryless channel with input X and output Y]
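Because a DMC is completely specified by its transition probabilities P(yj | xi), the output distribution follows from P(yj) = Σi P(xi) P(yj | xi). A minimal Python sketch of this computation, using an assumed 2 × 2 channel matrix purely for illustration:

# Each row of the channel matrix lists P(yj | xi) and sums to 1.
channel = [[0.9, 0.1],   # transition probabilities from x1 (assumed)
           [0.2, 0.8]]   # transition probabilities from x2 (assumed)
p_x = [0.6, 0.4]         # assumed input distribution

# P(yj) = sum over i of P(xi) * P(yj | xi)
p_y = [sum(p_x[i] * channel[i][j] for i in range(len(p_x)))
       for j in range(len(channel[0]))]
print(p_y)   # [0.62, 0.38]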
QUESTIONS

  1. Explain the concept of information.
  2. What is Entropy? Explain.
  3. What is information rate?
  4. What are discrete memoryless channels (DMC)?
  5. What is source coding?
  6. Write short notes on the following:

(i)         Source-coding theorem,          (ii)  Kraft inequality

  7. Explain the Shannon-Fano algorithm.

PROBLEMS

  1. Consider a source X which produces five symbols with probabilities 1/2, 1/4, 1/8, 1/16, and 1/16. Find the source entropy H(X).     [Ans. 1.875 b/symbol]
  2. Calculate the average information content in the English language, assuming that each of the 26 characters in the alphabet occurs with equal probability.

[Ans. 4.7 b/character]
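This is simply the maximum entropy of a 26-symbol alphabet, H = log2 26; a one-line Python check:

import math
print(math.log2(26))   # ≈ 4.70 b/character for 26 equiprobable characters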

  3. Two BSCs are connected in cascade, as shown in figure 9.20.

[Figure 9.20: Cascade of two binary symmetric channels]
(i)         Find the channel matrix of the resultant channel.
(ii)        Find P(z1) and P(z2) if P(x1) = 0.6 and P(x2) = 0.4   (Karnataka University-1999)
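(For part (i), the channel matrix of the resultant channel is the matrix product of the two individual BSC matrices. The Python sketch below assumes a crossover probability p = 0.1 for both BSCs, since figure 9.20 is not reproduced here; with the actual value from the figure the same code applies.)

# Cascade of two BSCs: the overall channel matrix is the product
# of the two individual matrices. p = 0.1 is an assumed value.
p = 0.1
bsc = [[1 - p, p],
       [p, 1 - p]]

cascade = [[sum(bsc[i][k] * bsc[k][j] for k in range(2))
            for j in range(2)] for i in range(2)]

p_x = [0.6, 0.4]
p_z = [sum(p_x[i] * cascade[i][j] for i in range(2)) for j in range(2)]
print(cascade)   # [[0.82, 0.18], [0.18, 0.82]]
print(p_z)       # approximately [0.564, 0.436] with the assumed p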

  4. Consider the DMC shown in figure 9.21.

(i)         Find the output probabilities if P(x1) = 1/2 and P(x2) = P(x3) = 1/4.
(ii)        Find the output entropy H(Y).
[Ans. (i) P(y1) = 7/24, P(y2) = 17/48, and P(y3) = 17/48 (ii) 1.58 b/symbol]
[Figure 9.21: Discrete memoryless channel for problem 4]

  5. Verify that I(X; Y) = H(X) + H(Y) – H(X, Y).
  6. Show that H(X, Y) ≤ H(X) + H(Y), with equality if and only if X and Y are independent.
  7. Show that for a deterministic channel

H(Y | X) = 0
Hint:   Note that for a deterministic channel the transition probabilities P(yj | xi) are either 0 or 1.
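(These identities can be spot-checked numerically; the Python sketch below uses an assumed 2 × 2 joint distribution P(x, y), chosen only for illustration:)

import math

def H(probs):
    # Entropy of a distribution given as a flat list of probabilities
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Assumed joint distribution P(x, y)
p_xy = [[0.3, 0.2],
        [0.1, 0.4]]
p_x = [sum(row) for row in p_xy]                             # marginal of X
p_y = [sum(p_xy[i][j] for i in range(2)) for j in range(2)]  # marginal of Y

H_joint = H([p for row in p_xy for p in row])
I = H(p_x) + H(p_y) - H_joint               # I(X; Y) by the identity above
print(I >= 0, H_joint <= H(p_x) + H(p_y))   # True True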

  8. Consider a channel with an input X and an output Y. Show that if X and Y are statistically independent, then H(X | Y) = H(X) and I(X; Y) = 0.
  9. A channel is described by the following channel matrix:

(i)         Draw the channel diagram.
(ii)        Find the channel capacity.                      (Anna University, Chennai-2004)
[Ans. (i) See figure 9.15, (ii) 1 b/symbol]
[Figure 9.22: Channel diagram for problem 9]

  10. Let X be a random variable with probability density function fX(x), and let Y = aX + b, where a and b are constants. Find H(Y) in terms of H(X).

[Ans. H(Y) = H(X) + log2 |a|]

  11. Find the differential entropy H(X) of a Gaussian variable X with zero mean and variance σ².                                         (Anna University, Chennai-2004)

[Ans. H(X) = (1/2) log2 (2πeσ²)]

  12. Consider an AWGN channel described by

Y = X + n
where X and Y are the channel input and output, respectively, and n is additive white Gaussian noise with zero mean and variance σn². Find the average mutual information I(X; Y) when the channel input X is Gaussian with zero mean and variance σX².
[Ans. I(X; Y) = (1/2) log2 (1 + σX²/σn²)] (Pune University-1993)
 

  13. Calculate the capacity of an AWGN channel with a bandwidth of 1 MHz and an S/N ratio of 40 dB.

[Ans. 13.29 Mb/s]
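This follows from the Shannon-Hartley formula C = B log2(1 + S/N), with 40 dB corresponding to S/N = 10,000; a short Python check:

import math

B = 1e6                      # bandwidth, Hz
snr = 10 ** (40 / 10)        # 40 dB -> S/N = 10,000
C = B * math.log2(1 + snr)   # Shannon-Hartley capacity, b/s
print(C / 1e6)               # ≈ 13.29 Mb/s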
OBJECTIVE TYPE QUESTIONS
Multiple Choice Questions

  1. A binary symmetric channel (BSC) transmitting 1’s and 0’s with equal probability has an error rate of 10⁻². The channel transmission rate (in bit/symbol) will be

(a)        0.99                                            (b)     0.919
(c)        0.95                                            (d)    1

  2. A communication system is used to transmit one of 16 possible signals. Assume that transmission is accomplished by encoding the signals into binary digits. If each binary digit requires 1 μs for transmission, then how much information (in bits) is transmitted by the system in 8 μs?

(a)        8                                                 (b)     16
(c)        4                                                 (d)    2

  3. The baud rate when using binary transmission is

(a)        always equal to the bit rate
(b)        equal to twice the BW of an ideal channel
(c)        not equal to the signaling rate
(d)       equal to one half of the BW of an ideal channel

  4. A zero-memory source generates two messages with prob. 0.8 and 0.2. These are coded as 1 and 0. The code efficiency is

(a)        0.2                                              (b)     0.5
(c)        0.7                                              (d)    1.0

  5. A communication channel with AWGN has a BW of 4 kHz and an SNR of 15. Its channel capacity is

(a)        1.6 kbps                                      (b)     16 kbps
(c)        32 kbps                                       (d)    456 kbps

  6. In a communication system, each message (1 or 0) is transmitted 3 times in order to reduce the probability of error. The detection is based on the majority rule at the receiver. If Pe is the probability of bit error, the overall error probability for this communication system will be

(a)        3Pe²(1 – Pe) + Pe³                     (b)     1 – Pe
(c)        Pe³                                    (d)    Pe(1 – Pe)

  7. A source delivers symbols m1, m2, m3 and m4 with prob. 1/2, 1/4, 1/8 and 1/8 respectively. The entropy of the source is

(a)        1.7 bits/sec
(b)        1.75 bits/symbol
(c)        1.75 symbols
(d)       1.75 symbol/bit

  8. The channel capacity of a 5 kHz bandwidth binary system is

(a)        10,000 bits/sec                           (b)     5000 bits/sec
(c)        8000 bits/sec                              (d)    4000 bits/sec

  9. A source generates 4 messages. The entropy of the source will be maximum when

(a)        all the probabilities are equal
(b)        one of the probabilities equals 1 and the others are zero
(c)        the probabilities are 1/2, 1/4 and 1/2
(d)       two of the probabilities are 1/2 each and the others are zero

  10. A source generates 4 equiprobable symbols. If source coding is employed, the average length of the code for 100% efficiency is

(a)        6 bits/symbol                              (b)     4 bits/symbol
(c)        3 bits/symbol                              (d)    2 bits/symbol

  11. The entropy of a message source generating 4 messages with prob. 0.5, 0.25, 0.125 and 0.125 is

(a)        1 b/message                                (b)     1.75 b/message
(c)        3.32 b/message                           (d)    5.93 b/message

  12. The capacity of a communication channel with a bandwidth of 4 kHz and an SNR of 15 is approximately

(a)        20 kbps                                       (b)     16 kbps
(c)        10 kbps                                       (d)    8 kbps
Answers

  1. (a)     2. (a)     3. (a)     4. (a)
  5. (b)     6. (a)     7. (b)     8. (b)
  9. (a)    10. (d)    11. (b)    12. (b)
