GNDEC, Bidar
E&CE Dept.
Faculty: Namratha
Semester: V(A)
Subject Code              : 10EC55    IA Marks   : 25
Subject Name              : INFORMATION THEORY AND CODING
No. of Lecture Hrs./Week  : 05        Exam Hours : 03
Total No. of Lecture Hrs. : 52        Exam Marks : 100
PART - A

UNIT - 1
INFORMATION THEORY: Introduction, Measure of information, Average information content of symbols in long independent sequences, Average information content of symbols in long dependent sequences, Markoff statistical model for information sources, Entropy and information rate of Markoff sources. 7 Hours
UNIT - 2
SOURCE CODING: Encoding of the source output, Shannon's encoding algorithm. Communication channels, Discrete communication channels, Continuous channels. 6 Hours
UNIT - 3
FUNDAMENTAL LIMITS ON PERFORMANCE: Source coding theorem, Huffman coding, Discrete memoryless channels, Mutual information, Channel capacity. 7 Hours
UNIT - 4
Channel coding theorem, Differential entropy and mutual information for continuous ensembles, Channel capacity theorem. 6 Hours
PART - B

UNIT - 5
INTRODUCTION TO ERROR CONTROL CODING: Introduction, Types of errors, Examples, Types of codes. Linear block codes: Matrix description, Error detection and correction, Standard arrays and table look-up for decoding. 7 Hours
UNIT - 6
Binary cyclic codes, Algebraic structures of cyclic codes, Encoding using an (n-k)-bit shift register, Syndrome calculation, BCH codes. 6 Hours
UNIT - 7
RS codes, Golay codes, Shortened cyclic codes, Burst error correcting codes, Burst and random error correcting codes. 7 Hours
UNIT - 8
Convolutional codes, Time domain approach, Transform domain approach. 6 Hours
TEXT BOOKS:
1. Digital and Analog Communication Systems, K. Sam Shanmugam, John Wiley, 1996.
2. Digital Communication, Simon Haykin, John Wiley, 2003.

REFERENCE BOOKS:
1. Information Theory, Coding and Cryptography, Ranjan Bose, TMH, 2nd edition, 2007.
2. Digital Communications, Glover and Grant, Pearson Education, 2nd edition, 2008.
Assignment Questions
UNIT-1
1) Justify that the information content of a message is a logarithmic function of its probability of emission.
2) Derive an expression for the average information content (entropy) of long independent messages.
3) An analog signal band-limited to 4 kHz is sampled at twice the Nyquist rate and then quantized into 8 levels Q1, Q2, ..., Q8. Two of these levels occur with a probability of 1/4 each, two with a probability of 1/8 each, and the remaining four with a probability of 1/16 each. Find the information rate associated with the analog signal.
4) What is a Markoff source? Explain with an example.
5) Define:
a) Symbol rate
b) Self information
c) Zero memory source
d) Average self information (entropy)
e) Information rate
6) A binary source is emitting an independent sequence of 0's and 1's with probabilities P and (1-P) respectively. Plot the entropy of the source versus probability for 0 < P < 1 and state the conclusion.
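For question 6, the binary entropy function can be tabulated (rather than plotted) with a few lines; the values confirm the expected conclusion that H(P) peaks at 1 bit for P = 1/2 and falls symmetrically to 0 at P = 0 or 1.

```python
from math import log2

def binary_entropy(p: float) -> float:
    """Entropy of a binary source with P(1) = p, in bits/symbol."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

# H(p) is symmetric about p = 1/2, where it reaches its maximum of 1 bit.
for p in (0.1, 0.25, 0.5, 0.75, 0.9):
    print(p, round(binary_entropy(p), 4))
```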
7) In a facsimile transmission of a picture there are about 3.25 × 10^6 pixels per frame. For a good reproduction, 15 brightness levels are necessary. Assume all these levels are equally likely to occur. Find the rate of information transmission if one picture is to be transmitted every 3 minutes.
8) Derive an expression for the entropy of the n-th extension of a zero memory source.
9) Explain entropy and information rate of Markoff sources.
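The entropy rate of a Markoff source is the average of the per-state entropies weighted by the stationary state probabilities. A minimal sketch, assuming a hypothetical two-state transition matrix (not one given in these questions):

```python
from math import log2

def row_entropy(row):
    """Entropy of one state's outgoing transition probabilities."""
    return -sum(p * log2(p) for p in row if p > 0)

# Hypothetical two-state Markoff source: P[i][j] = Prob(next state j | state i).
P = [[0.9, 0.1],
     [0.4, 0.6]]

# Stationary distribution by power iteration (pi = pi @ P at convergence).
pi = [0.5, 0.5]
for _ in range(200):
    pi = [sum(pi[i] * P[i][j] for i in range(2)) for j in range(2)]

# Entropy rate: stationary-weighted average of the per-state entropies.
H = sum(pi[i] * row_entropy(P[i]) for i in range(2))
print([round(x, 3) for x in pi])  # stationary probabilities, here [0.8, 0.2]
print(round(H, 4))                # entropy rate in bits/symbol
```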
10) Consider a DMS with probabilities {0.7, 0.15, 0.15}.
a) Calculate the entropy of the source.
b) Calculate the entropy of the second-order extension of the source.
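Question 10 can be verified numerically: for a DMS the second-order extension treats every ordered pair of symbols as one super-symbol, and its entropy comes out to exactly twice the source entropy.

```python
from math import log2
from itertools import product

probs = [0.7, 0.15, 0.15]

def entropy(ps):
    return -sum(p * log2(p) for p in ps if p > 0)

H1 = entropy(probs)
# Second-order extension: all ordered pairs, P(a, b) = P(a) * P(b) for a DMS.
H2 = entropy([pa * pb for pa, pb in product(probs, repeat=2)])

print(round(H1, 4))   # ~1.1813 bits/symbol
print(round(H2, 4))   # twice H1, ~2.3626 bits per pair
```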
UNIT-2
1) Explain Shannon's encoding algorithm. Design an encoder using Shannon's encoding algorithm for a source having 5 symbols with probability statistics P = {1/8, 1/16, 3/16, 1/4, 3/8}. Find the coding efficiency and redundancy.
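A sketch of Shannon's encoding algorithm for question 1: sort the probabilities in decreasing order, give symbol k a code length l_k = ceil(log2(1/p_k)), and take the first l_k bits of the binary expansion of the cumulative probability alpha_k.

```python
from math import log2, ceil

def shannon_code(probs):
    """Shannon's encoding algorithm (binary)."""
    ps = sorted(probs, reverse=True)
    codes, alpha = [], 0.0
    for p in ps:
        l = ceil(-log2(p))          # code length l_k = ceil(log2(1/p_k))
        bits, frac = "", alpha      # first l bits of the binary fraction alpha_k
        for _ in range(l):
            frac *= 2
            bit, frac = divmod(frac, 1)
            bits += str(int(bit))
        codes.append(bits)
        alpha += p
    return ps, codes

ps, codes = shannon_code([1/8, 1/16, 3/16, 1/4, 3/8])
H = -sum(p * log2(p) for p in ps)        # source entropy
L = sum(p * len(c) for p, c in zip(ps, codes))  # average code length
print(codes)                  # ['00', '01', '101', '110', '1111']
print(round(100 * H / L, 2))  # coding efficiency in %, ~86.5
```

Redundancy is then simply 1 minus the efficiency.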
2) Explain, with a neat block diagram, the digital communication system, indicating the various types of communication channels.
3) State and prove Shannon's first theorem (noiseless coding theorem).
4) Find the minimum number of symbols r in the coding alphabet for devising an instantaneous code such that W = {0, 5, 0, 5, 5}. Devise such a code, where W represents the set of code words of lengths 1, 2, 3, ...
5) Another technique used in constructing a source encoder consists of arranging the messages in decreasing order of probability and dividing the messages into two almost equally probable groups. The messages in the first group are given the bit 0 and the messages in the second group are given the bit 1. The procedure is then applied again to each group separately, and continued until no further division is possible. Using this algorithm, find the code words for six messages occurring with probabilities 1/3, 1/3, 1/6, 1/12, 1/24, 1/24.
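The procedure described in question 5 is the Shannon-Fano algorithm. A minimal recursive sketch is below; note that when two split points are equally balanced the choice is a convention (this version takes the first minimal split), so a hand construction may produce a different but equally valid code.

```python
def shannon_fano(symbols):
    """Shannon-Fano: sort by decreasing probability, split into two groups of
    nearly equal total probability, assign 0/1, and recurse on each group."""
    syms = sorted(symbols, key=lambda s: -s[1])

    def assign(group, prefix):
        if len(group) == 1:
            return {group[0][0]: prefix or "0"}
        total, run, split = sum(p for _, p in group), 0.0, 1
        best = float("inf")
        for i in range(1, len(group)):       # find the most balanced split
            run += group[i - 1][1]
            diff = abs(total - 2 * run)
            if diff < best:
                best, split = diff, i
        codes = assign(group[:split], prefix + "0")
        codes.update(assign(group[split:], prefix + "1"))
        return codes

    return assign(syms, "")

codes = shannon_fano([("A", 1/3), ("B", 1/3), ("C", 1/6),
                      ("D", 1/12), ("E", 1/24), ("F", 1/24)])
print(codes)
```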
6) Derive the expressions for code efficiency and code redundancy.
7) Explain the steps in Shannon's encoding algorithm for generating binary codes.
8) Apply Shannon's encoding algorithm to generate binary codes for the messages given below, and obtain the code efficiency and code redundancy.
m1 - 1/8, m2 - 1/16, m3 - 3/16, m4 - 1/4, m5 - 3/8
9) Explain the steps involved in the Shannon-Fano encoding algorithm.
10) Given the probabilities 0.4, 0.2, 0.2, 0.1, 0.07, 0.03, construct a binary code by applying the Shannon-Fano encoding procedure.
UNIT-3 & 4
1) A source emits an independent sequence of symbols from an alphabet consisting of 5 symbols A, B, C, D and E with probabilities P = {0.4, 0.2, 0.2, 0.1, 0.1}. i) Determine the Huffman code by shifting the combined symbols as high as possible. ii) Find the coding efficiency and variance of both the codes.
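A heap-based Huffman sketch for question 1 is below. Its tie-breaking (by merge order) is not the "shift combined symbols as high as possible" convention, so individual code words may differ from a hand construction, but every optimal Huffman code for this source has the same average length of 2.2 bits.

```python
import heapq
from math import log2

def huffman(symbols):
    """Binary Huffman coding: repeatedly merge the two least probable nodes."""
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(symbols)]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)
        p1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c0.items()}   # prepend branch bits
        merged.update({s: "1" + c for s, c in c1.items()})
        heapq.heappush(heap, (p0 + p1, count, merged))
        count += 1
    return heap[0][2]

src = [("A", 0.4), ("B", 0.2), ("C", 0.2), ("D", 0.1), ("E", 0.1)]
codes = huffman(src)
H = -sum(p * log2(p) for _, p in src)
L = sum(p * len(codes[s]) for s, p in src)
print(codes)
print(round(L, 2), round(100 * H / L, 1))   # average length 2.2 bits
```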
2) The input to a channel consists of 5 letters X = {x1, x2, x3, x4, x5} and the output consists of 4 letters Y = {y1, y2, y3, y4}. The JPM of this channel is given below.

           y1     y2     y3     y4
    x1    0.25   0      0      0
    x2    0.1    0.3    0      0
    x3    0      0.05   0.1    0
    x4    0      0      0.05   0.1
    x5    0      0      0.05   0

i) Compute H(X), H(Y), H(X,Y), H(Y/X) and H(X/Y).
ii) Find the rate of data transmission and the mutual information.
iii) Find the channel capacity, channel efficiency and redundancy.
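The entropies in part i) all follow from the joint probability matrix: the marginals give H(X) and H(Y), the full matrix gives H(X,Y), and the conditional entropies and mutual information are differences of these. A sketch over the JPM from the question:

```python
from math import log2

# Joint probability matrix P(x_i, y_j) from the question.
P = [[0.25, 0,    0,    0   ],
     [0.10, 0.30, 0,    0   ],
     [0,    0.05, 0.10, 0   ],
     [0,    0,    0.05, 0.10],
     [0,    0,    0.05, 0   ]]

def H(ps):
    return -sum(p * log2(p) for p in ps if p > 0)

px = [sum(row) for row in P]          # marginal P(x): row sums
py = [sum(col) for col in zip(*P)]    # marginal P(y): column sums
Hx, Hy = H(px), H(py)
Hxy = H([p for row in P for p in row])

print(round(Hx, 4), round(Hy, 4), round(Hxy, 4))
print(round(Hxy - Hx, 4))        # H(Y/X) = H(X,Y) - H(X)
print(round(Hxy - Hy, 4))        # H(X/Y) = H(X,Y) - H(Y)
print(round(Hx + Hy - Hxy, 4))   # mutual information I(X;Y)
```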
3) Discuss the various properties of mutual information.
4) A binary symmetric channel has the following noise matrix, with source probabilities P(x1) = 2/3 and P(x2) = 1/3.

    P(Y/X) = | 3/4  1/4 |
             | 1/4  3/4 |

i) Determine H(X), H(Y), H(X,Y), H(Y/X), H(X/Y) and I(X,Y).
ii) Find the channel capacity C.
iii) Find the channel efficiency and redundancy.
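For the BSC in question 4, the capacity has the closed form C = 1 - H(p), where p is the crossover probability and H is the binary entropy function; the mutual information for the given (non-uniform) source must come out below C. A numerical check:

```python
from math import log2

def Hb(p):
    """Binary entropy function in bits."""
    return -p * log2(p) - (1 - p) * log2(1 - p)

p = 1/4                 # crossover probability of the BSC
px = [2/3, 1/3]         # source probabilities from the question

# Joint P(x, y) for the noise matrix [[3/4, 1/4], [1/4, 3/4]].
joint = [[px[0] * (1 - p), px[0] * p],
         [px[1] * p, px[1] * (1 - p)]]
py = [joint[0][0] + joint[1][0], joint[0][1] + joint[1][1]]

Hy = Hb(py[0])
Hy_given_x = Hb(p)      # the same for every input symbol of a BSC
I = Hy - Hy_given_x     # mutual information I(X;Y) for this source
C = 1 - Hb(p)           # capacity, achieved with equiprobable inputs

print(round(I, 4))      # ~0.1686 bits/symbol
print(round(C, 4))      # ~0.1887 bits/symbol
```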
5) A transmitter transmits five symbols with probabilities 0.2, 0.3, 0.2, 0.1 and 0.2. Given the channel matrix P(B/A) below, calculate i) H(B) and ii) H(A,B).

    P(B/A) = | 1     0     0     0   |
             | 1/4   3/4   0     0   |
             | 0     1/3   2/3   0   |
             | 0     0     1/3   2/3 |
             | 0     0     1     0   |
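For question 5, the joint matrix is obtained by scaling each row of P(B/A) by the corresponding source probability; H(B) then comes from its column sums and H(A,B) from all its entries. A sketch:

```python
from math import log2

pA = [0.2, 0.3, 0.2, 0.1, 0.2]        # source probabilities
PBA = [[1,   0,   0,   0  ],          # channel matrix P(B/A) from the question
       [1/4, 3/4, 0,   0  ],
       [0,   1/3, 2/3, 0  ],
       [0,   0,   1/3, 2/3],
       [0,   0,   1,   0  ]]

def H(ps):
    return -sum(p * log2(p) for p in ps if p > 0)

# Joint P(a_i, b_j) = P(a_i) * P(b_j / a_i).
joint = [[pA[i] * PBA[i][j] for j in range(4)] for i in range(5)]
pB = [sum(col) for col in zip(*joint)]

print(round(H(pB), 4))                                 # H(B), ~1.8218
print(round(H([p for row in joint for p in row]), 4))  # H(A,B), ~2.7653
```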
6) Define mutual information and explain its properties.
7) Explain the rate of information transmission over a discrete channel.
8) A zero memory source has probabilities P = {0.4, 0.2, 0.1, 0.1, 0.1, 0.05, 0.05}. Construct a binary Huffman code for it.
9) State and discuss Shannon's theorem on channel capacity.
10) Define the binary erasure channel and obtain an expression for its channel capacity.
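The binary erasure channel of question 10 has capacity C = 1 - p, where p is the erasure probability. The sketch below checks this numerically, using an example value p = 0.2 (an assumption, not from the question), by computing I(X;Y) directly for uniform inputs, which achieve capacity for the BEC.

```python
from math import log2

def mutual_info(px, T):
    """I(X;Y) for input distribution px and transition matrix T[x][y]."""
    joint = [[px[i] * T[i][j] for j in range(len(T[0]))]
             for i in range(len(px))]
    py = [sum(col) for col in zip(*joint)]
    return sum(p * log2(p / (px[i] * py[j]))
               for i, row in enumerate(joint)
               for j, p in enumerate(row) if p > 0)

p = 0.2                          # erasure probability (example value)
T = [[1 - p, p, 0],              # outputs: 0, erasure, 1
     [0, p, 1 - p]]
I = mutual_info([0.5, 0.5], T)   # uniform input maximises I for the BEC
print(round(I, 4))               # equals C = 1 - p = 0.8
```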
UNIT-5
1) Explain the matrix representation of linear block codes.
2) Design an (n,k) Hamming code with a minimum distance of dmin = 3 and a message length of 4 bits.
3) Design a (4,2) linear block code:
i) Find the generator matrix for the code vector set.
ii) Find the parity check matrix.
iii) Choose the code vectors to be in systematic form, with the goal of maximizing dmin.
4) If C is a valid code vector C = DG, then prove that CH^T = 0.
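The identity in question 4 can be checked exhaustively over GF(2). The sketch below uses a systematic (7,4) code whose parity submatrix P is one common Hamming-code choice (an assumption for illustration, not prescribed by the question): with G = [I | P] and H = [P^T | I], every codeword C = DG gives syndrome C·H^T = 0.

```python
from itertools import product

# One common (7,4) Hamming parity submatrix (illustrative assumption).
P = [[1, 1, 0],
     [0, 1, 1],
     [1, 1, 1],
     [1, 0, 1]]

def identity(n):
    return [[int(i == j) for j in range(n)] for i in range(n)]

def matmul2(A, B):
    """Matrix product over GF(2)."""
    return [[sum(a * b for a, b in zip(row, col)) % 2
             for col in zip(*B)] for row in A]

G = [row + p for row, p in zip(identity(4), P)]              # G = [I | P]
H = [list(col) + e for col, e in zip(zip(*P), identity(3))]  # H = [P^T | I]

# Every codeword C = D.G satisfies C.H^T = 0.
for D in product([0, 1], repeat=4):
    C = matmul2([list(D)], G)
    S = matmul2(C, [list(r) for r in zip(*H)])               # syndrome C.H^T
    assert S == [[0, 0, 0]]
print("all 16 codewords give zero syndrome")
```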
5) Define Hamming weight, Hamming distance and minimum distance.
6) Design a linear block code with a minimum distance of three and a message block size of 8 bits.
7) Why do we need error control coding? What are the types of errors and the types of codes to combat them?
8) Distinguish between block codes and convolutional codes.
9) A (6,3) code has the following check bits:
c4 = d1 + d2
c5 = d1 + d3
c6 = d2 + d3
i) Write the G and H matrices.
ii) Construct the encoder and error correcting circuits.
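For part i) of question 9, the parity submatrix P follows directly from the given check equations (row i of P records which check bits data bit d_i feeds), giving G = [I | P] and H = [P^T | I]. A sketch that builds both and confirms the code corrects single-bit errors (dmin = 3):

```python
from itertools import product

# Check bits from the question: c4 = d1+d2, c5 = d1+d3, c6 = d2+d3 (mod 2).
P = [[1, 1, 0],   # d1 appears in c4 and c5
     [1, 0, 1],   # d2 appears in c4 and c6
     [0, 1, 1]]   # d3 appears in c5 and c6

I3 = [[int(i == j) for j in range(3)] for i in range(3)]
G = [row + p for row, p in zip(I3, P)]               # G = [I | P]
H = [list(col) + e for col, e in zip(zip(*P), I3)]   # H = [P^T | I]

def encode(d):
    """Codeword d.G over GF(2)."""
    return [sum(di * gi for di, gi in zip(d, col)) % 2 for col in zip(*G)]

codewords = [encode(d) for d in product([0, 1], repeat=3)]
dmin = min(sum(c) for c in codewords if any(c))   # minimum nonzero weight
print(G)
print(H)
print(dmin)   # 3: the code can correct any single-bit error
```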
10) Consider a systematic (7,4) linear block code for which the syndrome digits are given by:
s1 = r1 + r4 + r6 + r7
s2 = r2 + r4 + r5 + r6
s3 = r3 + r5 + r6 + r7
i) Write the G and H matrices.
ii) Construct the encoder and syndrome circuits.