Source Coding
1- Sampling theorem:
Sampling is the fundamental operation in digital communication. A continuous-time signal is first converted to a discrete-time signal by the sampling process, and it should be possible to recover or reconstruct the signal completely from its samples.
The sampling theorem states that:
i- A band-limited signal of finite energy, which has no frequency components higher than W Hz, is completely described by specifying the values of the signal at instants of time separated by 1/2W seconds, and
ii- A band-limited signal of finite energy, which has no frequency components higher than W Hz, may be completely recovered from the knowledge of its samples taken at the rate of 2W samples per second.
Proof of the sampling theorem: Let x(t) be the continuous-time signal shown in the figure below; its spectrum X(f) contains no frequency components higher than W Hz. A sampling function samples this signal regularly at the rate of f_s samples per second.
[Figure: the analog waveform x(t) and its spectrum X(f)]
2016-09-23
DR. MAHMOOD
Assume the analog waveform x(t) is multiplied by a periodic impulse train defined as

\delta_{T_s}(t) = \sum_{n=-\infty}^{\infty} \delta(t - nT_s)

The sampled waveform is then

x_s(t) = x(t)\,\delta_{T_s}(t) = \sum_{n=-\infty}^{\infty} x(t)\,\delta(t - nT_s) = \sum_{n=-\infty}^{\infty} x(nT_s)\,\delta(t - nT_s)

The Fourier transform of the periodic impulse train is itself an impulse train in frequency:

\Delta(f) = \frac{1}{T_s} \sum_{n=-\infty}^{\infty} \delta(f - n f_s)
To find the spectrum of the sampled waveform, use the shifting property of the impulse under convolution:

X(f) * \delta(f - n f_s) = X(f - n f_s)

So that

X_s(f) = X(f) * \Delta(f) = X(f) * \left[ \frac{1}{T_s} \sum_{n=-\infty}^{\infty} \delta(f - n f_s) \right] = \frac{1}{T_s} \sum_{n=-\infty}^{\infty} X(f - n f_s)

Thus X_s(f) consists of replicas of X(f) centered at every multiple of f_s. When f_s = 2 f_m the adjacent replicas just touch; when f_s < 2 f_m the replicas overlap and the original spectrum cannot be recovered (aliasing).
The maximum allowable sampling interval is T_s = \frac{1}{2 f_m} sec. The minimum sampling rate f_s = 2 f_m = \frac{1}{T_s} is called the Nyquist rate, and the corresponding interval T_s is called the Nyquist interval.
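Because the spectrum replicates every f_s Hz, a tone sampled below its Nyquist rate reappears folded into the band [0, f_s/2]. A minimal Python sketch of this folding (the helper name `alias_frequency` is ours, not from the notes):

```python
def alias_frequency(f0, fs):
    """Apparent frequency (Hz) of a sinusoid of frequency f0 after sampling
    at rate fs: the spectrum repeats every fs, so the observed tone is the
    replica that falls inside [0, fs/2]."""
    f = f0 % fs            # shift into the first replica band [0, fs)
    return min(f, fs - f)  # fold into [0, fs/2]

# A 250 Hz tone sampled above its Nyquist rate (500 samples/s) is preserved,
# but sampled below it the tone aliases to a lower frequency.
print(alias_frequency(250, 600))   # 250.0 -> no aliasing
print(alias_frequency(250, 400))   # 150.0 -> aliased
```

This illustrates why the examples below always look for the highest frequency component before choosing f_s.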
Example: Find the Nyquist rate and Nyquist interval for the following signals.

i- m(t) = \frac{\sin(500 \pi t)}{t}

ii- m(t) = \frac{1}{2} \cos(4000 \pi t) \cos(1000 \pi t)

Solution:
i- \omega t = 500 \pi t \Rightarrow 2\pi f = 500\pi \Rightarrow f = 250 Hz
Nyquist rate = 2 f_{max} = 500 Hz
Nyquist interval = \frac{1}{2 f_{max}} = \frac{1}{2 \times 250} = 2 msec.
ii- \frac{1}{2} \cos(4000 \pi t) \cos(1000 \pi t) = \frac{1}{4} \{ \cos(3000 \pi t) + \cos(5000 \pi t) \}
The highest frequency component satisfies 2\pi f = 5000\pi, so f_{max} = 2500 Hz.
Nyquist rate = 2 f_{max} = 5000 Hz and Nyquist interval = \frac{1}{5000} = 0.2 msec.
H.W:
Find the Nyquist interval and Nyquist rate for the following:
i- \frac{1}{2} \cos(400 \pi t) \cos(200 \pi t)
ii- \frac{\sin t}{t}
Example: The waveform [20 + 20 sin(500t + 30°)] is to be sampled periodically. Find the maximum allowable sampling interval and the number of samples taken in one second.
Solution: \omega = 500 rad/s \Rightarrow f_{max} = \frac{500}{2\pi} = 79.58 Hz
f_{s(min)} = 2 f_{max} = 159.15 samples/sec
Maximum sampling interval T_s = \frac{1}{f_{s(min)}} = \frac{1}{159.15} = 6.283 msec.
Number of samples in 1 sec = \frac{1}{6.283 \text{ msec}} = 159.16 \approx 160 samples
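The arithmetic of this example can be checked numerically; a small Python sketch (variable names are ours):

```python
import math

# The waveform 20 + 20*sin(500t + 30deg) has angular frequency w = 500 rad/s.
f_max = 500 / (2 * math.pi)   # highest frequency component, Hz
fs_min = 2 * f_max            # Nyquist rate, samples per second
Ts_max = 1 / fs_min           # maximum allowable sampling interval, seconds

print(round(f_max, 2))        # 79.58  (Hz)
print(round(fs_min, 2))       # 159.15 (samples/s)
print(round(Ts_max * 1e3, 3)) # 6.283  (msec)
print(math.ceil(fs_min))      # 160 samples in one second
```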
2- Source coding:
An important problem in communications is the efficient representation of the data generated by a discrete source; this is the task of source coding. For a source with m symbols y_j of probabilities P(y_j), the entropy is

H(Y) = -\sum_{j=1}^{m} P(y_j) \log_2 P(y_j)

We have:

\max[H(Y)] = \log_2 m

attained when all symbols are equiprobable, so the source efficiency is

\eta_s = \frac{H(Y)}{\max[H(Y)]} \times 100\%

If symbol y_j is coded with a codeword of length l_j, the average codeword length is

L = \sum_{j=1}^{m} P(y_j)\, l_j \quad \text{bits/symbol}

and the code efficiency is

\eta = \frac{H(Y)}{L} \times 100\%
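These definitions translate directly into code; a minimal Python sketch (the function names `entropy`, `avg_length`, and `efficiency` are our own):

```python
import math

def entropy(probs, base=2):
    """H = -sum p*log_base(p): bits for base 2, ternary units for base 3."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

def avg_length(probs, lengths):
    """Average codeword length L = sum p_j * l_j."""
    return sum(p * l for p, l in zip(probs, lengths))

def efficiency(probs, lengths, base=2):
    """Code efficiency H/L as a percentage."""
    return 100 * entropy(probs, base) / avg_length(probs, lengths)

# Four equiprobable symbols coded with 2 bits each:
# H = log2(4) = 2 bits and L = 2 bits, so the efficiency is 100%.
print(efficiency([0.25] * 4, [2, 2, 2, 2]))  # 100.0
```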
Note that for fixed-length coding of n equiprobable messages:
1- l_1 = l_2 = l_3 = ... = l_n = L_C = \log_2 n bits/message if n = 2^r (n = 2, 4, 8, 16, ...), giving an efficiency of 100%.
2- L_C = Int[\log_2 n] + 1 bits/message if n \neq 2^r, which gives less efficiency.
Example
For ten equiprobable messages coded in a fixed-length code, p(x_i) = \frac{1}{10} and

L_C = Int[\log_2 10] + 1 = 4 bits

\eta = \frac{H(X)}{L_C} \times 100\% = \frac{\log_2 10}{4} \times 100\% = 83.048\%

For eight equiprobable messages, p(x_i) = \frac{1}{8} and

L_C = \log_2 8 = 3 bits

\eta = \frac{3}{3} \times 100\% = 100\%
Example
A source emits one of six equiprobable messages, p(x_i) = \frac{1}{6}, with n = 6.
a- Coding single messages: L_C = Int[\log_2 6] + 1 = 3 bits/message, while H(X) = \log_2 6 = 2.585 bits/symbol, so

\eta = \frac{H(X)}{L_C} \times 100\% = \frac{\log_2 6}{3} \times 100\% = 86.165\%

b- For two throws taken together, the possible messages are n = 6 \times 6 = 36, and L_C = Int[\log_2 36] + 1 = 6 bits/2-symbols, then

\eta = \frac{2 H(X)}{L_C} \times 100\% = 86.165\%

c- For three throws taken together, the possible messages are n = 6 \times 6 \times 6 = 216, and L_C = Int[\log_2 216] + 1 = 8 bits/3-symbols, then

\eta = \frac{3 H(X)}{L_C} \times 100\% = 96.936\%
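The fixed-length efficiencies above (83.048%, 86.165%, 96.936%) can be reproduced with a short Python sketch; `fixed_length_efficiency` is a helper name of ours:

```python
import math

def fixed_length_efficiency(n, symbols_per_block=1):
    """Efficiency of fixed-length coding for n equiprobable source symbols,
    coding `symbols_per_block` symbols at a time."""
    messages = n ** symbols_per_block
    # ceil(log2) reproduces the rule: log2(messages) bits when messages is a
    # power of two, Int[log2(messages)] + 1 bits otherwise.
    Lc = math.ceil(math.log2(messages))
    H = math.log2(n)  # entropy per source symbol (equiprobable case)
    return 100 * symbols_per_block * H / Lc

print(round(fixed_length_efficiency(10), 3))    # 83.048 (ten messages)
print(round(fixed_length_efficiency(6), 3))     # 86.165 (one throw)
print(round(fixed_length_efficiency(6, 3), 3))  # 96.936 (three throws)
```

Blocking symbols together raises the efficiency because the wasted fraction of a bit is spread over more source symbols.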
For variable-length coding, consider the code:
C(a) = 0,  l(a) = 1
C(b) = 10, l(b) = 2
C(c) = 11, l(c) = 2
1) Unique decodability:
The major property that is usually required from any variable-length code is that of unique decodability. For example, the above code is uniquely decodable. Now consider instead the code:
A 0
B 01
C 11
D 00
Here the received sequence 0011 could be decoded either as DC or as AAC, so the transmission is not uniquely decodable.
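The ambiguity can be verified mechanically by counting the distinct ways a bit string splits into codewords; a small recursive Python sketch (`count_parses` is our own helper):

```python
def count_parses(bits, codebook):
    """Count the number of ways `bits` can be segmented into codewords."""
    if not bits:
        return 1  # the empty string has exactly one (trivial) parse
    return sum(count_parses(bits[len(c):], codebook)
               for c in codebook.values() if bits.startswith(c))

code = {'A': '0', 'B': '01', 'C': '11', 'D': '00'}
print(count_parses('0011', code))  # 2 -> 'AAC' and 'DC': not uniquely decodable
```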
2) Instantaneous decoding:
Example
Consider a four-symbol alphabet with symbols represented by binary digits as follows:
A 0
B 10
C 110
D 111
This code can be instantaneously decoded since no complete codeword is a prefix of a longer codeword. This is in contrast to the previous example, where 0 is a prefix of both 01 and 00. Now consider the code:
A 0
B 01
C 011
D 111
The code is identical to the previous example but the bits are time
reversed. It is still uniquely decodable but no longer instantaneous,
since early codewords are now prefixes of later ones.
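The prefix condition used above is easy to test mechanically; a minimal Python sketch (`is_prefix_free` is our own name):

```python
def is_prefix_free(codewords):
    """A code is instantaneously decodable iff no codeword is a proper
    prefix of another codeword."""
    return not any(a != b and b.startswith(a)
                   for a in codewords for b in codewords)

print(is_prefix_free(['0', '10', '110', '111']))  # True  -> instantaneous
print(is_prefix_free(['0', '01', '011', '111']))  # False -> the time-reversed code
```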
Shannon Code
For messages x_1, x_2, x_3, ..., x_n with probabilities p(x_1) \geq p(x_2) \geq p(x_3) \geq ... \geq p(x_n), then:
1) l_i = -\log_2 p(x_i) if p(x_i) \in \{ \frac{1}{2}, \frac{1}{4}, \frac{1}{8}, ... \}
2) l_i = Int[-\log_2 p(x_i)] + 1 otherwise.
Also define

F_i = \sum_{k=1}^{i-1} p(x_k), \qquad F_1 = 0

then the codeword of x_i is

C_i = (F_i)_2 \quad \text{up to } l_i \text{ bits}

where (F_i)_2 is the binary equivalent of F_i, taken up to l_i bits. In encoding, messages must be arranged in a decreasing order of probabilities.
Example
Develop the Shannon code for the following set of messages,
p(x) = [0.3  0.2  0.15  0.12  0.1  0.08  0.05]
then find:
(a) the code efficiency,
(b) p(0) at the encoder output.
Solution

x_i   p(x_i)  l_i   F_i    C_i     0_i
x1    0.30    2     0      00      2
x2    0.20    3     0.30   010     2
x3    0.15    3     0.50   100     2
x4    0.12    4     0.65   1010    2
x5    0.10    4     0.77   1100    2
x6    0.08    4     0.87   1101    1
x7    0.05    5     0.95   11110   1

For example, for x_2: F_2 = 0.3, and 0.3 \times 2 = 0.6 \to 0, \; 0.6 \times 2 = 1.2 \to 1, \; 0.2 \times 2 = 0.4 \to 0, giving C_2 = 010.

(a) To find the code efficiency, we have

L_C = \sum_{i=1}^{7} l_i \, p(x_i) = 3.1 \quad \text{bits/message}

H(X) = -\sum_{i=1}^{7} p(x_i) \log_2 p(x_i) = 2.6029 \quad \text{bits/message}

\eta = \frac{H(X)}{L_C} \times 100\% = 83.965\%

(b) With 0_i the number of zeros in C_i:

p(0) = \frac{\sum_{i=1}^{7} 0_i \, p(x_i)}{L_C} = \frac{1.87}{3.1}

p(0) = 0.603
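The construction above can be sketched in Python for any base; `shannon_code` is our own helper name, and the printed codewords reproduce the binary table above and the ternary example that follows:

```python
import math

def shannon_code(probs, base=2):
    """Shannon code: sort probabilities in decreasing order, take
    l_i = ceil(-log_base p_i) (which matches Int[.]+1 for non-exact powers),
    and expand the cumulative probability F_i in `base` up to l_i digits."""
    probs = sorted(probs, reverse=True)
    codewords, F = [], 0.0
    for p in probs:
        l = math.ceil(-math.log(p, base))
        digits, frac = '', F
        for _ in range(l):           # digit-by-digit base expansion of F_i
            frac *= base
            d = int(frac)
            digits += str(d)
            frac -= d
        codewords.append(digits)
        F += p                       # F_{i+1} = F_i + p(x_i)
    return codewords

p = [0.3, 0.2, 0.15, 0.12, 0.1, 0.08, 0.05]
print(shannon_code(p))     # ['00', '010', '100', '1010', '1100', '1101', '11110']
print(shannon_code(p, 3))  # ['00', '02', '11', '12', '202', '212', '221']
```

Note the digit expansion relies on floating-point arithmetic, which is adequate here; for probabilities that are exact powers of 1/base, an exact-fraction implementation would be safer.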
Example
Repeat the previous example using ternary coding.
Solution
1) l_i = -\log_3 p(x_i) if p(x_i) \in \{ \frac{1}{3}, \frac{1}{9}, \frac{1}{27}, ... \}
2) l_i = Int[-\log_3 p(x_i)] + 1 otherwise, and C_i = (F_i)_3 up to l_i ternary digits.

x_i   p(x_i)  l_i   F_i    C_i    0_i
x1    0.30    2     0      00     2
x2    0.20    2     0.30   02     1
x3    0.15    2     0.50   11     0
x4    0.12    2     0.65   12     0
x5    0.10    3     0.77   202    1
x6    0.08    3     0.87   212    0
x7    0.05    3     0.95   221    0

(a) L_C = \sum_{i=1}^{7} l_i \, p(x_i) = 2.23 \quad \text{ternary units/message}

H_3(X) = -\sum_{i=1}^{7} p(x_i) \log_3 p(x_i) = 1.642 \quad \text{ternary units/message}

\eta = \frac{H_3(X)}{L_C} \times 100\% = 73.632\%

(b) p(0) = \frac{\sum_{i=1}^{7} 0_i \, p(x_i)}{L_C} = \frac{0.9}{2.23}

p(0) = 0.404