Digital Communication Systems (ECS 452) - PowerPoint PPT Presentation


SLIDE 1

Digital Communication Systems (ECS 452)

  • Asst. Prof. Dr. Prapun Suksompong
  • prapun@siit.tu.ac.th

Channel Capacity

Office Hours:
  Rangsit Library: Tuesday 16:20-17:20
  BKD3601-7: Thursday 16:00-17:00

SLIDE 2

Operational Meaning of Capacity

SLIDE 3

Reliable Communication

  • Reliable communication means that an arbitrarily small error probability can be achieved.
  • This seems to be an impossible goal.
    • If the channel introduces errors, how can one correct them all?
    • Any correction process is also subject to error, ad infinitum.
  • Operational channel capacity C = the maximum rate at which reliable communication over a channel is possible.

[Diagram: channel with input X, output Y, and transition probabilities Q(y|x)]

SLIDE 4

Coding

  • Also called encoding or channel encoding.
  • Introduce redundancy so that even if some of the information is lost or corrupted, it will still be possible to recover the message at the receiver.

SLIDE 5

Repetition Code (k = 1)

  • The most obvious coding scheme is to repeat information. For example,
    • to send a 1, we send 11111, and
    • to send a 0, we send 00000.
  • This scheme uses five symbols to send 1 bit, and therefore has a rate of 1/5 bit per symbol.
  • If this code is used on a binary symmetric channel, the ML decoding rule (which is optimal when the 0s and 1s are equiprobable) is equivalent to taking the majority vote of each block of five received bits.
    • If three or more bits are 1, we decode the block as a 1;
    • otherwise, we decode it as a 0.
  • By using longer repetition codes, we can achieve an arbitrarily low probability of error.
  • But the rate of the code also goes to zero with (larger) block length, so even though the code is “simple,” it is really not a very useful code.
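The scheme above can be sketched in a few lines. This is an illustrative sketch, not code from the course; the function names are assumptions.

```python
# Minimal sketch of the k = 1 repetition code described above:
# repeat each bit five times, decode by majority vote.
def encode(bit, n=5):
    # Send the same bit n times.
    return [bit] * n

def decode(block):
    # Majority vote: decode 1 iff more than half the received bits are 1.
    return 1 if sum(block) >= len(block) // 2 + 1 else 0

assert decode(encode(1)) == 1
assert decode([1, 0, 1, 1, 0]) == 1   # two bit errors corrected
assert decode([0, 0, 1, 0, 0]) == 0   # one bit error corrected
```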

SLIDE 6

Repetition Code over BSC

[Plot: block error probability P versus BSC crossover probability p (0.05 to 0.5) for repetition codes with n = 1, 5, 15, and 25]
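The data behind such a plot can be computed directly. The sketch below (an assumption-laden illustration, not course code) evaluates the majority-vote block error probability for each n at a fixed p, showing the error probability falling while the rate 1/n also shrinks:

```python
# Majority-vote block error probability for an n-bit repetition code on a
# BSC with crossover probability p: decoding fails when more than half of
# the n transmitted bits are flipped.
from math import comb

def maj_error(n, p):
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

p = 0.1
for n in (1, 5, 15, 25):
    print(n, 1 / n, maj_error(n, p))  # n, rate, error probability
```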

SLIDE 7

Parity Bit or Check Bit

  • In mathematics, parity refers to the evenness or oddness of an integer.
  • Here, parity refers to the evenness or oddness of the number of 1s within a given set of bits.
  • It can be calculated via an XOR sum of the bits, yielding 0 for even parity and 1 for odd parity.
  • Ex.
    • Even parity: 0110, 011011
    • Odd parity: 0111, 011010
  • A parity bit, or check bit, is a bit added to the end of the k information bits, so n = k + 1.
  • There are two variants of parity bits: even parity bit and odd parity bit.
  • Even parity bit: choose the nth bit so that the number of 1s in the block is even.

    B_i = X_i for i = 1, 2, …, k,  and  B_n = X_1 ⊕ X_2 ⊕ ⋯ ⊕ X_k,  where n = k + 1.

  • Ex. k = 5
    • 100001; 101000; 111111; 010111
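The even-parity rule above fits in a couple of lines; the function name is illustrative, not from the course materials.

```python
# Even-parity encoder sketch: the appended bit is the XOR of the k
# information bits, so the whole (k+1)-bit block has an even number of 1s.
from functools import reduce
from operator import xor

def add_even_parity(bits):
    return bits + [reduce(xor, bits)]

# The k = 5 examples from the slide:
assert add_even_parity([1, 0, 0, 0, 0]) == [1, 0, 0, 0, 0, 1]
assert add_even_parity([1, 0, 1, 0, 0]) == [1, 0, 1, 0, 0, 0]
assert add_even_parity([0, 1, 0, 1, 1]) == [0, 1, 0, 1, 1, 1]
assert all(sum(add_even_parity(b)) % 2 == 0
           for b in ([1, 1, 1, 1, 1], [0, 1, 0, 1, 1]))
```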

SLIDE 8

Parity Bit or Check Bit

  • Used as the simplest form of error-detecting code.
  • Does not detect an even number of errors.
  • Does not give any information about how to correct the errors that occur.
  • Generalization: parity check codes
    • We can extend the idea of the parity check bit
      • to allow for multiple parity check bits and
      • to allow the parity checks to depend on various subsets of the information bits.
    • The Hamming code is an example of a parity check code.

SLIDE 9

NOISY CHANNEL CODING THEOREM

  • [Shannon, 1948]
  1. Reliable communication over a (discrete memoryless) channel is possible if the communication rate R satisfies R < C, where C is the channel capacity.
    • In particular, for any R < C, there exist codes (encoders and decoders) with sufficiently large n such that

      P_E = P(Ŵ ≠ W) ≤ 2^(−nE(R)),

      where E(R) is a positive function of R for R < C and is completely determined by the channel characteristics.
  2. At rates higher than capacity, reliable communication is impossible.

SLIDE 10

NOISY CHANNEL CODING THEOREM

  • Expresses the limit of reliable communication.
  • Provides a yardstick to measure the performance of communication systems.
    • A system performing near capacity is a near-optimal system and does not have much room for improvement.
    • On the other hand, a system operating far from this fundamental bound can be improved (mainly through coding techniques).

SLIDE 11

Shannon’s nonconstructive proof

  • Shannon introduced a method of proof called random coding.
  • Instead of looking for the best possible coding scheme and analyzing its performance, which is a difficult task,
    • all possible coding schemes are considered,
      • by generating the code randomly with an appropriate distribution,
    • and the performance of the system is averaged over them.
  • Then it is proved that if R < C, the average error probability tends to zero.
  • This proves that
    • as long as R < C,
    • at any arbitrarily small (but still positive) probability of error,
    • there exists at least one code (with sufficiently long block length n) that performs better than the specified probability of error.

SLIDE 12

Shannon’s nonconstructive proof

  • If we use the suggested scheme and generate a code at random, the code constructed is likely to be good for long block lengths.
    • But with no structure in the code, it is very difficult to decode.
  • In addition to achieving low probabilities of error, useful codes should be “simple,” so that they can be encoded and decoded efficiently.
  • Hence the theorem does not provide a practical coding scheme.
  • Since Shannon’s paper, a variety of techniques have been used to construct good error-correcting codes.
    • The entire field of coding theory has been developed during this search.
  • Turbo codes have come close to achieving capacity for Gaussian channels.

SLIDE 13

Deriving the Q Matrix

SLIDE 14

Probability Calculation: 1-D Noise

[Diagram: signals s_i and s_j on the φ(t) axis, with decision region [a_i, b_i] for s_i and [a_j, b_j] for s_j]

  • The noise is N ~ N(0, σ²), so for any interval, P(a ≤ N ≤ b) = Q(a/σ) − Q(b/σ).
  • If the decision region for message j is [a_j, b_j], then

    P(Ŵ = j | W = i) = P(a_j ≤ R ≤ b_j | S = s_i)
                     = P(a_j − s_i ≤ N ≤ b_j − s_i)
                     = Q((a_j − s_i)/σ) − Q((b_j − s_i)/σ).

  • Taking j = i in the same expression gives the probability of a correct decision.
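The 1-D transition probability can be sketched numerically. This is an illustrative helper (names assumed, not course code); the Q function is expressed through the standard library’s erfc.

```python
# Sketch of the 1-D formula:
# P(What = j | W = i) = Q((a_j - s_i)/sigma) - Q((b_j - s_i)/sigma),
# where N ~ N(0, sigma^2) and [a_j, b_j] is the decision region for j.
from math import erfc, sqrt, inf

def Q(x):
    # Gaussian tail probability: Q(x) = P(N(0,1) > x) = erfc(x / sqrt(2)) / 2
    return 0.5 * erfc(x / sqrt(2))

def trans_prob(s_i, a_j, b_j, sigma):
    return Q((a_j - s_i) / sigma) - Q((b_j - s_i) / sigma)

# Sanity checks: Q(0) = 1/2, and the whole real line has probability 1.
assert abs(Q(0.0) - 0.5) < 1e-12
assert abs(trans_prob(0.0, -inf, inf, 1.0) - 1.0) < 1e-12
```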

SLIDE 15

  • Ex. Standard 3-PAM

[Diagram: constellation s₁ = −d, s₂ = 0, s₃ = d on the φ(t) axis, with decision boundaries at −d/2 and d/2]

  • Applying P(Ŵ = j | W = i) = Q((a_j − s_i)/σ) − Q((b_j − s_i)/σ) to each region gives, for example,

    P(Ŵ = 1 | W = 1) = 1 − Q(d/(2σ)),
    P(Ŵ = 2 | W = 1) = Q(d/(2σ)) − Q(3d/(2σ)),
    P(Ŵ = 3 | W = 1) = Q(3d/(2σ)),
    P(Ŵ = 2 | W = 2) = 1 − 2Q(d/(2σ)).
SLIDE 16

  • Ex. Standard 3-PAM

  • Collecting the transition probabilities P(Ŵ = j | W = i), with q₁ = Q(d/(2σ)) and q₃ = Q(3d/(2σ)):

    Q = [ 1 − q₁    q₁ − q₃    q₃     ]
        [ q₁        1 − 2q₁    q₁     ]
        [ q₃        q₁ − q₃    1 − q₁ ]
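A small numerical sketch of this matrix; the helper names and the values of d and σ are assumptions for illustration.

```python
# 3-PAM transition matrix with q1 = Q(d/(2*sigma)), q3 = Q(3d/(2*sigma)).
from math import erfc, sqrt

def Q(x):
    return 0.5 * erfc(x / sqrt(2))

def pam3_matrix(d, sigma):
    q1, q3 = Q(d / (2 * sigma)), Q(3 * d / (2 * sigma))
    return [[1 - q1, q1 - q3, q3],
            [q1, 1 - 2 * q1, q1],
            [q3, q1 - q3, 1 - q1]]

M = pam3_matrix(d=2.0, sigma=1.0)
assert all(abs(sum(row) - 1.0) < 1e-12 for row in M)  # each row sums to 1
```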

SLIDE 17

Probability Calculation: 2-D Noise

[Diagram: 2-D constellation on axes 𝜚1 and 𝜚2, with a rectangular decision region [a, b] × [c, d] for s_j]

  • The noise components N₁, N₂ are i.i.d. N(0, σ²).
  • If the decision region for message j is the rectangle [a, b] × [c, d], then

    P(Ŵ = j | W = i) = P(a ≤ R₁ ≤ b, c ≤ R₂ ≤ d | S = s_i)
                     = P(a − s_i⁽¹⁾ ≤ N₁ ≤ b − s_i⁽¹⁾) · P(c − s_i⁽²⁾ ≤ N₂ ≤ d − s_i⁽²⁾)
                     = [Q((a − s_i⁽¹⁾)/σ) − Q((b − s_i⁽¹⁾)/σ)] · [Q((c − s_i⁽²⁾)/σ) − Q((d − s_i⁽²⁾)/σ)].

SLIDE 18

  • Ex. Standard QPSK

[Diagram: four signal points s₁, s₂, s₃, s₄ on axes 𝜚1 and 𝜚2, adjacent points separated by distance d; the decision regions are the four quadrants]

  • Because the two noise components are i.i.d., each transition probability factors. With q = Q(d/(2σ)):

    P(Ŵ = 1 | W = 1) = (1 − q)²,
    P(Ŵ = 2 | W = 1) = P(Ŵ = 4 | W = 1) = q(1 − q),
    P(Ŵ = 3 | W = 1) = q².

  • The full transition matrix P(Ŵ = j | W = i) is

    Q = [ (1 − q)²   q(1 − q)   q²         q(1 − q) ]
        [ q(1 − q)   (1 − q)²   q(1 − q)   q²       ]
        [ q²         q(1 − q)   (1 − q)²   q(1 − q) ]
        [ q(1 − q)   q²         q(1 − q)   (1 − q)² ]
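A numerical sketch of the QPSK matrix (helper names and parameter values are illustrative assumptions):

```python
# QPSK transition matrix with q = Q(d/(2*sigma)). The i.i.d. noise
# components make each entry a product; neighbors of the sent point get
# q*(1-q) and the diagonally opposite point gets q^2.
from math import erfc, sqrt

def Q(x):
    return 0.5 * erfc(x / sqrt(2))

def qpsk_matrix(d, sigma):
    q = Q(d / (2 * sigma))
    row = [(1 - q) ** 2, q * (1 - q), q ** 2, q * (1 - q)]
    # Each subsequent row is a cyclic shift of the first.
    return [row[-i:] + row[:-i] for i in range(4)]

M = qpsk_matrix(d=2.0, sigma=1.0)
# Rows sum to ((1-q) + q)^2 = 1.
assert all(abs(sum(r) - 1.0) < 1e-12 for r in M)
```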

SLIDE 19

MATLAB Calculation

  • Q_Matrix = Q_ML(SS,EbN0dB,n)
  • Capacity_BPSK_Example

SLIDE 20

Ex: Capacity for BPSK

[Plot: capacity C (0.1 to 1) versus Eb/N0 [dB] (−20 to 10), comparing BPSK (sim) and BPSK (theoretical) curves]

SLIDE 21

SLIDE 22

I(X;Y) for continuous X and Y

Discrete X and Y:

    H(X) = −∑_x p_X(x) log₂ p_X(x)
    H(Y|X) = −∑_x p_X(x) ∑_y p_{Y|X}(y|x) log₂ p_{Y|X}(y|x)
    I(X;Y) = H(Y) − H(Y|X)
    p_Y(y) = ∑_x p_{Y|X}(y|x) p_X(x)

Continuous X and Y:

    h(X) = −∫ f_X(x) log₂ f_X(x) dx
    h(Y|X) = −∫∫ f_X(x) f_{Y|X}(y|x) log₂ f_{Y|X}(y|x) dy dx
    I(X;Y) = h(Y) − h(Y|X)
    f_Y(y) = ∫ f_{Y|X}(y|x) f_X(x) dx
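As a numerical sanity check on the discrete column, a minimal sketch (the channel and input values are illustrative assumptions) computing I(X;Y) = H(Y) − H(Y|X) for a BSC with crossover probability 0.1 and uniform input, which should give 1 − H(0.1) ≈ 0.531 bits:

```python
# I(X;Y) = H(Y) - H(Y|X) for a BSC, using the discrete formulas above.
from math import log2

def entropy(ps):
    return -sum(p * log2(p) for p in ps if p > 0)

p = 0.1                              # BSC crossover probability (illustrative)
px = [0.5, 0.5]                      # uniform input p_X
Qch = [[1 - p, p], [p, 1 - p]]       # transition probabilities p_{Y|X}
py = [sum(px[x] * Qch[x][y] for x in range(2)) for y in range(2)]
I = entropy(py) - sum(px[x] * entropy(Qch[x]) for x in range(2))
print(I)  # equals 1 - H(0.1), about 0.531 bits
```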

SLIDE 23

Capacity for additive Gaussian noise channel

  • Suppose Y = X + Z, where the additive noise Z is a zero-mean Gaussian RV with variance σ².
  • Input power constraint: in addition, it is usually assumed that the channel input satisfies a power constraint of the form E[X²] ≤ P.
  • Capacity:

    C = (1/2) log₂(1 + P/σ²)  [bits per transmission] [bits per channel use]

  • The input pdf that achieves this capacity is a zero-mean Gaussian pdf with variance P.
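A one-line check of the formula, with an illustrative SNR of P/σ² = 15 so that C = ½·log₂16 = 2:

```python
# C = (1/2) * log2(1 + P/sigma^2), the formula above; the SNR value is
# chosen only to make the result a round number.
from math import log2

def awgn_capacity(P, sigma2):
    return 0.5 * log2(1 + P / sigma2)

print(awgn_capacity(P=15, sigma2=1))  # 2.0 bits per channel use
```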

SLIDE 24

Capacity for AWGN Waveform Channel

  • Assume
    • Band-limited channel: the channel has a given bandwidth W.
      • Can use only the frequencies in a range of width W.
    • Input power constraint: E[X²(t)] ≤ P.
    • AWGN with PSD N₀/2.
  • Capacity:

    C = W log₂(1 + P/(N₀W))  [bps]

  • This is the celebrated equation for the capacity of a band-limited AWGN channel with input power constraint, derived by Shannon in 1948.
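A quick evaluation of Shannon’s formula with illustrative numbers (roughly a 3 kHz telephone-like channel at 30 dB SNR; the values are assumptions, not from the slides):

```python
# C = W * log2(1 + P/(N0*W)) for the band-limited AWGN channel.
from math import log2

def bandlimited_capacity(W, P, N0):
    return W * log2(1 + P / (N0 * W))

# W = 3000 Hz and P/(N0*W) = 1000 (30 dB SNR):
C = bandlimited_capacity(W=3000, P=3e6, N0=1.0)
print(C)  # about 29,900 bps
```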
