CHF130.00
Download is available immediately
Fundamentals of Convolutional Coding, Second Edition, regarded as a bible of convolutional coding, brings you a clear and comprehensive discussion of the basic principles of this field.
Two new chapters on low-density parity-check (LDPC) convolutional codes and iterative coding
Viterbi, BCJR, BEAST, list, and sequential decoding of convolutional codes
Distance properties of convolutional codes
Includes a downloadable solutions manual
Authors
Rolf Johannesson is Professor Emeritus of Information Theory at Lund University, Sweden, and a Fellow of the IEEE. He was awarded the honor of Professor, honoris causa, from the Institute for Information Transmission Problems, Russian Academy of Sciences, and was elected a member of the Royal Swedish Academy of Engineering Sciences. Dr. Johannesson's research interests include information theory, coding theory, and cryptography.
Kamil Sh. Zigangirov is Professor Emeritus of Telecommunication Theory at Lund University, Sweden, and a Fellow of the IEEE. He is widely published in the areas of information theory, coding theory, mathematical statistics, and detection theory. Dr. Zigangirov is the inventor of the stack algorithm for sequential decoding and a co-inventor of LDPC convolutional codes.
Flap text
Fundamentals of Convolutional Coding, Second Edition, regarded as a bible of convolutional coding, brings you a clear and comprehensive discussion of the basic principles of this field.
This edition has been expanded to reflect the developments in modern coding theory, including new chapters on low-density parity-check convolutional codes and turbo codes. Since these types of codes are now appearing in industry standards, application engineers and scientists will also find this book essential to obtaining a basic understanding of the theory behind these new techniques. Written by two leading authorities in coding and information theory, it is unmatched in the field for its accessible analysis of the structural properties of convolutional encoders.
Other essentials covered include:
Viterbi, BCJR, BEAST, list, and sequential decoding of convolutional codes
Distance properties of convolutional codes
A downloadable solutions manual
Contents
Preface xi
Acknowledgement xiv
1 Introduction 1
1.1 Why error control? 1
1.2 Block codes – a primer 8
1.3 Codes on graphs 21
1.4 A first encounter with convolutional codes 28
1.5 Block codes versus convolutional codes 35
1.6 Capacity limits and potential coding gain revisited 36
1.7 Comments 39
Problems 41
2 Convolutional encoders – Structural properties 49
2.1 Convolutional codes and their encoders 49
2.2 The Smith form of polynomial convolutional generator matrices 58
2.3 Encoder inverses 67
2.4 Encoder and code equivalences 76
2.5 Basic encoding matrices 79
2.6 Minimal-basic encoding matrices 82
2.7 Minimal encoding matrices and minimal encoders 90
2.8 Canonical encoding matrices* 109
2.9 Minimality via the invariant-factor theorem* 127
2.10 Syndrome formers and dual encoders 131
2.11 Systematic convolutional encoders 139
2.12 Some properties of generator matrices – an overview 150
2.13 Comments 150
Problems 152
3 Distance properties of convolutional codes 161
3.1 Distance measures – a first encounter 161
3.2 Active distances 171
3.3 Properties of convolutional codes via the active distances 179
3.4 Lower bound on the distance profile 181
3.5 Upper bounds on the free distance 186
3.6 Time-varying convolutional codes 191
3.7 Lower bound on the free distance 195
3.8 Lower bounds on the active distances* 200
3.9 Distances of cascaded concatenated codes* 207
3.10 Path enumerators 213
3.11 Comments 220
Problems 221
4 Decoding of convolutional codes 225
4.1 The Viterbi algorithm revisited 226
4.2 Error bounds for time-invariant convolutional codes 235
4.3 Tighter error bounds for time-invariant convolutional codes 250
4.4 Exact bit error probability for Viterbi decoding 255
4.5 The BCJR algorithm for APP decoding 271
4.6 The one-way algorithm for APP decoding 283
4.7 A simple upper bound on the bit error probability for extremely noisy channels 288
4.8 Tailbiting trellises 293
4.9 Decoding of tailbiting codes 302
4.10 BEAST decoding of tailbiting codes 308
4.11 Comments 323
Problems 324
5 Random ensemble bounds for decoding error probability 333
5.1 Upper bounds on the output error burst lengths 333
5.2 Bounds for periodically time-varying convolutional codes 345
5.3 Lower error probability bounds for convolutional codes 355
5.4 General bounds for time-varying convolutional codes 363
5.5 Bounds for finite back-search limits 375
5.6 Quantization of channel outputs 379
5.7 Comments 384
Problems 384
6 List decoding 387
6.1 List decoding algorithms 388
6.2 List decoding – performance 391
6.3 The list minimum weight 397
6.4 Upper bounds on the probability of correct path loss 407
6.5 Lower bound on the probability of correct path loss 416
6.6 Correct path loss for time-invariant convolutional codes 418
6.7 Comments 422
Problems 423
7 Sequential decoding 425
7.1 The Fano metric 426
7.2 The stack algorithm 431
7.3 The Fano algorithm 433
7.4 The Creeper algorithm* 436
7.5 Simulations 448
7.6 Computational analysis of the stack algorithm 450
7.7 Error probability analysis of the stack algorithm 460
7.8 Analysis of the Fano algorithm 471
7.9 Analysis of Creeper* 477
7.10 Comments 480
Problems 481
8 Low-density parity-check codes 485
8.1 LDPC block codes 486
8.2 LDPC convolutional codes 496
8.3 Blo...