Transmitting and Gaining Data [electronic resource] : Rudolf Ahlswede’s Lectures on Information Theory 2 / by Rudolf Ahlswede ; edited by Alexander Ahlswede, Ingo Althöfer, Christian Deppe, Ulrich Tamm
- Ahlswede, Rudolf, 1938-
- Cham : Springer International Publishing : Imprint: Springer, 2015.
- Physical Description:
- XVII, 461 pages, 4 illustrations : online resource
- Additional Creators:
- Ahlswede, Alexander, Althöfer, Ingo, Deppe, Christian, Tamm, Ulrich, and SpringerLink (Online service)
- Words and Introduction of the Editors -- Preface -- I.Transmitting Data -- 1.Special Channels -- 2.Algorithms for Computing Channel Capacities and Rate-distortion Functions -- 3.Shannon’s Model for Continuous Transmission -- 4.On Sliding-Block Codes -- 5.On λ-Capacities and Information Stability -- 6.Channels with Infinite Alphabets -- II.Gaining Data -- 7.Selected Topics of Information Theory and Mathematical Statistics -- 8.β-Biased Estimators in Data Compression -- III.Supplement -- Rudolf Ahlswede 1938-2010 -- Comments by Gerhard Kramer -- List of Notations -- Index -- Name Index.
- The calculation of channel capacities was one of Rudolf Ahlswede's specialties and is the main topic of this second volume of his Lectures on Information Theory. Here we find a detailed account of some very classical material from the early days of Information Theory, including developments from the USA, Russia, and Hungary, as well as the German school centered around his supervisor Konrad Jacobs, which Ahlswede was probably in a unique position to describe. These lectures attempt a rigorous justification of the foundations of Information Theory. This is the second of several volumes documenting Rudolf Ahlswede's lectures on Information Theory. Each volume includes comments from an invited well-known expert; in the supplement to the present volume, Gerhard Kramer contributes his insights. Classical information processing concerns the main tasks of gaining knowledge and of storing, transmitting, and hiding data. The first task is the prime goal of Statistics. For the transmission and hiding of data, Shannon developed an impressive mathematical theory, called Information Theory, which he based on probabilistic models. The theory largely involves the concept of codes with small error probabilities in spite of noise in the transmission, which is modeled by channels. The lectures presented in this work are suitable for graduate students in Mathematics, and also for those working in Theoretical Computer Science, Physics, and Electrical Engineering with a background in basic Mathematics. The lectures can be used as the basis for courses or to supplement courses in many ways. Ph.D. students will also find research problems, often with conjectures, that offer potential subjects for a thesis. More advanced researchers may find questions which form the basis of entire research programs.
- Digital File Characteristics:
- text file PDF
- AVAILABLE ONLINE TO AUTHORIZED PSU USERS.
- Part Of:
- Springer eBooks
- catkey: 14086578