Joint State Sensing and Communication: Theory and Applications
Mari Kobayashi (CentraleSupélec)
Abstract
We consider a communication setup in which transmitters wish to simultaneously sense network states and convey messages to their intended receivers. The scenario is motivated by joint radar and vehicular communications, where the radar and data applications share the same bandwidth. In the first part of the talk, I review well-known results from information theory, including channels with state and channels with feedback. I then present recent results on the tradeoff between capacity and distortion for a discrete memoryless channel with i.i.d. state sequences. We demonstrate through examples the benefits of joint sensing and communication compared to a resource-sharing approach. In the second part of the talk, I provide application examples of joint radar and communication using multi-carrier transmission (OFDM and OTFS). I conclude the tutorial by discussing some open challenges towards joint radar and vehicular communications.
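One way to make the capacity-distortion tradeoff mentioned above concrete is the capacity-distortion function below; the model details here (an i.i.d. state S known at the receiver, a transmitter estimate of the state, and a distortion measure d) are assumptions stated for illustration, and the exact formulation is the one given in the talk:

    C(D) = \max_{P_X \,:\, \mathbb{E}[d(S,\hat{S})] \le D} I(X; Y \mid S)

Here X is the channel input, Y the receiver output, and \hat{S} the transmitter's state estimate. A resource-sharing scheme that time-shares between a pure sensing phase and a pure communication phase operates, in general, strictly inside this tradeoff, which is the comparison the examples in the abstract refer to.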
Biography
Mari Kobayashi received the B.E. degree in electrical engineering from Keio University, Yokohama, Japan, in 1999, and the M.S. degree in mobile radio and the Ph.D. degree from École Nationale Supérieure des Télécommunications, Paris, France, in 2000 and 2005, respectively. From November 2005 to March 2007, she was a postdoctoral researcher at the Centre Tecnològic de Telecomunicacions de Catalunya, Barcelona, Spain. In May 2007, she joined the Telecommunications Department at CentraleSupélec, Gif-sur-Yvette, France, where she is now a professor. She is the recipient of the Newcom++ Best Paper Award in 2010 and the Joint Information Theory/Communications Society Best Paper Award in 2011. Since September 2017, she has been on sabbatical leave at the Technical University of Munich (TUM) as an Alexander von Humboldt Experienced Research Fellow.
A tutorial on polar codes
Ido Tal (Technion)
Abstract
In this short tutorial on polar codes, we will first introduce the polar transform as a general operation applied to a process (X_i, Y_i) to form a new process (F_i, G_i), and explain why the new process is "polarized". We will then show how the polarization phenomenon can be used in many important settings, the primary one being channel coding. Specifically, we will first describe how to code for a memoryless symmetric channel, then for a memoryless channel that is not symmetric, and finally for a channel that is neither memoryless nor symmetric.
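As a minimal illustration of why the transformed process is "polarized" (the notation here is chosen for illustration and differs from the (F_i, G_i) notation above), consider the memoryless symmetric case with two independent channel uses, inputs X_1, X_2 and outputs Y_1, Y_2, and set U_1 = X_1 \oplus X_2 and U_2 = X_2. With uniform inputs, the total mutual information is preserved but redistributed:

    I(U_1; Y_1 Y_2) + I(U_2; Y_1 Y_2, U_1) = 2 I(X; Y)
    I(U_1; Y_1 Y_2) \le I(X; Y) \le I(U_2; Y_1 Y_2, U_1)

One synthetic channel becomes worse and the other better; recursing this transform over 2^n channel uses drives almost every synthetic channel towards either a perfect or a useless one, and coding then amounts to sending information only on the (almost) perfect ones.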
Biography
Ido Tal was born in Haifa, Israel, in 1975. He received the B.Sc., M.Sc., and Ph.D. degrees in computer science from Technion-Israel Institute of Technology, Haifa, Israel, in 1998, 2003 and 2009, respectively. During 2010–2012 he was a postdoctoral scholar at the University of California at San Diego. In 2012 he joined the Electrical Engineering Department at Technion. His research interests include constrained coding and error-control coding. He received the IEEE Joint Communications Society/Information Theory Society Paper Award (jointly with Alexander Vardy) for the year 2017.
Information-Theoretic Security: From Information-Theoretic Primitives to Algorithms
Matthieu Bloch (Georgia Institute of Technology)
Abstract
While the foundations of information-theoretic security can be traced back to the works of Shannon (1949), Wyner (1975), Maurer (1993), and Ahlswede and Csiszár (1993), the past decade of research on the topic has enabled conceptual simplifications and generalizations, spanning both information and coding theory. This three-part tutorial will leverage these recent advances to provide a modern perspective on the topic. In the first part, we will develop four information-theoretic primitives that will prove central to the study of information-theoretic security. In the second part, we will introduce canonical information-theoretic security models and combine the primitives to establish their capacity. In the last part, we will show how to translate the insights from information theory into actual codes and algorithms with manageable complexity.
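As background for the canonical models discussed in the second part (this formula is added here for context and is not taken from the tutorial material itself): the secrecy capacity of the wiretap channel, in the general form of Csiszár and Körner, is

    C_s = \max_{P_{V,X}} \big[ I(V; Y) - I(V; Z) \big]

where Y is the legitimate receiver's observation, Z the eavesdropper's, and V an auxiliary variable with V - X - (Y, Z) forming a Markov chain; for Wyner's degraded wiretap channel this reduces to \max_{P_X} [ I(X; Y) - I(X; Z) ].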
Biography
Matthieu R. Bloch is an Associate Professor in the School of Electrical and Computer Engineering at the Georgia Institute of Technology. He received the Engineering degree from Supélec, Gif-sur-Yvette, France, the M.S. degree in Electrical Engineering from the Georgia Institute of Technology, Atlanta, in 2003, the Ph.D. degree in Engineering Science from the Université de Franche-Comté, Besançon, France, in 2006, and the Ph.D. degree in Electrical Engineering from the Georgia Institute of Technology in 2008. In 2008-2009, he was a postdoctoral research associate at the University of Notre Dame, South Bend, IN. Since July 2009, Dr. Bloch has been on the faculty of the School of Electrical and Computer Engineering, and from 2009 to 2013 he was based at Georgia Tech Lorraine. His research interests are in the areas of information theory, error-control coding, wireless communications, and cryptography. Dr. Bloch has served on the organizing committee of several international conferences; he was the chair of the Online Committee of the IEEE Information Theory Society from 2011 to 2014, and he has been on the Board of Governors of the IEEE Information Theory Society and an Associate Editor for the IEEE Transactions on Information since 2016. He is the co-recipient of the IEEE Communications Society and IEEE Information Theory Society 2011 Joint Paper Award and the co-author of the textbook Physical-Layer Security: From Information Theory to Security Engineering, published by Cambridge University Press.
Joint Source and Channel Coding: Fundamental Bounds and Connections to Machine Learning
Deniz Gündüz (Imperial College London)
Abstract
Machine learning and communications are intrinsically connected. The fundamental problem of communications, as stated by Shannon, “is that of reproducing at one point either exactly or approximately a message selected at another point,” can be viewed as a classification problem. With this connection in mind, I will focus on the fundamental joint source-channel coding (JSCC) problem, starting from the basic information-theoretic performance bounds. I will then move to the practical implementation of JSCC for wireless image transmission. I will first show some “surprising” performance results with uncoded “analog” transmission, which will then be used to motivate unsupervised learning for JSCC. I will present a “deep joint source-channel encoder” architecture, which behaves similarly to analog transmission, and not only improves upon state-of-the-art digital transmission schemes, but also degrades gracefully with channel quality.
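To make the “deep joint source-channel encoder” idea more tangible, below is a minimal sketch of such an autoencoder for images transmitted over an AWGN channel. The architecture, layer sizes, power normalization, and SNR value are illustrative assumptions, not the speaker's actual model.

    # A minimal sketch of a deep joint source-channel coding autoencoder for
    # 3x32x32 images over an AWGN channel. Architecture, sizes, and SNR are
    # illustrative assumptions.
    import torch
    import torch.nn as nn

    class DeepJSCC(nn.Module):
        def __init__(self, channel_symbols=256, snr_db=10.0):
            super().__init__()
            self.snr_db = snr_db
            # Encoder maps the image directly to real-valued channel symbols
            # ("analog" transmission: no quantization, no channel code).
            self.encoder = nn.Sequential(
                nn.Conv2d(3, 16, 5, stride=2, padding=2), nn.PReLU(),
                nn.Conv2d(16, 32, 5, stride=2, padding=2), nn.PReLU(),
                nn.Flatten(),
                nn.Linear(32 * 8 * 8, channel_symbols),
            )
            # Decoder maps the noisy channel output back to an image estimate.
            self.decoder = nn.Sequential(
                nn.Linear(channel_symbols, 32 * 8 * 8), nn.PReLU(),
                nn.Unflatten(1, (32, 8, 8)),
                nn.ConvTranspose2d(32, 16, 5, stride=2, padding=2, output_padding=1), nn.PReLU(),
                nn.ConvTranspose2d(16, 3, 5, stride=2, padding=2, output_padding=1), nn.Sigmoid(),
            )

        def forward(self, img):
            x = self.encoder(img)
            # Normalize to unit average power per transmitted symbol.
            x = x / x.norm(dim=1, keepdim=True) * x.shape[1] ** 0.5
            noise_std = 10.0 ** (-self.snr_db / 20.0)   # unit signal power assumed
            y = x + noise_std * torch.randn_like(x)     # AWGN channel
            return self.decoder(y)

    # End-to-end training simply minimizes the reconstruction distortion, e.g.
    #   loss = nn.functional.mse_loss(model(batch_of_images), batch_of_images)

Because the mapping from source to channel symbols is continuous, the reconstruction quality changes smoothly with the noise level, which is one way to interpret the graceful degradation mentioned above.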
In the second part of the lecture, I will talk about another connection between information theory and machine learning, focusing on distributed machine learning, particularly targeting wireless edge networks, and show how ideas from coding theory can help improve the performance of distributed learning across unreliable servers. I will introduce both coded and uncoded distributed stochastic gradient descent algorithms, and study the trade-off between their average computation time and communication load. Finally, I will introduce the novel concept of “over-the-air stochastic gradient descent” for wireless edge learning, and show that it significantly improves the efficiency of machine learning across bandwidth- and power-limited wireless devices compared to the standard digital approach that separates computation and communication. This will close the circle, illustrating how JSCC can be used to speed up machine learning at the wireless edge.
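The “over-the-air” idea can be sketched in a few lines: devices transmit their local gradients simultaneously in analog form, so the wireless multiple-access channel itself delivers their (noisy) sum. The toy model below, including the quadratic local losses, scaling, and noise level, is an illustrative assumption rather than the scheme presented in the lecture.

    # A toy sketch of over-the-air gradient aggregation: all devices transmit
    # their local gradients at the same time in analog form, and the channel
    # output is their superposition plus noise.
    import numpy as np

    rng = np.random.default_rng(0)
    dim, num_devices, lr, noise_std = 50, 10, 0.1, 0.05
    w = np.zeros(dim)                   # global model held at the access point

    def local_gradient(w, k):
        # Stand-in for a gradient computed on device k's local data:
        # here the gradient of 0.5 * ||w - target_k||^2.
        target_k = np.full(dim, k / num_devices)
        return w - target_k

    for step in range(200):
        grads = np.stack([local_gradient(w, k) for k in range(num_devices)])
        # Simultaneous analog transmissions: one channel use per model
        # dimension, instead of one separate digital upload per device.
        rx = grads.sum(axis=0) + noise_std * rng.standard_normal(dim)
        w -= lr * rx / num_devices      # update with the noisy gradient average

    # The noisy average still drives w towards the minimizer of the average loss.
    print("distance to optimum:", np.linalg.norm(w - np.full(dim, 0.45)))

The communication cost is thus independent of the number of devices, which is the source of the efficiency gain over the separated digital approach referred to above.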
Biography
Deniz Gündüz received the B.S. degree from METU, Turkey in 2002, and the M.S. and Ph.D. degrees from NYU Polytechnic School of Engineering (formerly Polytechnic University) in 2004 and 2007, respectively. After his PhD, he served as a postdoctoral research associate at Princeton University, and as a consulting assistant professor at Stanford University. He was a research associate at CTTC in Barcelona, Spain until September 2012, when he joined the Electrical and Electronic Engineering Department of Imperial College London, UK, where he is currently a Reader (Associate Professor) in information theory and communications, and leads the Information Processing and Communications Laboratory (IPC-Lab).
His research interests lie in the areas of communications and information theory, machine learning, and privacy. Dr. Gündüz is an Editor of the IEEE Transactions on Green Communications and Networking, and a Guest Editor of the IEEE Journal on Selected Areas in Communications, Special Issue on Machine Learning in Wireless Communication. He served as an Editor of the IEEE Transactions on Communications from 2013 until 2018. He is the recipient of the IEEE Communications Society - Communication Theory Technical Committee (CTTC) Early Achievement Award in 2017, a Starting Grant of the European Research Council (ERC) in 2016, the IEEE Communications Society Best Young Researcher Award for the Europe, Middle East, and Africa Region in 2014, the Best Paper Award at the 2016 IEEE WCNC, and the Best Student Paper Awards at the 2018 IEEE WCNC and the 2007 ISIT. He is serving or has served as the General Co-chair of the 2019 London Symposium on Information Theory, the 2016 IEEE Information Theory Workshop, and the 2012 European School of Information Theory.
Sequential Acquisition of Information: From Active Hypothesis Testing to Active Learning to Empirical Function Optimization
Tara Javidi (University of California, San Diego)
Abstract
This lecture explores an often-overlooked connection between the problem of communications with feedback in information theory and that of active hypothesis testing in statistics. This connection, we argue, has significant implications for next-generation machine learning algorithms in which data is collected actively and/or by cooperative yet local agents.
Consider a decision maker who is responsible for actively and dynamically collecting data/samples so as to enhance the information about an underlying phenomenon of interest wh