Guest editors Alicia Reynolds    Benjamin Carter    Maria Rodriguez
Deadline: May 6, 2024 (Extended)

Modern Rate-Distortion Theory (RDT) stands at the intersection of information theory, signal processing, and machine learning, offering a profound understanding of the tradeoff between data compression and reconstruction fidelity. This special issue aims to present the latest advancements in RDT, ranging from theoretical developments to practical applications across diverse domains. Topics include novel formulations of the rate-distortion tradeoff, deep learning approaches for optimization, applications in image and video compression, and extensions to non-standard data sources. By bringing together cutting-edge research and innovative methodologies, this special issue seeks to shape the future landscape of RDT and its applications.
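The rate-distortion tradeoff described above can be traced numerically with the classical Blahut-Arimoto algorithm. The sketch below is an illustration only (not part of the call): it computes one point of the R(D) curve for a discrete source, parameterized by a Lagrange slope s.

```python
import numpy as np

def blahut_arimoto_rd(p_x, d, s, iters=200):
    """One point on the R(D) curve via Blahut-Arimoto.

    p_x: source pmf; d[x, xh]: distortion matrix; s >= 0: Lagrange slope
    (larger s -> smaller distortion). Returns (rate in bits, distortion).
    """
    q = np.full(d.shape[1], 1.0 / d.shape[1])    # reproduction pmf q(x_hat)
    for _ in range(iters):
        w = q * np.exp(-s * d)                   # optimal p(x_hat | x) for this q
        w /= w.sum(axis=1, keepdims=True)
        q = p_x @ w                              # re-estimate the marginal
    rate = float(np.sum(p_x[:, None] * w * np.log2(w / q)))   # I(X; X_hat)
    dist = float(np.sum(p_x[:, None] * w * d))                # E[d(X, X_hat)]
    return rate, dist
```

For a uniform binary source with Hamming distortion, the returned pair satisfies rate = 1 - h(dist), matching the known closed-form curve R(D) = 1 - h(D).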

Guest editors Brian Kurkoski
Deadline: Aug 1, 2024 (Extended)

Abstract. This section explains the basics of DNA storage systems. A DNA storage system is a bio-based storage system. Thanks to tremendous research progress, we can now manipulate DNA sequences: it is possible to synthesize nucleotides freely and to read them back. Reading nucleotides is called sequencing, and producing DNA is called synthesis. DNA storage systems use these techniques, and DNA can be seen as a medium that retains information, that is, as storage. Nucleotides are composed of four types of nucleobases: adenine (A), cytosine (C), guanine (G), and thymine (T), and the DNA storage system retains information as sequences of nucleotides. The number of letters available to represent information is one of the biggest differences from other storage systems, which use binary messages.
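The four-letter alphabet means each nucleotide can carry two bits. Below is a minimal illustrative sketch of one such mapping (an assumed convention, not a method prescribed by the call; practical DNA codes additionally enforce constraints such as GC balance and limited homopolymer runs):

```python
# Hypothetical 2-bits-per-nucleotide mapping: 00->A, 01->C, 10->G, 11->T.
BASE_OF = {'00': 'A', '01': 'C', '10': 'G', '11': 'T'}
BITS_OF = {base: bits for bits, base in BASE_OF.items()}

def to_strand(bits: str) -> str:
    """Encode a binary message (even length) as a nucleotide sequence."""
    assert len(bits) % 2 == 0, "pad the message to an even number of bits"
    return ''.join(BASE_OF[bits[i:i + 2]] for i in range(0, len(bits), 2))

def to_bits(strand: str) -> str:
    """Decode a nucleotide sequence back to the binary message."""
    return ''.join(BITS_OF[base] for base in strand)
```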

Guest editors Jun Chen    Aaron B. Wagner
Deadline: Dec 15, 2023 (Extended)

This special issue of the IEEE Journal on Selected Areas in Information Theory is dedicated to the memory of Toby Berger, one of the most important information theorists of our time, who passed away in 2022 at the age of 81. He made foundational contributions to a wide range of areas in information theory, including rate-distortion theory, network information theory, quantum information theory, and bio-information theory. He also left a deep imprint on diverse fields in applied mathematics and theoretical engineering, such as Markov random fields, group testing, multiple access theory, and detection and estimation. Well known for his technical brilliance, he tackled many challenging problems, but above all, it is his pursuit of elegance in research and writing that shines throughout his work. The goal of this special issue is to celebrate Toby Berger’s lasting legacy and his impact on information theory and beyond. Original research papers on topics within the realm of his scientific investigations and their “offspring”, as well as expository articles that survey his pioneering contributions and their modern developments, are invited.

Guest editors Lalitha Sankar    Oliver Kosut
Deadline: Oct 29, 2023 (Extended)

Over the past decade, machine learning (ML), that is, the process of enabling computing systems to learn from data and produce decisions, has been enabling tremendously exciting technologies. Such technologies can assist humans in making a variety of decisions by processing complex data to identify patterns, detect anomalies, and make inferences. At the same time, these automated decision-making systems raise questions about the security and privacy of the user data that drive ML, the fairness of the decisions, and the reliability of automated systems that make complex decisions which can affect humans in significant ways. In short, how can ML models be deployed in a responsible and trustworthy manner that ensures fair and reliable decision-making? This requires ensuring that the entire ML pipeline assures security, reliability, robustness, fairness, and privacy. Information theory can shed light on each of these challenges by providing a rigorous framework that not only quantifies these desiderata but also evaluates them and provides assurances. From its beginnings, information theory has been devoted to a theoretical understanding of the limits of engineered systems. As such, it is a vital tool in guiding machine learning advances.
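As one concrete instance of such an information-theoretic quantification, the dependence of a decision on a sensitive attribute (a common proxy for privacy leakage) can be measured by their mutual information. The sketch below is an illustrative computation from an empirical joint distribution, not a method prescribed by the call:

```python
import numpy as np

def mutual_information_bits(joint):
    """I(S; Yhat) in bits, from a (possibly unnormalized) joint count or
    pmf matrix joint[s, y] over a sensitive attribute S and a decision Yhat."""
    joint = np.asarray(joint, dtype=float)
    joint = joint / joint.sum()
    prod = joint.sum(axis=1, keepdims=True) * joint.sum(axis=0, keepdims=True)
    mask = joint > 0                 # treat 0 * log 0 as 0
    return float(np.sum(joint[mask] * np.log2(joint[mask] / prod[mask])))
```

Independence (zero leakage) gives 0 bits; a decision fully determined by a uniform binary attribute gives 1 bit.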

We invite previously unpublished papers that contribute to the fundamentals, as well as the applications, of information- and learning-theoretic methods for secure, robust, reliable, fair, private, and trustworthy machine learning. The application of such techniques to practical systems is also relevant.

Guest editors Deniz Gündüz    Victoria Kostina    Petar Popovski    Yin Sun    Aylin Yener    Sennur Ulukus    Tara Javidi
Deadline: Mar 17, 2023 (Extended)

To support the fast growth of IoT and cyber-physical systems, as well as the advent of 6G, there is a need for communication and networking models that enable more efficient modes of machine-type communication. This calls for a departure from the assumptions of classical communication-theoretic problem formulations as well as from the traditional network layers. This new communication paradigm is referred to as goal- or task-oriented communication, or, in a broader sense, is part of the emerging area of semantic communications. Over the past decade, a number of novel performance metrics have been proposed: measures of timeliness such as the Age of Information (AoI) and Query Age of Information (QAoI); metrics that capture goal-oriented nature, tracking, or control performance, such as Quality of Information (QoI), Value of Information (VoI), and Age of Incorrect Information (AoII); and more sophisticated end-to-end measures such as distortion (e.g., MSE), ML performance, or human perception of the reproduced data, together with applications of finite-blocklength information theory to the remote monitoring of stochastic processes and real-time control. We invite original papers that contribute to the fundamentals, as well as the applications, of semantic metrics, and of protocols that use them, in IoT or automation scenarios.
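To illustrate the timeliness metrics mentioned above: the Age of Information at time t is Δ(t) = t − u(t), where u(t) is the generation time of the freshest update delivered by time t. The sketch below (illustrative only, with an assumed zero initial age at t = 0) integrates the resulting sawtooth to obtain the time-average AoI:

```python
def average_aoi(deliveries):
    """Time-average Age of Information from (t_gen, t_recv) pairs sorted by
    reception time. Assumes (illustratively) age 0 at t = 0 and integrates
    the age sawtooth up to the last reception time."""
    area, t_prev, age_prev = 0.0, 0.0, 0.0
    for t_gen, t_recv in deliveries:
        age_peak = age_prev + (t_recv - t_prev)              # age just before reception
        area += 0.5 * (age_prev + age_peak) * (t_recv - t_prev)  # trapezoid rule
        age_prev, t_prev = t_recv - t_gen, t_recv            # age resets to system delay
    return area / t_prev
```

For example, updates generated and delivered instantly once per unit of time yield an average age of 0.5, the area of one sawtooth triangle per unit interval.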