Article

The Double-Sided Information Bottleneck Function

Related References

Note: only a partial list of references is shown here; download the original article for the complete reference information.
Article Mathematics, Applied

Distributed information-theoretic clustering

Georg Pichler et al.

Summary: This paper investigates a novel multi-terminal source coding setup motivated by the biclustering problem. Improved cardinality bounds enable a thorough study of the special case of a binary symmetric source, including the gap between the inner and outer bounds in that case. Additionally, a multiple-description extension of the CEO problem with a mutual information constraint is investigated, which surprisingly admits a tight single-letter characterization of the achievable region.

INFORMATION AND INFERENCE-A JOURNAL OF THE IMA (2022)

Article Computer Science, Artificial Intelligence

Understanding Convolutional Neural Networks With Information Theory: An Initial Exploration

Shujian Yu et al.

Summary: Using a recently proposed functional estimator of Rényi's α-entropy and its multivariate extension, information flow in convolutional neural networks (CNNs) can be measured directly. Three quantities are developed to analyze synergy and redundancy in convolutional-layer representations, validating two fundamental data-processing inequalities and revealing further properties of CNN training.

IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS (2021)

Article Computer Science, Information Systems

Information-Distilling Quantizers

Alankrita Bhatt et al.

Summary: This paper explores the design of a scalar quantizer that maximizes the mutual information between dependent random variables X and Y, a problem connected to the log-loss distortion criterion. It is shown that for binary X a constant fraction of the mutual information can be preserved with a bounded number of quantization levels, and a corresponding preservation guarantee is established for larger finite alphabets.

IEEE TRANSACTIONS ON INFORMATION THEORY (2021)

Article Computer Science, Artificial Intelligence

Distributed Variational Representation Learning

Inaki Estella-Aguerri et al.

Summary: The problem of distributed representation learning is investigated on information-theoretic grounds, focusing on a generalization of Tishby's Information Bottleneck method. Both discrete and vector Gaussian data models are studied, with the development of a variational bound and two algorithms for computing the optimal complexity-relevance tradeoff. Numerical results support the effectiveness of the proposed approaches and algorithms.

IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE (2021)

Editorial Material Physics, Multidisciplinary

On the Information Bottleneck Problems: Models, Connections, Applications and Information Theoretic Views

Abdellatif Zaidi et al.

ENTROPY (2020)

Article Computer Science, Information Systems

The Capacity Achieving Distribution for the Amplitude Constrained Additive Gaussian Channel: An Upper Bound on the Number of Mass Points

Alex Dytso et al.

IEEE TRANSACTIONS ON INFORMATION THEORY (2020)

Article Computer Science, Information Systems

Vector Gaussian CEO Problem Under Logarithmic Loss and Applications

Yigit Ugur et al.

IEEE TRANSACTIONS ON INFORMATION THEORY (2020)

Article Computer Science, Artificial Intelligence

Learning Representations for Neural Network-Based Classification Using the Information Bottleneck Principle

Rana Ali Amjad et al.

IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE (2020)

Article Engineering, Electrical & Electronic

Decoding Rate-Compatible 5G-LDPC Codes With Coarse Quantization Using the Information Bottleneck Method

Maximilian Stark et al.

IEEE OPEN JOURNAL OF THE COMMUNICATIONS SOCIETY (2020)

Article Computer Science, Information Systems

Collaborative Information Bottleneck

Matias Vera et al.

IEEE TRANSACTIONS ON INFORMATION THEORY (2019)

Article Mechanics

On the information bottleneck theory of deep learning

Andrew M. Saxe et al.

JOURNAL OF STATISTICAL MECHANICS-THEORY AND EXPERIMENT (2019)

Article Computer Science, Artificial Intelligence

Coarsely Quantized Decoding and Construction of Polar Codes Using the Information Bottleneck Method

Syed Aizaz Ali Shah et al.

ALGORITHMS (2019)

Article Statistics & Probability

Dictator Functions Maximize Mutual Information

Georg Pichler et al.

ANNALS OF APPLIED PROBABILITY (2018)

Article Computer Science, Information Systems

Which Boolean Functions Maximize Mutual Information on Noisy Inputs?

Thomas A. Courtade et al.

IEEE TRANSACTIONS ON INFORMATION THEORY (2014)

Article Computer Science, Information Systems

Multiterminal Source Coding Under Logarithmic Loss

Thomas A. Courtade et al.

IEEE TRANSACTIONS ON INFORMATION THEORY (2014)

Article Computer Science, Information Systems

On MMSE Crossing Properties and Implications in Parallel Vector Gaussian Channels

Ronit Bustin et al.

IEEE TRANSACTIONS ON INFORMATION THEORY (2013)

Article Computer Science, Theory & Methods

Transition points in the capacity-achieving distribution for the peak-power limited AWGN and free-space optical intensity channels

N. Sharma et al.

PROBLEMS OF INFORMATION TRANSMISSION (2010)

Article Computer Science, Information Systems

Coding and Common Reconstruction

Yossef Steinberg

IEEE TRANSACTIONS ON INFORMATION THEORY (2009)

Article Computer Science, Information Systems

Communication via decentralized processing

Amichai Sanderovich et al.

IEEE TRANSACTIONS ON INFORMATION THEORY (2008)

Article Computer Science, Information Systems

Achievable rates for pattern recognition

M. Brandon Westover et al.

IEEE TRANSACTIONS ON INFORMATION THEORY (2008)

Article Computer Science, Interdisciplinary Applications

Information Combining

Ingmar Land et al.

FOUNDATIONS AND TRENDS IN COMMUNICATIONS AND INFORMATION THEORY (2006)

Article Computer Science, Information Systems

Extremes of information combining

I. Sutskover et al.

IEEE TRANSACTIONS ON INFORMATION THEORY (2005)