Article

Information Bottleneck Analysis by a Conditional Mutual Information Bound

Journal

ENTROPY
Volume 23, Issue 8, Pages: -

Publisher

MDPI
DOI: 10.3390/e23080974

Keywords

conditional mutual information; information bottleneck; deep learning

Funding

  1. Shimadzu Science Foundation
  2. G-7 Scholarship Foundation
  3. Uehara Memorial Foundation
  4. JSPS KAKENHI [16K00228, 18KK0308]
  5. Grants-in-Aid for Scientific Research [16K00228, 18KK0308] Funding Source: KAKEN

Abstract

Task-nuisance decomposition describes why the information bottleneck loss I(z; x) − βI(z; y) is a suitable objective for supervised learning. The latent variables z predict the true category y for input x. When n is a nuisance independent of y, I(z; n) can be decreased by reducing I(z; x), since the latter upper-bounds the former. We extend this framework by demonstrating that the conditional mutual information I(z; x | y) provides an alternative upper bound for I(z; n). This bound is applicable even if z is not a sufficient representation of x, that is, I(z; y) ≠ I(x; y). We used mutual information neural estimation (MINE) to estimate I(z; x | y). Experiments demonstrated that I(z; x | y) is smaller than I(z; x) for layers closer to the input, consistent with the claim that the former is a tighter bound than the latter. Because of this difference, the information plane differs when I(z; x | y) is used instead of I(z; x).
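The chain of bounds the abstract relies on, I(z; n) ≤ I(z; x | y) ≤ I(z; x), can be checked exactly on a small discrete toy model (this is an illustrative sketch, not the paper's MINE-based neural estimator; the variables, distributions, and helper functions below are hypothetical choices for demonstration). Here x encodes a task bit y and a nuisance bit n, and z is a noisy copy of x:

```python
from itertools import product
from math import log2

# Hypothetical toy setup (for illustration only):
# y ~ Bernoulli(0.5) is the task label, n ~ Bernoulli(0.5) is a
# nuisance independent of y, x = 2*y + n encodes both bits, and the
# representation z keeps x with probability 0.8, else is uniform noise.
P_KEEP = 0.8

def p_z_given_x(z, x):
    # Noisy channel from input x to representation z over {0, 1, 2, 3}.
    return P_KEEP * (z == x) + (1 - P_KEEP) / 4

# Exact joint distribution p(y, n, x, z); x is deterministic in (y, n).
joint = {}
for y, n, z in product(range(2), range(2), range(4)):
    x = 2 * y + n
    joint[(y, n, x, z)] = 0.25 * p_z_given_x(z, x)

def marginal(names):
    """Marginal distribution over the named coordinates of (y, n, x, z)."""
    idx = {"y": 0, "n": 1, "x": 2, "z": 3}
    out = {}
    for k, p in joint.items():
        key = tuple(k[idx[v]] for v in names)
        out[key] = out.get(key, 0.0) + p
    return out

def mi(a, b):
    """Exact mutual information I(A; B) in bits between coordinate groups."""
    pab, pa, pb = marginal(a + b), marginal(a), marginal(b)
    return sum(p * log2(p / (pa[k[:len(a)]] * pb[k[len(a):]]))
               for k, p in pab.items() if p > 0)

I_zx = mi(["z"], ["x"])
I_zn = mi(["z"], ["n"])
# Conditional MI via the chain rule: I(z; x | y) = I(z; x, y) - I(z; y).
I_zx_y = mi(["z"], ["x", "y"]) - mi(["z"], ["y"])

print(f"I(z;x)   = {I_zx:.4f} bits")
print(f"I(z;x|y) = {I_zx_y:.4f} bits")
print(f"I(z;n)   = {I_zn:.4f} bits")
assert I_zn <= I_zx_y <= I_zx  # I(z;n) <= I(z;x|y) <= I(z;x)
```

In this construction the inequalities are strict: I(z; x | y) is noticeably smaller than I(z; x) because conditioning on the task label removes the task-relevant part of z's dependence on x, yet it still upper-bounds the nuisance information I(z; n), matching the abstract's claim that the conditional bound is tighter.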

