Article

A Wiener Causality Defined by Divergence

Journal

NEURAL PROCESSING LETTERS
Volume 53, Issue 3, Pages 1773-1794

Publisher

SPRINGER
DOI: 10.1007/s11063-019-10187-6

Keywords

Granger causality; Time series; Relative entropy causality; Transfer entropy; Bregman divergence

Funding

  1. National Natural Science Foundation of China [61673119]
  2. Key Program of the National Science Foundation of China [91630314]
  3. 111 Project [B18015]
  4. Key Project of Shanghai Science and Technology [16JC1420402]
  5. Shanghai Municipal Science and Technology Major Project [2018SHZDZX01]
  6. ZJ LAB


The study focuses on the fundamental task of discovering causal relationships when investigating the dynamics of complex systems. A novel definition of Wiener causality based on relative entropy is proposed, and it is argued that any Bregman divergence can be used for detecting causal relations. The discussion includes the benefits of different choices of divergence function for causal inference and the quality of the obtained causal models, with experimental evidence provided on how these causalities improve detection accuracy.
Discovering causal relationships is a fundamental task in investigating the dynamics of complex systems (Pearl in Stat Surv 3:96-146, 2009). Traditional approaches such as Granger causality or transfer entropy fail to capture all the interdependence among the statistical moments, which can lead to wrong causal conclusions. In a previous paper (Chen et al. in 25th international conference, ICONIP 2018, Siem Reap, Cambodia, proceedings, Part II, 2018), the authors proposed a novel definition of Wiener causality for measuring the intervention between time series based on relative entropy, providing an integrated description of statistical causal intervention. In this work, we show that relative entropy is a special case of an existing, more general divergence estimation. We argue that any Bregman divergence can be used to detect causal relations and that this in theory remedies the information-dropout problem. We discuss the benefits of various choices of divergence function for causal inference and the quality of the obtained causal models. As a byproduct, we also obtain a robustness analysis and show that RE causality achieves the fastest convergence rate among BD causalities. To substantiate our claims, we provide experimental evidence on how BD causalities improve detection accuracy.
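The abstract's claim that relative entropy is a special case of a more general divergence family can be illustrated with a small sketch. A Bregman divergence is defined by a strictly convex generator function phi as D_phi(p, q) = phi(p) - phi(q) - <grad phi(q), p - q>; choosing the negative-entropy generator recovers relative entropy (KL divergence), while the squared-norm generator recovers squared Euclidean distance. The code below is an illustrative sketch of these identities, not the authors' estimator; all function names are hypothetical.

```python
import numpy as np

def bregman_divergence(phi, grad_phi, p, q):
    """D_phi(p, q) = phi(p) - phi(q) - <grad phi(q), p - q>."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return phi(p) - phi(q) - np.dot(grad_phi(q), p - q)

# Generator phi(x) = sum_i x_i log x_i (negative entropy):
# for probability vectors, this recovers relative entropy (KL).
neg_entropy = lambda x: np.sum(x * np.log(x))
neg_entropy_grad = lambda x: np.log(x) + 1.0

# Generator phi(x) = ||x||^2: recovers squared Euclidean distance.
sq_norm = lambda x: np.dot(x, x)
sq_norm_grad = lambda x: 2.0 * x

p = np.array([0.2, 0.3, 0.5])
q = np.array([0.25, 0.25, 0.5])

kl = bregman_divergence(neg_entropy, neg_entropy_grad, p, q)
se = bregman_divergence(sq_norm, sq_norm_grad, p, q)
```

Here `kl` equals sum(p * log(p / q)) and `se` equals ||p - q||^2, confirming the two special cases of the Bregman family mentioned in the abstract.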

