Article

Learning Time-Varying Graphs From Online Data

Journal

IEEE OPEN JOURNAL OF SIGNAL PROCESSING
Volume 3, Pages 212-228

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/OJSP.2022.3178901

Keywords

Mathematical models; Topology; Covariance matrices; Network topology; Signal processing algorithms; Signal processing; Optimization; Graph topology identification; dynamic graph learning; network topology inference; graph signal processing


This work proposes an algorithmic framework to learn time-varying graphs from online data. The generality offered by the framework renders it model-independent, i.e., it can be theoretically analyzed in its abstract formulation and then instantiated under a variety of model-dependent graph learning problems. This is possible by phrasing (time-varying) graph learning as a composite optimization problem, where different functions regulate different desiderata, e.g., data fidelity, sparsity, or smoothness. Instrumental to the findings is recognizing that the dependence of the majority (if not all) of data-driven graph learning algorithms on the data is exerted through the empirical covariance matrix, which is a sufficient statistic for the estimation problem. Its user-defined recursive update enables the framework to work in non-stationary environments, while iterative algorithms building on novel time-varying optimization tools explicitly take the temporal dynamics into account, speeding up convergence and implicitly imposing a temporal regularization on the solution. We specialize the framework to three well-known graph learning models, namely, the Gaussian graphical model (GGM), the structural equation model (SEM), and the smoothness-based model (SBM), where we also introduce ad-hoc vectorization schemes for structured matrices (symmetric, hollow, etc.) which are crucial for correct gradient computations, besides enabling work in low-dimensional vector spaces and hence easing storage requirements. After discussing the theoretical guarantees of the proposed framework, we corroborate it with extensive numerical tests on synthetic and real data.
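The abstract highlights two concrete ingredients: a user-defined recursive update of the empirical covariance matrix for non-stationary data, and vectorization schemes for structured (symmetric, hollow) matrices. A minimal sketch of both is given below; the exponential forgetting factor `gamma` and the helper names (`update_covariance`, `hvec`, `hvec_inv`) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def update_covariance(sigma, x, gamma=0.95):
    # Exponentially weighted recursive covariance update: new samples are
    # weighted by (1 - gamma), older information decays geometrically,
    # which lets the estimate track a non-stationary environment.
    x = np.asarray(x, dtype=float).reshape(-1, 1)
    return gamma * sigma + (1.0 - gamma) * (x @ x.T)

def hvec(W):
    # Half-vectorize a hollow symmetric matrix (e.g. an adjacency matrix):
    # keep only the strictly lower-triangular entries, reducing an n x n
    # matrix to a vector of length n*(n-1)/2.
    i, j = np.tril_indices(W.shape[0], k=-1)
    return W[i, j]

def hvec_inv(w, n):
    # Rebuild the full hollow symmetric matrix from its half-vectorization.
    W = np.zeros((n, n))
    i, j = np.tril_indices(n, k=-1)
    W[i, j] = w
    return W + W.T
```

Optimizing over `hvec(W)` instead of `W` is what the abstract refers to as working in a low-dimensional vector space: symmetry and a zero diagonal are enforced by construction, so gradients are taken only with respect to the free entries.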

