Article

Compressed and Privacy-Sensitive Sparse Regression

Journal

IEEE TRANSACTIONS ON INFORMATION THEORY
Volume 55, Issue 2, Pages 846-866

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TIT.2008.2009605

Keywords

Capacity of multiple-antenna channels; compressed sensing; high-dimensional regression; lasso; l(1) regularization; privacy; sparsity

Funding

  1. National Science Foundation [CCF-0625879]

Abstract

Recent research has studied the role of sparsity in high-dimensional regression and signal reconstruction, establishing theoretical limits for recovering sparse models. This line of work shows that l(1)-regularized least squares regression can accurately estimate a sparse linear model from noisy examples in high dimensions. We study a variant of this problem where the original n input variables are compressed by a random linear transformation to m << n examples in p dimensions, and establish conditions under which a sparse linear model can be successfully recovered from the compressed data. A primary motivation for this compression procedure is to anonymize the data and preserve privacy by revealing little information about the original data. We characterize the number of projections that are required for l(1)-regularized compressed regression to identify the nonzero coefficients in the true model with probability approaching one, a property called sparsistence. We also show that l(1)-regularized compressed regression asymptotically predicts as well as an oracle linear model, a property called persistence. Finally, we characterize the privacy properties of the compression procedure, establishing upper bounds on the mutual information between the compressed and uncompressed data that decay to zero.
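The compressed regression procedure described above can be sketched in a few lines: generate noisy observations of a sparse linear model, compress both the design matrix and the responses with a random Gaussian projection, and fit an ℓ1-regularized (lasso) regression on the compressed data. This is an illustrative toy example, not the paper's experimental setup; the dimensions, noise level, regularization parameter `alpha`, and the 1/√m scaling of the projection are assumptions chosen for demonstration.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Illustrative dimensions (not from the paper): n examples in p dimensions,
# s nonzero coefficients, compressed down to m << n examples.
n, p, s, m = 400, 20, 3, 120

# Sparse ground-truth model and noisy observations y = X @ beta + noise.
beta = np.zeros(p)
beta[:s] = [2.0, -1.5, 1.0]
X = rng.standard_normal((n, p))
y = X @ beta + 0.1 * rng.standard_normal(n)

# Random Gaussian projection Phi (m x n); the 1/sqrt(m) scaling keeps the
# compressed columns at roughly the same scale as the original ones.
Phi = rng.standard_normal((m, n)) / np.sqrt(m)
X_c, y_c = Phi @ X, Phi @ y

# l1-regularized least squares on the compressed data only; the fitter
# never sees the original (X, y).
lasso = Lasso(alpha=0.05)
lasso.fit(X_c, y_c)

recovered = np.flatnonzero(np.abs(lasso.coef_) > 1e-3)
print("true support:     ", np.flatnonzero(beta))
print("recovered support:", recovered)
```

With a strong enough signal and m large relative to s log p, the recovered support coincides with the true one, which is the sparsistency property the paper formalizes; the mutual-information bounds concern how little (X, y) can be inferred from (X_c, y_c).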
