Journal
IMA JOURNAL OF NUMERICAL ANALYSIS
Volume 33, Issue 3, Pages 1029-1047
Publisher
OXFORD UNIV PRESS
DOI: 10.1093/imanum/drs019
Keywords
matrix balancing; Sinkhorn-Knopp algorithm; doubly stochastic matrix; conjugate gradient iteration
Funding
- Carnegie Trust for the Universities of Scotland
If a square non-negative matrix A has total support, then it can be balanced; that is, we can find a diagonal scaling of A whose row and column sums are all equal to one. A number of algorithms have been proposed to achieve the balancing, the best known being the Sinkhorn-Knopp algorithm. In this paper, we derive new algorithms based on inner-outer iteration schemes. We show that Sinkhorn-Knopp belongs to this family, but that other members can converge much more quickly. In particular, we show that while stationary iterative methods offer little or no improvement in many cases, a scheme using a preconditioned conjugate gradient method as the inner iteration converges at much lower cost (in terms of matrix-vector products) for a broad range of matrices, and succeeds in cases where the Sinkhorn-Knopp algorithm fails.
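To make the balancing problem concrete, the following is a minimal NumPy sketch of the classical Sinkhorn-Knopp iteration the abstract refers to: alternately rescale columns and rows until the scaled matrix diag(r) A diag(c) is (numerically) doubly stochastic. This is the standard textbook scheme, not the paper's faster inner-outer/preconditioned conjugate gradient variant; the function name and tolerances are illustrative choices.

```python
import numpy as np

def sinkhorn_knopp(A, tol=1e-10, max_iter=10000):
    """Classical Sinkhorn-Knopp balancing (illustrative sketch).

    Seeks diagonal scalings r, c so that P = diag(r) @ A @ diag(c)
    has all row and column sums equal to one. Convergence is
    guaranteed when the non-negative square matrix A has total
    support; this plain iteration can be slow, which is the
    motivation for the paper's inner-outer schemes.
    """
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    r = np.ones(n)
    c = np.ones(n)
    for _ in range(max_iter):
        # Scale columns to unit sums, then rows to unit sums.
        c = 1.0 / (A.T @ r)
        r = 1.0 / (A @ c)
        # Row sums of diag(r) A diag(c) are now exactly one by
        # construction; stop once the column sums are close to one too.
        col_sums = c * (A.T @ r)
        if np.max(np.abs(col_sums - 1.0)) < tol:
            break
    return r, c

# Example: balance a random strictly positive 4x4 matrix
# (strict positivity guarantees total support).
rng = np.random.default_rng(0)
A = rng.random((4, 4)) + 0.1
r, c = sinkhorn_knopp(A)
P = np.diag(r) @ A @ np.diag(c)
```

After convergence, both `P.sum(axis=0)` and `P.sum(axis=1)` are vectors of ones up to the tolerance. The paper's contribution is to reach the same fixed point with far fewer matrix-vector products by replacing this stationary inner iteration with a preconditioned conjugate gradient solve.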