About one year ago Gilbert Strang published a paper, Fast transforms: Banded matrices with banded inverses, in the Proceedings of the National Academy of Sciences (PNAS), with the following abstract:


It is unusual for both $A$ and $A^{-1}$ to be banded — but this can be a valuable property in applications. Block-diagonal matrices $F$ are the simplest examples; wavelet transforms are more subtle. We show that every example can be factored into $A = F_1 \dots F_N$ where $N$ is controlled by the bandwidths of $A$ and $A^{-1}$ (but not by their size, so this extends to infinite matrices and leads to new matrix groups).
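The block-diagonal case mentioned in the abstract is easy to check numerically. Here is a minimal numpy sketch (illustrative only, not from the paper): a block-diagonal matrix built from invertible 2x2 blocks is banded with bandwidth 1, and so is its inverse.

```python
import numpy as np

def bandwidth(M, tol=1e-12):
    """Largest |i - j| over entries with |M[i, j]| > tol."""
    idx = np.argwhere(np.abs(M) > tol)
    return int(np.abs(idx[:, 0] - idx[:, 1]).max()) if len(idx) else 0

# The simplest banded-with-banded-inverse example: a block-diagonal
# matrix F built from small invertible blocks (here one repeated 2x2 block).
B = np.array([[2.0, 1.0],
              [1.0, 2.0]])          # det = 3, invertible
F = np.kron(np.eye(4), B)           # 8x8 block-diagonal matrix
F_inv = np.linalg.inv(F)

print(bandwidth(F), bandwidth(F_inv))   # → 1 1: both F and F^{-1} are banded
```

The inverse of a block-diagonal matrix is block-diagonal (invert each block), which is why the bandwidth survives inversion here.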

Banded matrices (or band matrices) are somewhat related to implementations of discrete wavelet transforms (see for instance Short Wavelets and Matrix Dilation Equations, 1995, by G. Strang and V. Strela). The 2010 PNAS paper sparked some interest as a means to implement fast local linear filters in both the forward and inverse transforms (an **ascribed tandem**, a first anagram of **banded matrices**), by splitting matrices into ones that are faster to compute. In Unraveling the Matrix (A new way of analyzing grids of numbers known as matrices could improve signal-processing applications and data-compression schemes) at MIT News, Larry Hardesty states that:

In a paper published in the July 13 issue of the Proceedings of the National Academy of Sciences, MIT math professor Gilbert Strang describes a new way to split certain types of matrices into simpler matrices. The result could have implications for software that processes video or audio data, for compression software that squeezes down digital files so that they take up less space, or even for systems that control mechanical devices.

[...]

Richard Brualdi, the emeritus UWF Beckwith Bascom Professor of Mathematics at the University of Wisconsin-Madison, points out that a mathematical conjecture that Strang presents in the paper has already been proven by three other groups of researchers. “It’s a very interesting theorem,” says Brualdi. “It’s already generated a couple of papers, and it’ll probably generate some more.” Brualdi points out that large data sets, such as those generated by gene sequencing, medical imaging, or weather monitoring, often yield matrices with regular structures. Bandedness is one type of structure, but there are others, and Brualdi expects other mathematicians to apply techniques like Strang’s to other types of structured matrices. “Whether or not those things will work, I really don’t know,” Brualdi says. “But Gil’s already said that he’s going to look at a different structure in a future paper.”

The chronicle can be read at Dr. Dobb's, and here in pdf as well. News I care mentioned it briefly in Fast signal transform with Banded matrix method.

The blog of a **mandated scribe** (on mixed **banded matrices**) can be used as a note to self: I completely forgot to read that paper, which was openly available at the time. Thanks to an update at Gilbert Strang's pdf paper repository, a few related others are now shared in pdf:

Fast transforms: Banded matrices with banded inverses, Proc. National Academy of Sciences 107 (#28) (2010) 12413-12416.

Abstract above

Banded Matrices with Banded Inverses and A = LPU, Proceedings Intl. Congress of Chinese Mathematicians: ICCM2010, to appear.

If $A$ is a banded matrix with a banded inverse, then $A = BC = F_1 \dots F_N$ is a product of block-diagonal matrices. We review this factorization, in which the $F_i$ are tridiagonal and $N$ is independent of the matrix size. For a permutation with bandwidth $w$, each $F_i$ exchanges disjoint pairs of neighbors and $N < 2w$. This paper begins the extension to infinite matrices. For doubly infinite permutations, the factors $F$ now include the left and right shift. For banded infinite matrices, we discuss the triangular factorization $A = LPU$ (completed in a later paper on The Algebra of Elimination). Four directions for elimination give four factorizations $LPU$ and $UPL$ and $U_1 \pi U_2$ (Bruhat) and $L_1 \pi L_2$ with different $L$, $U$, $P$ and $\pi$.

Groups of banded matrices with banded inverses, Proc. AMS, to appear (2011)

A product $A = F_1 \dots F_N$ of invertible block-diagonal matrices will be banded with a banded inverse. We establish this factorization with the number $N$ controlled by the bandwidths $w$ and not by the matrix size $n$: when $A$ is an orthogonal matrix, or a permutation, or banded plus finite rank, the factors $F_i$ have $w = 1$ and generate that corresponding group. In the case of infinite matrices, conjectures remain open.
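The size-independence claimed in this abstract can be illustrated with a small numpy sketch (my own toy construction, not from the paper): two block-diagonal factors with offset blocks, as in a wavelet transform, give a banded $A$ whose inverse is also banded, with bandwidths fixed by the number of factors rather than by $n$.

```python
import numpy as np

def bandwidth(M, tol=1e-12):
    idx = np.argwhere(np.abs(M) > tol)
    return int(np.abs(idx[:, 0] - idx[:, 1]).max()) if len(idx) else 0

def factor(n, offset):
    """Identity with invertible 2x2 blocks laid down from row `offset`:
    a block-diagonal matrix of bandwidth 1."""
    F = np.eye(n)
    B = np.array([[2.0, 1.0], [1.0, 2.0]])   # det = 3, invertible
    for k in range(offset, n - 1, 2):
        F[k:k+2, k:k+2] = B
    return F

# A = F1 F2 with overlapping block structure (as in a wavelet transform):
# the bandwidths of A and A^{-1} are set by N = 2 factors, not by n.
for n in (8, 32, 128):
    A = factor(n, 0) @ factor(n, 1)
    print(n, bandwidth(A), bandwidth(np.linalg.inv(A)))   # bandwidths stay <= 2 for every n
```

Since $A^{-1} = F_2^{-1} F_1^{-1}$ and each $F_i^{-1}$ is again block-diagonal with bandwidth 1, both $A$ and $A^{-1}$ have bandwidth at most 2 regardless of $n$.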

Triangular factorizations: The algebra of elimination, submitted to SIAM Review (2011)

Elimination with only the necessary row exchanges will produce the triangular factorization $A = LPU$, with the (unique) permutation $P$ in the middle. The entries in $L$ are reordered in comparison with the more familiar $A = PLU$ (where $P$ is not unique). Elimination with three other starting points $(1, n)$, $(n, n)$ and $(n, 1)$ produces three more factorizations of $A$, including the Wiener-Hopf form $UPL$ and Bruhat's $U_1 \pi U_2$ with two upper triangular factors.

All these starting points are useless for doubly infinite matrices. The matrix has no first or last entry. When $A$ is banded and invertible, we look for a new way to establish $A = LPU$. First we locate the pivot rows (and the main diagonal of $A$). $LPU$ connects to the classical factorization of matrix polynomials developed for the periodic (block Toeplitz) case when $A(i,j) = A(i+b, j+b)$.

Only for anagrams' sake, these approaches may **reanimate BDDCs** (balancing domain decomposition by constraints), which may sound a **macabre distend** to **acerbated minds** walking on **candidate berms**.

Additional papers:

Factoring Permutation Matrices Into a Product of Tridiagonal Matrices (pdf) Michael Daniel Samson, Martianus Frederic Ezerman (July 2010)

Gilbert Strang posited that a permutation matrix of bandwidth $w$ can be written as a product of $N < 2w$ permutation matrices of bandwidth 1. A proof employing a greedy "parallel bubblesort" algorithm on the rows of the permutation matrix is detailed, and further points of interest are elaborated.
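The greedy "parallel bubblesort" idea can be sketched in a few lines (an illustrative reimplementation, not the authors' code): each pass swaps disjoint adjacent out-of-order pairs, and each such pass is itself a bandwidth-1 permutation, so the number of passes is the number of factors $N$.

```python
def factor_banded_permutation(p):
    """Greedily sort p with passes of disjoint adjacent transpositions.
    Each pass (a list of swap positions) is a bandwidth-1 permutation factor."""
    p = list(p)
    factors = []
    while p != sorted(p):
        swaps, i = [], 0
        while i < len(p) - 1:
            if p[i] > p[i + 1]:
                p[i], p[i + 1] = p[i + 1], p[i]
                swaps.append(i)
                i += 2        # disjoint swaps: skip the position just used
            else:
                i += 1
        factors.append(swaps)
    return factors

# A permutation of bandwidth w = 2 (|p[i] - i| <= 2 for all i).
p = [2, 0, 1, 4, 5, 3]
factors = factor_banded_permutation(p)
w = max(abs(v - i) for i, v in enumerate(p))
print(len(factors), 2 * w - 1)   # → 2 3: factor count within the 2w - 1 bound
```

Whether this particular greedy strategy always meets the $2w - 1$ bound is exactly what the Samson-Ezerman and Panova papers establish; the sketch only checks it on one example.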

Factorization Of Banded Permutations (pdf), Greta Panova (July 2010)

We prove a conjecture of Gilbert Strang stating that a banded permutation of bandwidth $w$ can be represented as a product of at most $2w-1$ permutations of bandwidth 1.

Approximating the inverse of banded matrices by banded matrices with applications to probability and statistics (pdf), Peter J. Bickel, Marko Lindner (February 2010)

In the first part of this paper we give an elementary proof of the fact that if an infinite matrix $A$, which is invertible as a bounded operator on $\ell^2$, can be uniformly approximated by banded matrices then so can the inverse of $A$. We give explicit formulas for the banded approximations of $A^{-1}$ as well as bounds on their accuracy and speed of convergence in terms of their band-width. In the second part we apply these results to covariance matrices $\Sigma$ of Gaussian processes and study mixing and beta mixing of processes in terms of properties of $\Sigma$. Finally, we note some applications of our results to statistics.

New words I learned today:

acerbated: embittered, resentful

berm: (1) a narrow shelf, path, or ledge typically at the top or bottom of a slope; also : a mound or wall of earth or sand (2) the shoulder of a road

distend: become wider; cause to expand, as if by internal pressure; swell from or as if from internal pressure

This is interesting. If you know $A$ and want to know $A^{-1}$, and know that both of them are sparse (banded here), then one might want to apply compressive sensing (CS) to this problem. Laurent Demanet has done some work on that, and I wonder how his bounds differ if one knows in advance that $A^{-1}$ is itself sparse.

Dear Igor,

Thank you for your comment. I am looking forward to reading such results. How do you think CS could be helpful with products of (banded) matrices? Other structured matrix shapes seem to be under investigation; some authors are concerned with binary banded matrices as well. Let's wait for more interesting results.