How does scaling the rows of a positive matrix to sum to 1 change the Perron vector?

Reposting from math.sx due to lack of response. Let $A$ be an $N\times N$ positive matrix, i.e. $A_{ij}>0$. By the Perron-Frobenius theorem, there is a unique positive left eigenvector corresponding to the largest eigenvalue that sums to 1, the Perron vector; call it $x$. Let $D$ be a diagonal matrix such that $B=DA$ … Read more
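A minimal numerical sketch of the setup, assuming $D$ is the diagonal matrix of reciprocal row sums (so that $B=DA$ is row-stochastic) and that "Perron vector" means the left eigenvector of the largest eigenvalue, normalized to sum to 1; the variable names follow the excerpt.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 5
A = rng.random((N, N)) + 0.1           # entrywise positive matrix

def perron_left(M):
    """Left Perron vector of a positive matrix, normalized to sum to 1."""
    vals, vecs = np.linalg.eig(M.T)    # left eigenvectors of M = right eigenvectors of M.T
    k = np.argmax(vals.real)           # Perron root has the largest real part
    v = np.abs(vecs[:, k].real)
    return v / v.sum()

D = np.diag(1.0 / A.sum(axis=1))       # scale each row to sum to 1
B = D @ A                              # B is row-stochastic

print("Perron vector of A:", perron_left(A))
print("Perron vector of B:", perron_left(B))   # in general differs from that of A
```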

Spectral radius of the product of a right stochastic matrix and a block diagonal matrix

Let us define the following matrix: $C=AB$, where $B$ is a block diagonal matrix with $N$ blocks $B_1, B_2, \dots, B_N$, each of dimension $M \times M$. I know that $B_k = I_M - \mu R_k$ with $R_k$ a Hermitian matrix and $\mu$ some positive constant. Moreover, I know that the entries … Read more
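A small sketch of the construction, assuming $A$ is a right (row-)stochastic matrix of the matching size $NM \times NM$; the particular Hermitian blocks $R_k$ used here are illustrative, since the excerpt only states that they are Hermitian.

```python
import numpy as np
from scipy.linalg import block_diag

rng = np.random.default_rng(1)
N, M, mu = 3, 2, 0.05

# Illustrative Hermitian blocks R_k
Rs = []
for _ in range(N):
    Z = rng.standard_normal((M, M)) + 1j * rng.standard_normal((M, M))
    Rs.append((Z + Z.conj().T) / 2)

# Block diagonal B with blocks B_k = I_M - mu * R_k
B = block_diag(*[np.eye(M) - mu * R for R in Rs])

# Right stochastic A of matching size: nonnegative rows summing to 1
A = rng.random((N * M, N * M))
A /= A.sum(axis=1, keepdims=True)

C = A @ B
print("spectral radius of C = AB:", max(abs(np.linalg.eigvals(C))))
```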

Efficiently calculate the trace of the product of two large symmetric matrices, one of which is an inverse

Sorry about the long title. I need to calculate the trace of $M(M+D)^{-1}$, where $M$ is a dense symmetric matrix and $D$ is a diagonal matrix. The main issue is that the dimension can be large (usually in the hundreds to thousands), so computing the inverse explicitly could be quite expensive, especially because the trace needs … Read more
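One standard way to avoid forming the explicit inverse, sketched below under the extra assumptions that $M$ is positive semidefinite and $D$ has positive diagonal entries (so that $M+D$ admits a Cholesky factorization): factor $M+D$ once and solve against the columns of $M$, using $\operatorname{tr}\big(M(M+D)^{-1}\big)=\operatorname{tr}\big((M+D)^{-1}M\big)$.

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

rng = np.random.default_rng(2)
n = 500

# Illustrative data: M symmetric positive semidefinite, D diagonal and positive
G = rng.standard_normal((n, n))
M = G @ G.T / n
D = np.diag(rng.random(n) + 0.5)

# Factor M + D once, then solve instead of inverting:
# tr(M (M+D)^{-1}) = tr((M+D)^{-1} M)
cf = cho_factor(M + D)
X = cho_solve(cf, M)            # X = (M+D)^{-1} M
print("trace via solves:", np.trace(X))

# Sanity check against the naive computation with an explicit inverse
print("naive trace:     ", np.trace(M @ np.linalg.inv(M + D)))
```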

How to enumerate a discrete group of matrices by their Frobenius norm?

Suppose I have a discrete group $G < \mathrm{SL}_2(\mathbb{C})$, and it is finitely generated by some known generators, that is, $G=\langle g_1,\dots,g_n\rangle$. The Frobenius norm of a matrix $m=\begin{pmatrix} a & b \\ c & d \end{pmatrix}$ is $\|m\| = \sqrt{|a|^2+|b|^2+|c|^2+|d|^2}$. The set of elements in $G$ having the same Frobenius norm is a discrete subset of a compact set, and therefore is finite. I’m interested in algorithms … Read more
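A minimal breadth-first sketch of the kind of enumeration being asked about, assuming the generators are given as 2×2 numpy arrays and that we only collect products of bounded word length whose Frobenius norm stays below a bound. A genuine enumeration of a discrete subgroup needs an exact equality test for group elements; the rounding-based deduplication here is only a heuristic.

```python
import numpy as np
from itertools import product

def enumerate_by_norm(gens, bound, max_len=8, tol=8):
    """BFS over words in the generators and their inverses, keeping
    elements with Frobenius norm <= bound (heuristic deduplication)."""
    gens = gens + [np.linalg.inv(g) for g in gens]
    seen = {tuple(np.round(np.eye(2), tol).ravel())}
    frontier = [np.eye(2)]
    found = [np.eye(2)]
    for _ in range(max_len):
        new_frontier = []
        for m, g in product(frontier, gens):
            h = m @ g
            if np.linalg.norm(h, 'fro') > bound:
                continue
            key = tuple(np.round(h, tol).ravel())
            if key in seen:
                continue
            seen.add(key)
            new_frontier.append(h)
            found.append(h)
        frontier = new_frontier
    return found
```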

An operator derived from the divided difference operator $\partial_{w_0}$

Some main definitions and basic facts about divided differences: in the symmetric group $S_n$, let $s_i$ denote the adjacent transposition $(i,i+1)$, $i\in \{1,2,\dots,n-1\}$. Since $S_n$ is generated by adjacent transpositions, any permutation $w\in S_n$ can be written as $w=s_{i_1}s_{i_2}\cdots s_{i_l}$; an expression $w=s_{i_1}s_{i_2}\cdots s_{i_l}$ of minimal possible length $l$ is called a reduced decomposition, and $l=\ell(w)$ … Read more
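For orientation, since the excerpt is cut off before the operators themselves appear, the standard definitions of the divided difference operators are as follows.

```latex
% Divided difference operators acting on polynomials in x_1, ..., x_n
\[
  \partial_i f \;=\; \frac{f - s_i f}{x_i - x_{i+1}},
  \qquad
  (s_i f)(x_1,\dots,x_i,x_{i+1},\dots,x_n) = f(x_1,\dots,x_{i+1},x_i,\dots,x_n),
\]
\[
  \partial_w \;=\; \partial_{i_1}\partial_{i_2}\cdots\partial_{i_l}
  \quad\text{for any reduced decomposition } w = s_{i_1}s_{i_2}\cdots s_{i_l},
\]
% This composition is independent of the chosen reduced word;
% w_0 denotes the longest element of S_n.
```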

Non-symmetric weakly diagonally dominant matrix, no decoupling: (a) is it positive semi-definite? (b) is dim(ker) = 1?

We are considering a matrix $A=(a_{ij})_{i,j=1,\dots,d}\in\mathbb{R}^{d\times d}$ with the following property: $a_{ii}=-\sum_{j\neq i}a_{ij}$, i.e., the matrix is not only weakly diagonally dominant, but its rows also sum to 0. Note that the matrix is not necessarily symmetric (otherwise positive semi-definiteness would follow immediately from the weak diagonal dominance). Furthermore, the matrix has the property that there are no … Read more
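A quick numerical sanity check of the two questions on a random instance, under the additional assumption (not stated in the excerpt, but consistent with weak diagonal dominance plus zero row sums) that the off-diagonal entries are nonpositive, as for a directed graph Laplacian.

```python
import numpy as np

rng = np.random.default_rng(3)
d = 6

# Random instance: nonpositive off-diagonal entries (assumption), diagonal
# chosen so that every row sums to 0.
A = -rng.random((d, d)) * (rng.random((d, d)) < 0.7)
np.fill_diagonal(A, 0.0)
np.fill_diagonal(A, -A.sum(axis=1))

print("rows sum to 0:", np.allclose(A.sum(axis=1), 0.0))
# (a) positive semi-definiteness of the quadratic form, via the symmetric part
print("smallest eigenvalue of (A + A^T)/2:", np.linalg.eigvalsh((A + A.T) / 2).min())
# (b) kernel dimension, via the rank
print("dim ker A:", d - np.linalg.matrix_rank(A))
```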

Fast matrix-vector product for structured matrices

Let $X\in\mathbb{C}^{m\times n}$ be a matrix that satisfies the Sylvester equation $$AX-XB = F,\qquad A\in\mathbb{C}^{m\times m}, \quad B\in\mathbb{C}^{n\times n},$$ where $F\in\mathbb{C}^{m\times n}$ is of low rank. In special cases, as shown in http://math.mit.edu/~plamen/files/mds.pdf, there are known algorithms for a fast matrix-vector product, i.e., computing $X\underline{v}$ in quasilinear complexity. These special cases include Toeplitz-like, Hankel-like, Cauchy-like, … Read more
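A small sketch of the simplest structured case, assuming (for illustration only) that $A$ and $B$ are diagonal with disjoint spectra and $F=uv^{T}$ has rank one; then $X_{ij}=F_{ij}/(a_i-b_j)$ is Cauchy-like, which is the structure the fast (FMM-based) matrix-vector products in the linked paper exploit. Here the product is formed directly, without the fast algorithm.

```python
import numpy as np
from scipy.linalg import solve_sylvester

rng = np.random.default_rng(4)
m, n = 6, 5

# A, B diagonal with disjoint spectra; F of rank one (illustrative assumptions)
a = rng.random(m) + 2.0
b = -(rng.random(n) + 2.0)
A, B = np.diag(a), np.diag(b)
u, v = rng.standard_normal(m), rng.standard_normal(n)
F = np.outer(u, v)

# AX - XB = F with diagonal A, B gives the Cauchy-like matrix X_ij = F_ij / (a_i - b_j)
X = F / (a[:, None] - b[None, :])
w = rng.standard_normal(n)

print(np.allclose(A @ X - X @ B, F))                         # X solves the equation
print(np.allclose(X @ w, solve_sylvester(A, -B, F) @ w))     # matvec matches a direct solve
```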

Matrices with almost constant coefficients have a simple eigenvalue

As a by-product of a general result for bounded operators on a Banach space, I have the following: a matrix $L=(\ell_{ij})_{ij}$ that has almost constant coefficients, in the sense that for some $c$ and every row $i$ it holds that \begin{equation} \frac{1}{n}\sum_k \lvert \ell_{ik}-c\rvert\le \frac{|c|}{6}, \end{equation} must have a simple eigenvalue. Under a slightly stronger … Read more
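A quick numerical illustration of the claim, assuming the natural reading of the condition: perturb the constant matrix $cJ$ ($J$ the all-ones matrix) by entries small enough that every row satisfies $\frac{1}{n}\sum_k|\ell_{ik}-c|\le |c|/6$, and look at the spectrum; one eigenvalue sits near $nc$, well separated from the rest.

```python
import numpy as np

rng = np.random.default_rng(5)
n, c = 8, 1.0

# Perturbation with entries in [-|c|/6, |c|/6], so every row satisfies the condition
E = rng.uniform(-1.0, 1.0, size=(n, n)) * (c / 6)
L = c * np.ones((n, n)) + E
assert np.all(np.abs(E).mean(axis=1) <= abs(c) / 6)

eigs = np.sort_complex(np.linalg.eigvals(L))
print(eigs)   # one eigenvalue near n*c = 8, isolated from the remaining cluster
```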

Matrices in $SL(2,\mathbb{C})$ with characteristic polynomial defined over a subring

Let $R\subset \mathbb{C}$ be a subring, and let $A,B\in SL(2,\mathbb{C})$ be matrices such that $A$, $B$, $AB$ all have trace in $R$. For which $R$ can we then deduce that $A,B$ are simultaneously conjugate to a pair of matrices in $SL(2,R)$? In particular I’m wondering about $R=\mathbb{Z},\mathbb{Q}$, or a general number field. (Each of $A,B$ is conjugate to a … Read more

A sum of Ramanujan sums

I have the following question about Ramanujan sums. (All vectors and matrices here will be understood to have integer entries.) Let $X_q=\{(x_1,\dots,x_R)\mid 1\le x_i\le q\}$ and let, for any $R\times R$ matrix $B$ with rows $b_i$, $$c_q(Bx):=c_q(b_1\cdot x)\cdots c_q(b_R\cdot x),$$ where $c_q(n)$ is Ramanujan’s sum. Suppose an $R\times R$ matrix $A$ with determinant $D$ is given and denote by $I$ the identity matrix. … Read more
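For reference, a short sketch computing Ramanujan's sum $c_q(n)=\sum_{1\le k\le q,\ \gcd(k,q)=1}e^{2\pi i kn/q}$ directly from the definition, and the product $c_q(Bx)=c_q(b_1\cdot x)\cdots c_q(b_R\cdot x)$ from the excerpt; the helper names are ours, not the poster's.

```python
from math import gcd, pi
from cmath import exp

def ramanujan_sum(q, n):
    """c_q(n): sum of e^{2*pi*i*k*n/q} over 1 <= k <= q with gcd(k, q) = 1."""
    s = sum(exp(2j * pi * k * n / q) for k in range(1, q + 1) if gcd(k, q) == 1)
    return round(s.real)            # c_q(n) is always an integer

def c_q_of_Bx(q, B, x):
    """The product c_q(b_1 . x) * ... * c_q(b_R . x), with B a list of integer
    rows and x an integer vector."""
    prod = 1
    for row in B:
        prod *= ramanujan_sum(q, sum(bi * xi for bi, xi in zip(row, x)))
    return prod

print(ramanujan_sum(10, 4))                         # -1
print(c_q_of_Bx(6, [[1, 0], [0, 1]], [2, 3]))       # c_6(2) * c_6(3)
```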