## A sum of Ramanujan sums

I have the following question about Ramanujan sums. (All vectors and matrices here are understood to have integer entries.) Let $X_q=\{(x_1,\dots,x_R)\mid 1\le x_i\le q\}$ and, for any $R\times R$ matrix $B$ with rows $b_i$, let $c_q(Bx):=c_q(b_1\cdot x)\cdots c_q(b_R\cdot x)$, where $c_q(n)$ is Ramanujan’s sum. Suppose an $R\times R$ matrix $A$ with determinant $D$ is given, and denote by $I$ the identity matrix. … Read more
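For readers who want to experiment numerically, here is a minimal pure-Python sketch (function names mine) of Ramanujan's sum via its defining exponential sum $c_q(n)=\sum_{\substack{1\le a\le q\\ \gcd(a,q)=1}}e^{2\pi i a n/q}$, together with the product $c_q(Bx)$ defined above:

```python
import cmath
import math

def ramanujan_sum(q, n):
    """c_q(n): sum of e^{2*pi*i*a*n/q} over 1 <= a <= q with gcd(a, q) = 1."""
    total = sum(cmath.exp(2j * math.pi * a * n / q)
                for a in range(1, q + 1) if math.gcd(a, q) == 1)
    return round(total.real)  # c_q(n) is always an integer

def cq_of_Bx(q, B, x):
    """c_q(Bx) = c_q(b_1 . x) * ... * c_q(b_R . x) for the rows b_i of B."""
    result = 1
    for row in B:
        dot = sum(b * xi for b, xi in zip(row, x))
        result *= ramanujan_sum(q, dot)
    return result
```

For instance, `ramanujan_sum(5, 0)` returns `4` (which is $\varphi(5)$) and `ramanujan_sum(4, 2)` returns `-2`. This direct summation is only meant for small $q$; for serious computation one would use the closed form $c_q(n)=\mu(q/d)\,\varphi(q)/\varphi(q/d)$ with $d=\gcd(n,q)$.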

## An upper bound on the Jordan condition number of a matrix

The Jordan condition number of a matrix $A$ is defined to be $\min_{V}\kappa(V)$, where $V$ ranges over the invertible complex matrices satisfying $A = VJV^{-1}$, $J$ being the Jordan normal form of $A$. (Here $\kappa$ is the condition number, i.e., $\kappa(V) = \|V\|\,\|V^{-1}\|$.) Clearly, if $A$ is normal then the Jordan condition number of … Read more
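As a concrete toy illustration (my own example, using the Frobenius norm): for a diagonalizable $A$, any particular eigenvector matrix $V$ with $A=VJV^{-1}$ yields the upper bound $\kappa(V)$ on the Jordan condition number, since the definition takes a minimum.

```python
import math

def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def frobenius(A):
    return math.sqrt(sum(a * a for row in A for a in row))

# A = V J V^{-1} with J = diag(1, 2); V and its inverse written by hand.
A = [[1, 1], [0, 2]]
J = [[1, 0], [0, 2]]
V = [[1, 1], [0, 1]]
V_inv = [[1, -1], [0, 1]]

assert matmul(matmul(V, J), V_inv) == A  # the similarity really holds

# Frobenius condition number of V: approximately 3, an upper bound on
# the Jordan condition number of A (in this norm).
kappa = frobenius(V) * frobenius(V_inv)
```

The true minimum over all admissible $V$ can of course be smaller; computing it is exactly the difficulty the question asks about.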

## Diagonalization of real symmetric matrices with symplectic matrices

Consider the following real symmetric $2n\times 2n$ matrix $M=\begin{bmatrix}A & B\\ B^T & D\end{bmatrix}$, where $A$ and $D$ are real symmetric $n\times n$ matrices and $B$ is a real $n\times n$ matrix, not necessarily symmetric. I am interested in diagonalizing $M$ with a symplectic matrix $R$ satisfying $R^TJR=J$, where $J=\begin{bmatrix}0 & I\\ -I & 0\end{bmatrix}$, such that $R^TMR$ is block-diagonal. In particular, I am interested in an algorithm that … Read more

## How to compute an indefinite generalisation of QR decomposition

Given an arbitrary complex matrix $M$ and a real, diagonal, but possibly indefinite matrix $\Delta$, the problem is to solve the following system of equations: $$M^*\Delta M=LD^2L^*,\qquad M=QDL^*$$ for lower unitriangular $L$, diagonal $D$, and $Q$. The running time should be measured in terms of the number of multiplications. Notice that when $\Delta$ is the identity matrix, this … Read more
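To make the $\Delta = I$ special case concrete: there $M^*\Delta M = M^*M$, so the first equation is an LDL-style Cholesky factorization of $M^*M$, and $Q = M L^{-*} D^{-1}$ comes out unitary. A hedged pure-Python sketch for a real $2\times 2$ example (names and the explicit triangular inverse are mine):

```python
def transpose(A):
    return [list(col) for col in zip(*A)]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def ldl(A):
    """A = L diag(d) L^T for symmetric positive definite A, L unit lower triangular."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    d = [0.0] * n
    for j in range(n):
        d[j] = A[j][j] - sum(L[j][k] ** 2 * d[k] for k in range(j))
        L[j][j] = 1.0
        for i in range(j + 1, n):
            L[i][j] = (A[i][j] - sum(L[i][k] * L[j][k] * d[k]
                                     for k in range(j))) / d[j]
    return L, d

# Delta = I: factor M^T M = L D^2 L^T, then Q = M L^{-T} D^{-1} is orthogonal.
M = [[2.0, 1.0], [1.0, 3.0]]
L, d = ldl(matmul(transpose(M), M))
D = [x ** 0.5 for x in d]
LT_inv = [[1.0, -L[1][0]], [0.0, 1.0]]          # inverse of L^T, 2x2 case
Q0 = matmul(M, LT_inv)
Q = [[Q0[i][j] / D[j] for j in range(2)] for i in range(2)]

QtQ = matmul(transpose(Q), Q)                    # should be the identity
```

For indefinite $\Delta$ the diagonal $d$ can contain negative entries, so $D$ cannot simply be the real square root and $Q$ is no longer unitary in the usual sense; that is precisely where the generalisation asked for becomes nontrivial.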

## Is it hard to decide whether a matrix is a square of another matrix?

According to the well-known quadratic residue (QR) theory over the integers, it is hard to decide whether a given integer $m\in\mathbb{Z}_N$ is a quadratic residue (i.e., the square of another integer $x\in\mathbb{Z}_N$) without knowing the factorization of $N$. Now, my question is: without knowing the factorization of $N$, is it hard to decide … Read more
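For intuition only: the decision problem is trivial by brute force for tiny $N$; the presumed hardness concerns large $N$ of unknown factorization, where trying all residues is exponential in the bit length of $N$. A throwaway sketch:

```python
def is_quadratic_residue(m, N):
    """Brute-force test whether m is a square modulo N.

    Exponential in the bit length of N, so only feasible for tiny moduli;
    this is exactly why the problem is interesting for large N.
    """
    return any(x * x % N == m % N for x in range(N))

print(is_quadratic_residue(4, 15))  # True:  2^2 = 4 (mod 15)
print(is_quadratic_residue(2, 3))   # False: squares mod 3 are {0, 1}
```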

## LU decomposition for orthogonal or unitary matrices?

Are there any references on LU decomposition for orthogonal or unitary matrices? It seems to me that the diagonal entries of $U$ have some nice structure related to the Euler angles of the original matrix, as one can easily see from an Euler parametrisation: $$\begin{bmatrix}\cos\theta&\sin\theta\\-\sin\theta&\cos\theta\end{bmatrix}=\begin{bmatrix}1&0\\-\tan\theta&1\end{bmatrix}\begin{bmatrix}\cos\theta&\sin\theta\\0&1/\cos\theta\end{bmatrix}.$$ And for the $3\times 3$ case, the diagonal entries for … Read more
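A quick numerical check of this $2\times 2$ factorization (note the sign: the subdiagonal entry of $L$ must be $-\tan\theta$ to reproduce the $-\sin\theta$ entry of the rotation):

```python
import math

theta = 0.7  # any angle with cos(theta) != 0

# The rotation matrix and its claimed LU factors.
R = [[math.cos(theta), math.sin(theta)],
     [-math.sin(theta), math.cos(theta)]]
L = [[1.0, 0.0],
     [-math.tan(theta), 1.0]]
U = [[math.cos(theta), math.sin(theta)],
     [0.0, 1.0 / math.cos(theta)]]

# Multiply L*U and compare entrywise with R.
LU = [[sum(L[i][k] * U[k][j] for k in range(2)) for j in range(2)]
      for i in range(2)]
assert all(abs(LU[i][j] - R[i][j]) < 1e-12
           for i in range(2) for j in range(2))
```

The check also makes visible why the diagonal of $U$ is $(\cos\theta,\ 1/\cos\theta)$: the product of the diagonal entries must equal $\det R = 1$.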

## Calculating the dimension of the algebra generated by some given matrices

Let $X_1,X_2,\dots,X_d$ be $n\times n$ matrices over some field $K$. I want to calculate the dimension of the unital algebra generated by $X_1,X_2,\dots,X_d$ for some examples in a problem I am working on. If $d=1$, we only have one $X$, and the dimension equals the degree of the minimal polynomial of $X$. For $d>1$, … Read more
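A straightforward (if naive) way to compute this dimension is plain linear algebra: flatten matrices to vectors in $K^{n^2}$, start from the identity, and close the span under right multiplication by the generators; the resulting span contains every word in the $X_i$. A pure-Python sketch over $\mathbb{Q}$ (my own unoptimized implementation):

```python
from fractions import Fraction

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def reduce_vec(pivots, vec):
    """Reduce vec against echelon vectors keyed by their leading index."""
    vec = list(vec)
    for i in range(len(vec)):
        if vec[i] != 0 and i in pivots:
            c = vec[i]
            vec = [v - c * b for v, b in zip(vec, pivots[i])]
    return vec

def algebra_dimension(gens, n):
    """Dimension over Q of the unital algebra generated by n x n matrices."""
    gens = [[[Fraction(x) for x in row] for row in g] for g in gens]
    identity = [[Fraction(int(i == j)) for j in range(n)] for i in range(n)]
    pivots = {}            # leading index -> normalized flattened vector
    queue = [identity]
    while queue:
        mat = queue.pop()
        vec = reduce_vec(pivots, [x for row in mat for x in row])
        lead = next((i for i, v in enumerate(vec) if v != 0), None)
        if lead is None:
            continue       # linearly dependent on what we already have
        pivots[lead] = [v / vec[lead] for v in vec]
        queue.extend(matmul(mat, g) for g in gens)  # extend the words
    return len(pivots)
```

For example, a single nilpotent generator $X=E_{12}$ gives dimension 2 (span of $I$ and $X$), while the pair $E_{12}, E_{21}$ generates all of $M_2(\mathbb{Q})$, dimension 4. The loop terminates because the dimension is bounded by $n^2$; the cost is polynomial in $n$ and the number of basis extensions.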

## Does $R$ being Dedekind-finite imply $\mathbb{M}_n(R)$ is Dedekind-finite?

Following Lam’s notation, a ring (with identity) $R$ is called Dedekind-finite if $ab=1 \iff ba=1$ in $R$. There are a lot of results about left invertibility implying right invertibility, but they all require some finiteness property on the ring or the matrix ring. I am asking for a proof or a counterexample of the following: that $R$ is … Read more

## A Handbook of Matrix Factorizations

I am looking for a good collection of facts about the various types of matrix factorizations, something like a “Handbook of Matrix Factorizations” or a very thorough review paper. I am hoping for a cohesive collection that covers many different types of matrix factorizations, the conditions under which they can be done, their properties, and their … Read more

## Generalization of Jordan’s lemma: $A^2=B^2=I$ can be 2-block diagonalized

One of Jordan’s lemmas states that if two orthogonal matrices $A,B$ satisfy $A^2=B^2=I$, then they can be simultaneously block-diagonalized with blocks of size at most 2. (The proof is easy: take $x$ an eigenvector of $A+B$, set $y=Ax$, show that $V=\operatorname{span}(x,y)$ is stable under $A$ and $B$, and repeat over the orthogonal complement of $V$.) Can this be generalized? … Read more