Proving the following inequality (positive def. matrix)

I’m trying to prove (or disprove) the following:

\sum_{i=1}^{N}\sum_{j=1}^{N} c_{i} c_{j} K_{ij} \geq 0,

where c \in \mathbb{R}^N, and K_{ij} = K(x_i, x_j) is a kernel matrix built from sample points x_1, \ldots, x_N. Here, each x_i \in \mathbb{R}^N has nonnegative entries (x_i \geq 0 componentwise).

I’m basically trying to prove that K_{ij} is a positive definite matrix, so I can use it as a kernel, but I’m really stuck on how to handle the \max in the denominator.

Edit: the function I’m referring to is:

K(u,v) = \frac{\sum_{k=1}^{N} \min(u_{k}, v_{k})}{\sum_{k=1}^{N} \max(u_{k}, v_{k})}
where u, v \in \mathbb{R}^N with u, v \geq 0 componentwise.
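Before diving into a proof, the claim can be checked numerically. The sketch below (using numpy, with hypothetical random nonnegative sample points) builds the Gram matrix for this min/max kernel and inspects its eigenvalues; a symmetric matrix is positive semidefinite exactly when all of them are nonnegative.

```python
import numpy as np

rng = np.random.default_rng(0)

def min_max_kernel(u, v):
    # K(u, v) = sum_k min(u_k, v_k) / sum_k max(u_k, v_k)
    return np.minimum(u, v).sum() / np.maximum(u, v).sum()

# Hypothetical random nonnegative sample points x_1, ..., x_N in R^n
N, n = 8, 5
X = rng.random((N, n))

# Gram matrix K_ij = K(x_i, x_j)
K = np.array([[min_max_kernel(X[i], X[j]) for j in range(N)]
              for i in range(N)])

# Symmetric K is positive semidefinite iff all eigenvalues are >= 0
eigvals = np.linalg.eigvalsh(K)
print(eigvals.min() >= -1e-10)  # True
```

Note that K(x, x) = 1 for every nonzero x, so the diagonal of the Gram matrix is all ones; such a numerical check only gives evidence for particular samples, which is what motivates the proof below.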


Fix x_i \in \mathbb{R}^n, i = 1, 2, \ldots, N. We will assume without loss of generality that no x_i is identically 0. Define N \times N matrices A and B by A_{ij} = \sum_{k=1}^n \min(x_{i(k)}, x_{j(k)}) and B_{ij} = \sum_{k=1}^n \max(x_{i(k)}, x_{j(k)}), where x_{(k)} denotes the kth coordinate of x. Note that K = A \odot B^{\odot-1}, where \odot denotes the Hadamard (entrywise) product and B^{\odot-1} is the Hadamard inverse (entrywise reciprocal) of B. By the Schur product theorem, it suffices to show that A and B^{\odot-1} are positive definite. We will also use the fact that a positive linear combination of positive definite matrices is positive definite.
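The factorization K = A \odot B^{\odot-1} is easy to confirm numerically. A short sketch (numpy, hypothetical random data) builds A and B and checks that their entrywise quotient reproduces the kernel matrix computed directly from the definition:

```python
import numpy as np

rng = np.random.default_rng(1)
N, n = 6, 4
X = rng.random((N, n))  # rows are hypothetical nonnegative points x_1, ..., x_N

# A_ij = sum_k min(x_i(k), x_j(k)),  B_ij = sum_k max(x_i(k), x_j(k))
A = np.minimum(X[:, None, :], X[None, :, :]).sum(axis=2)
B = np.maximum(X[:, None, :], X[None, :, :]).sum(axis=2)

# Hadamard product of A with the Hadamard inverse of B
K = A / B

# Direct evaluation of K(x_i, x_j) from the kernel's definition
K_direct = np.array([[np.minimum(X[i], X[j]).sum() / np.maximum(X[i], X[j]).sum()
                      for j in range(N)] for i in range(N)])
print(np.allclose(K, K_direct))  # True
```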

To show that A is positive definite, note that A can be written as the sum A = \sum_{k=1}^n A_k with (A_k)_{ij} = \min(x_{i(k)}, x_{j(k)}). It suffices to show that each A_k is positive definite; the argument is identical for every k, so we treat A_1. For i \in [N], let y_i = x_{i(1)}. By conjugating A_1 by a permutation matrix, we may assume without loss of generality that y_1 \leq y_2 \leq \ldots \leq y_N. For i \in [N], let f_i \in \mathbb{R}^N denote the vector with f_{i(j)} = 0 for j < i and f_{i(j)} = 1 for j \geq i. Then, setting y_0 = 0,
\begin{equation}A_1 = \sum_{i=1}^N (y_i - y_{i-1}) f_i f_i^t \geq 0.\end{equation}
Indeed, (f_i f_i^t)_{jk} = 1 exactly when j, k \geq i, so the (j, k) entry of the sum telescopes to \sum_{i \leq \min(j, k)} (y_i - y_{i-1}) = y_{\min(j, k)} = \min(y_j, y_k), and each term (y_i - y_{i-1}) f_i f_i^t is a nonnegative multiple of a rank-one positive semidefinite matrix.
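This rank-one decomposition of a min-matrix can be verified numerically. The sketch below (numpy, hypothetical sorted nonnegative values y_1 \leq \ldots \leq y_N) rebuilds A_1 from the vectors f_i and compares with the direct entrywise minimum:

```python
import numpy as np

rng = np.random.default_rng(2)
N = 7
y = np.sort(rng.random(N))       # hypothetical sorted values y_1 <= ... <= y_N

# Direct construction: (A_1)_jk = min(y_j, y_k)
A1 = np.minimum.outer(y, y)

# Telescoping weights y_i - y_{i-1}, with y_0 = 0
diffs = np.diff(np.concatenate(([0.0], y)))

# Rank-one sum: A_1 = sum_i (y_i - y_{i-1}) f_i f_i^t
recon = np.zeros((N, N))
for i in range(N):
    f = np.zeros(N)
    f[i:] = 1.0                  # f_i: zeros before position i, ones after
    recon += diffs[i] * np.outer(f, f)

print(np.allclose(recon, A1))  # True
```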

We now show that B^{\odot-1} is positive definite. By scaling, we may assume that x_{i(j)} \in [0, 1/n] for all i and j, so that B_{ij} \in (0, 1] (positive because no x_i is identically 0). Using the identity 1/x = \sum_{m=0}^{\infty}(1-x)^m, valid for x \in (0, 2), we may write B^{\odot-1} = J + \sum_{m=1}^{\infty} (J-B)^{\odot m}, where J = f_1 f_1^t denotes the all-ones matrix.
\begin{equation}(J-B)_{ij} = 1 - \sum_{k=1}^n \max(x_{i(k)}, x_{j(k)}) = \sum_{k=1}^n \min\left(\frac{1}{n} - x_{i(k)}, \frac{1}{n} - x_{j(k)}\right).\end{equation}
The argument above showing that A is positive definite now shows that J-B is positive definite (replace x_i with x_i' = \frac{1}{n}f_1 - x_i, where f_1 here denotes the all-ones vector in \mathbb{R}^n). Finally, the Schur product theorem makes each Hadamard power (J-B)^{\odot m} positive definite, and closedness under positive linear combinations shows that B^{\odot-1} is positive definite.
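Both the min-identity for J-B and the entrywise Neumann series can be checked numerically. The sketch below (numpy, hypothetical data scaled so every coordinate lies in (0, 1/n], hence B_{ij} \in (0, 1]) truncates the series and compares it with the entrywise reciprocal of B:

```python
import numpy as np

rng = np.random.default_rng(3)
N, n = 5, 3
# Hypothetical scaled data: every coordinate in [0.05, 1/n], so B_ij in (0, 1]
X = rng.uniform(0.05, 1.0 / n, size=(N, n))

B = np.maximum(X[:, None, :], X[None, :, :]).sum(axis=2)
J = np.ones((N, N))

# (J - B)_ij = sum_k min(1/n - x_i(k), 1/n - x_j(k))
Y = 1.0 / n - X
JB_check = np.minimum(Y[:, None, :], Y[None, :, :]).sum(axis=2)
print(np.allclose(J - B, JB_check))  # True

# Entrywise Neumann series: 1/B_ij = sum_{m>=0} (1 - B_ij)^m for B_ij in (0, 2);
# the m = 0 term is the all-ones matrix J
series = sum((J - B) ** m for m in range(500))
print(np.allclose(series, 1.0 / B, atol=1e-8))  # True
```

With the coordinates bounded away from 0, the entries of J-B stay below 1, so 500 terms are far more than enough for the truncation error to be negligible.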

Source: Link, Question Author: Norhther, Answer Author: Anand
