Proving the following inequality (positive def. matrix)

I’m trying to prove (or disprove) the following:

$$\sum_{i=1}^{N} \sum_{j=1}^{N} c_i c_j K_{ij} \geq 0$$
where $c \in \mathbb{R}^N$, and $K_{ij}$ refers to a kernel matrix:

$$K_{ij} = K(x_i, x_j) = \frac{\sum_{k=1}^{N} \min(x_{ik}, x_{jk})}{\sum_{k=1}^{N} \max(x_{ik}, x_{jk})}$$
Here, $x \in \mathbb{R}^N$, $x \geq 0$.

I’m basically trying to prove that $K_{ij}$ is a positive definite matrix, so I can use it as a kernel, but I’m really stuck trying to work with $\max$.

Edit: the function I’m referring to is:

$$K(u,v) = \frac{\sum_{k=1}^{N} \min(u_{k}, v_{k})}{\sum_{k=1}^{N} \max(u_{k}, v_{k})}$$
where $u, v \in \mathbb{R}^N$, $u, v \geq 0$.
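Not a proof, but a quick numerical sanity check of the claim (a NumPy sketch; the setup and variable names are my own): build the kernel matrix on random nonnegative vectors and confirm that it has no negative eigenvalues.

```python
import numpy as np

# Sanity check (not a proof): form the min/max kernel
# K(u, v) = sum_k min(u_k, v_k) / sum_k max(u_k, v_k)
# on random nonnegative vectors and check its eigenvalues.
rng = np.random.default_rng(0)
X = rng.random((20, 5))  # 20 nonnegative vectors in R^5

N = X.shape[0]
K = np.empty((N, N))
for i in range(N):
    for j in range(N):
        K[i, j] = np.minimum(X[i], X[j]).sum() / np.maximum(X[i], X[j]).sum()

# K is symmetric, so eigvalsh applies; all eigenvalues should be
# nonnegative (up to floating-point rounding) if K is a valid kernel.
eigs = np.linalg.eigvalsh(K)
print(eigs.min())
```

On random draws the smallest eigenvalue stays nonnegative (up to rounding), consistent with the kernel being positive definite.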

Fix $x_i \in \mathbb{R}^n$, $i = 1, 2, \ldots, N$. We will assume without loss of generality that no $x_i$ is identically $0$. Define $N \times N$ matrices $A = \left(\sum_{k=1}^n \min(x_{i(k)}, x_{j(k)})\right)$ and $B = \left(\sum_{k=1}^n \max(x_{i(k)}, x_{j(k)})\right)$, where $x_{(k)}$ denotes the $k$th coordinate of $x$. Note that $K = A \odot B^{\odot -1}$, where $\odot$ denotes the Hadamard product and $B^{\odot -1}$ is the Hadamard inverse (entrywise reciprocal) of $B$. By the Schur product theorem, it suffices to show that $A$ and $B^{\odot -1}$ are positive definite. We will use the fact that a positive linear combination of positive definite matrices is positive definite.
To show that $A$ is positive definite, note that $A$ can be written as the sum $A = \sum_{k=1}^n A_k$ with $A_k = \left(\min(x_{i(k)}, x_{j(k)})\right)$. It suffices to show that, e.g., $A_1$ is positive definite. For $i \in [N]$, let $y_i = x_{i(1)}$. By conjugating $A_1$ by a permutation matrix, we may assume without loss of generality that $y_1 \leq y_2 \leq \ldots \leq y_N$. For $i \in [N]$, let $f_i \in \mathbb{R}^N$ denote the vector with $f_{i(j)} = 0$ for $j < i$ and $f_{i(j)} = 1$ for $j \geq i$. Then, setting $y_0 = 0$,
$$A_1 = \sum_{i=1}^N (y_i - y_{i-1}) f_i f_i^t \geq 0.$$
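This telescoping identity is easy to verify numerically (a NumPy sketch, with names of my choosing): for sorted $y$, the matrix with entries $\min(y_i, y_j)$ coincides with $\sum_i (y_i - y_{i-1}) f_i f_i^t$.

```python
import numpy as np

# Sketch of the telescoping identity for the min-matrix: with
# 0 = y_0 <= y_1 <= ... <= y_N, the matrix M[i, j] = min(y_i, y_j)
# equals sum_i (y_i - y_{i-1}) f_i f_i^t for the 0/1 step vectors f_i.
rng = np.random.default_rng(1)
y = np.sort(rng.random(6))      # sorted nonnegative values y_1 <= ... <= y_N
M = np.minimum.outer(y, y)      # M[i, j] = min(y_i, y_j)

N = len(y)
S = np.zeros((N, N))
y_prev = 0.0                    # plays the role of y_0 = 0
for i in range(N):
    f = np.zeros(N)
    f[i:] = 1.0                 # f_i: zeros before index i, ones from i on
    S += (y[i] - y_prev) * np.outer(f, f)
    y_prev = y[i]

print(np.allclose(M, S))
```

Since each coefficient $y_i - y_{i-1}$ is nonnegative and each $f_i f_i^t$ is rank-one positive semidefinite, the sum is positive semidefinite, matching the display above.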
We now show that $B^{\odot -1}$ is positive definite. By scaling, we may assume that $x_{i(j)} \in [0, 1/n]$ for all $i$ and $j$. Using the identity $1/x = \sum_{i=0}^{\infty} (1-x)^i$, valid for $x \in (0, 2)$, we may write $B^{\odot -1} = J + \sum_{i=1}^{\infty} (J - B)^{\odot i}$, where $J = f_1 f_1^t$ denotes the all-ones matrix. Now
$$(J - B)_{ij} = 1 - \sum_{k=1}^n \max(x_{i(k)}, x_{j(k)}) = \sum_{k=1}^n \min\left(\tfrac{1}{n} - x_{i(k)}, \tfrac{1}{n} - x_{j(k)}\right).$$
The argument above that showed that $A$ is positive definite now shows that $J - B$ is positive definite (by replacing $x_i$ with $x_i' = \frac{1}{n} f_1 - x_i$). Finally, the Schur product theorem and the fact that positive definite matrices are closed under positive linear combinations show that $B^{\odot -1}$ is positive definite.
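The scaling and the entrywise geometric series can likewise be checked numerically (again a NumPy sketch under the stated assumptions, with my own names): after scaling the coordinates into $[0, 1/n]$, a truncated series $J + \sum_{i \geq 1} (J - B)^{\odot i}$ matches the entrywise reciprocal of $B$, and that reciprocal has no negative eigenvalues.

```python
import numpy as np

# Sketch of the Neumann-series step: with coordinates in [0, 1/n],
# every entry of B lies in (0, 1], so the entrywise geometric series
# J + sum_{i>=1} (J - B)^{⊙i} converges to the Hadamard inverse of B.
rng = np.random.default_rng(2)
n, N = 4, 8
X = rng.random((N, n)) / n      # coordinates scaled into [0, 1/n]

B = np.array([[np.maximum(X[i], X[j]).sum() for j in range(N)]
              for i in range(N)])
J = np.ones((N, N))

approx = J.copy()
term = J.copy()
for _ in range(500):            # truncate the entrywise geometric series
    term = term * (J - B)       # Hadamard power (J - B)^{⊙i}
    approx += term

print(np.allclose(approx, 1.0 / B))                 # series matches B^{⊙-1}
print(np.linalg.eigvalsh(1.0 / B).min() >= -1e-10)  # B^{⊙-1} has no negative eigenvalues
```

This only illustrates convergence on one random instance; the proof itself rests on each Hadamard power $(J-B)^{\odot i}$ being positive semidefinite via the Schur product theorem.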