Can absolute scalability be ‘relaxed’ to an equivalent condition in the properties of a norm?

All norms on a vector space $V$ must satisfy $\Vert \alpha x \Vert = \vert \alpha \vert \, \Vert x \Vert$ for any $x \in V$ and any scalar $\alpha \in \mathbb{R}$. However, I've been told that an equivalent condition is $\Vert \alpha x \Vert \leq \vert \alpha \vert \, \Vert x \Vert$. Is this true, or is there a counterexample? Answer The relaxed condition also implies $\left\|\frac{1}{\alpha}\,\alpha x\right\| \leq$ …
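Since the answer above is truncated, here is a sketch of how the standard argument concludes (my reconstruction, not the original author's text). For $\alpha \neq 0$, apply the relaxed condition to the scalar $\frac{1}{\alpha}$ and the vector $\alpha x$:

$$\|x\| = \left\|\frac{1}{\alpha}\,\alpha x\right\| \leq \frac{1}{|\alpha|}\,\|\alpha x\|, \qquad \text{i.e.} \qquad |\alpha|\,\|x\| \leq \|\alpha x\|.$$

Together with the relaxed inequality $\|\alpha x\| \leq |\alpha|\,\|x\|$ this forces equality, and the case $\alpha = 0$ is immediate since $\|0\| = 0$. So the relaxed condition is indeed equivalent to absolute scalability, given the other norm axioms.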

Proving an induced operator norm equality

The induced matrix norm is defined by $$\|A\| = \sup_{x \neq 0} \frac{\|Ax\|}{\|x\|}.$$ Show that $\|A\| = \sup_{\|x\|=1} \|Ax\|$. $A$ is only assumed to be square, not anything more, e.g., not symmetric / orthogonal, etc. I've been working on this problem for a while and would welcome any hints. I've tried using the definition of vector norms, namely $\|x\| =$ …
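A hint toward the truncated question (my sketch, not taken from the thread): the equality follows from absolute homogeneity of the vector norm. For any $x \neq 0$,

$$\frac{\|Ax\|}{\|x\|} = \left\|A\!\left(\frac{x}{\|x\|}\right)\right\|,$$

and $\frac{x}{\|x\|}$ has unit norm, so the set of values of $\frac{\|Ax\|}{\|x\|}$ over $x \neq 0$ coincides with the set of values of $\|Ax\|$ over $\|x\| = 1$; the two suprema are therefore equal. Nothing about $A$ beyond linearity is used.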

Find norm of operator $L(x,y)=(x+3y,\,y-x)$

I'm trying to tackle the following question, but with no success… Let $L: \mathbb{R}^2 \to \mathbb{R}^2$ be an operator such that $L(x,y)=(x+3y,\,y-x)$. Find $\|L\|$. So, I know that I need to find $\displaystyle \sup\left(\frac{\|Lx\|}{\|x\|}\right)$. I have calculated the norms and found the gradient. I got that the sup is $1+\sqrt{5}$, but I'm not sure it's correct and I'm …
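The thread is cut off before the answer, but the value $1+\sqrt{5}$ can be sanity-checked numerically: assuming the Euclidean norm on $\mathbb{R}^2$, $\|L\|$ is the largest singular value of the matrix of $L$. A minimal check of my own (not from the thread):

```python
import numpy as np

# Matrix of L(x, y) = (x + 3y, y - x) in the standard basis
A = np.array([[1.0, 3.0],
              [-1.0, 1.0]])

# Induced 2-norm = largest singular value of A
op_norm = np.linalg.norm(A, 2)

print(op_norm)         # 3.2360679...
print(1 + np.sqrt(5))  # 3.2360679..., matching the claimed sup
```

So the reported value $1+\sqrt{5}$ is consistent with the numerics, at least for the Euclidean norm.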

Cannot understand how angle between two vectors is calculated

In the picture below, I am not getting why we calculate $\cos^{-1}\left(\frac{1}{3}\right)$ instead of $\cos\left(\frac{1}{3}\right)$. Sorry if the question is dumb. Answer You have that $\cos(\theta)=\frac{1}{3}$. However, you are not looking for $\cos(\theta)$; you are looking for $\theta$ itself, that is, $\theta = \arccos\left(\frac{1}{3}\right)$.
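To make the distinction concrete, here is a small numeric illustration (mine, not from the thread): $\cos(1/3)$ evaluates the cosine at $1/3$ radians, while $\arccos(1/3)$ recovers the angle whose cosine is $1/3$.

```python
import numpy as np

print(np.cos(1 / 3))       # 0.944... : cosine OF 1/3 radians (not what we want)
theta = np.arccos(1 / 3)   # the angle whose cosine is 1/3
print(theta)               # 1.2309... radians
print(np.degrees(theta))   # about 70.5 degrees
print(np.cos(theta))       # 0.333... : recovers 1/3, as expected
```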

Comparing matrix norm with the norm of the inverse matrix

I need help understanding and solving this problem. Prove or give a counterexample: if $A$ is a nonsingular matrix, then $\|A^{-1}\| = \|A\|^{-1}$. Is this just asking me to get the magnitude of the inverse of matrix $A$, and then compare it with the inverse of the magnitude of matrix $A$? Answer If $A$ is nonsingular, then …
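The answer is truncated, but the statement is easy to probe numerically. For an induced norm, submultiplicativity only gives $1 = \|I\| = \|AA^{-1}\| \leq \|A\|\,\|A^{-1}\|$, i.e. $\|A^{-1}\| \geq \|A\|^{-1}$, and equality can fail. A quick counterexample check of my own (not the truncated answer), using the induced 2-norm:

```python
import numpy as np

# diag(1, 2) is nonsingular
A = np.diag([1.0, 2.0])
A_inv = np.linalg.inv(A)

norm_A = np.linalg.norm(A, 2)          # largest singular value = 2
norm_A_inv = np.linalg.norm(A_inv, 2)  # largest singular value of diag(1, 1/2) = 1

print(norm_A_inv)   # 1.0
print(1 / norm_A)   # 0.5  -> ||A^{-1}|| != ||A||^{-1}
```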

Is there a nuclear norm approximation for stochastic gradient descent optimization?

I want to minimize $E$ by using stochastic gradient descent. I know that there is a subdifferential for the nuclear norm, but I want to know whether there is an approximation of the nuclear norm that allows stochastic gradient descent (SGD) to be used directly. The cost function is defined as below: $$E(\Omega) = f(\Omega) + \lambda \,…$$
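The answer is cut off above; as a hedged illustration of one common approach (my sketch, not the original answer), the nuclear norm $\|X\|_* = \sum_i \sigma_i(X)$ can be smoothed as $\sum_i \sqrt{\sigma_i(X)^2 + \epsilon}$, which is differentiable with gradient $U\,\mathrm{diag}\!\left(\sigma_i/\sqrt{\sigma_i^2+\epsilon}\right)V^\top$ for the SVD $X = U\,\mathrm{diag}(\sigma)\,V^\top$. The function name and `eps` parameter below are my own choices:

```python
import numpy as np

def smoothed_nuclear_norm(X, eps=1e-6):
    """Smooth surrogate sum_i sqrt(sigma_i^2 + eps) of the nuclear norm,
    together with its gradient, via the reduced SVD of X (a sketch;
    the gradient formula holds almost everywhere)."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    value = np.sum(np.sqrt(s**2 + eps))
    grad = (U * (s / np.sqrt(s**2 + eps))) @ Vt  # U diag(.) V^T
    return value, grad

# Tiny sanity check against the exact nuclear norm
X = np.random.default_rng(0).standard_normal((5, 3))
val, grad = smoothed_nuclear_norm(X)
print(val, np.sum(np.linalg.svd(X, compute_uv=False)))  # close for small eps
```

The returned `grad` could then be plugged into an SGD update; the usual alternative is subgradient descent with the exact nuclear norm, whose subgradient at $X = U\Sigma V^\top$ contains $UV^\top$.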

Show norm-preserving property and determine eigenvalues

Can someone give me a solution for this? Let $N \in \mathbb{N}$. a) We define the map $F: (\mathbb{C}^N, \|\cdot\|_2) \to (\mathbb{C}^N, \|\cdot\|_2)$ by $$(F(x))_k := \frac{1}{\sqrt{N}} \sum_{j=1}^{N} x_j\, e^{2\pi i \frac{(j-1)(k-1)}{N}} \quad \forall k \in \{1,\dots,N\}.$$ Show that $F$ is norm-preserving, i.e. $\|F(x)\|_2 = \|x\|_2$. b) For $n, m \in \{1,2,\dots,N\}$ we define the entries of $M \in \mathbb{C}^{N \times N}$ by $$M_{nm} := \begin{cases} \dfrac{N+1}{2}, & n = m,\\[4pt] \dfrac{1}{e^{2\pi i \frac{m-n}{N}} - 1}, & n \neq m. \end{cases}$$ Show that $1,2,3,\dots,N$ are the eigenvalues of $M$. Answer a) As Omnomnomnom noted, the DFT matrix shows …
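The answer is truncated, but both claims are easy to probe numerically for small $N$ (a minimal check of my own, not the original answer): unitarity of the DFT-type matrix gives the norm preservation in a), and the eigenvalues of $M$ in b) can be inspected directly.

```python
import numpy as np

N = 6

# Part a): matrix of F, with F[k, j] = exp(2*pi*i*(j-1)(k-1)/N) / sqrt(N)
# (0-based indices below correspond to the 1-based (j-1)(k-1) in the problem)
k, j = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
F = np.exp(2j * np.pi * k * j / N) / np.sqrt(N)

# F is unitary (F* F = I), which implies ||F(x)||_2 = ||x||_2 for every x
print(np.allclose(F.conj().T @ F, np.eye(N)))  # True

# Part b): build M entrywise from the piecewise definition
M = np.empty((N, N), dtype=complex)
for n in range(1, N + 1):
    for m in range(1, N + 1):
        if n == m:
            M[n - 1, m - 1] = (N + 1) / 2
        else:
            M[n - 1, m - 1] = 1 / (np.exp(2j * np.pi * (m - n) / N) - 1)

# M is Hermitian, so its eigenvalues are real; the claim is they are 1, ..., N
print(np.linalg.eigvalsh(M))  # approximately [1. 2. 3. 4. 5. 6.]
```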