# Does rate of convergence in probability come from a metric?

In general, when we talk about convergence of a sequence, we need a topological space. If we want to talk about a rate of convergence, we need to quantify how far away one element of the sequence is from the limit point. This can be done in a metric space as follows:

Definition 1 (Rate of Convergence):
Let $(M,d)$ be a metric space, let $(x_n)_{n\in\mathbb N} \subset M$ be a sequence that converges to $x\in M$, and let $(r_n)_{n\in\mathbb N} \subset (0,\infty)$ be a sequence with $r_n\to0$. We say that the sequence $(x_n)_{n\in\mathbb N}$ converges with rate $(r_n)_{n\in\mathbb N}$ iff $\limsup_{n\to\infty} r_n^{-1} d(x_n,x) < \infty$.
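To make Definition 1 concrete, here is a small numerical sketch in the metric space $(\mathbb R,|\cdot|)$; the sequence $x_n = 1/n$ and the two candidate rates are my own toy choices:

```python
# Sketch of Definition 1 in (R, |.|): x_n = 1/n converges to x = 0
# with rate r_n = 1/n, since r_n^{-1} * d(x_n, x) = 1 for every n,
# so the limsup is finite. It does NOT converge with the faster
# rate r_n = 1/n^2, because then the scaled distances grow like n.

def scaled_distance(n, rate):
    x_n = 1.0 / n          # the sequence
    x = 0.0                # its limit
    return abs(x_n - x) / rate(n)

vals_ok = [scaled_distance(n, lambda n: 1.0 / n) for n in range(1, 1001)]
vals_bad = [scaled_distance(n, lambda n: 1.0 / n**2) for n in range(1, 1001)]

print(max(vals_ok))   # bounded: exactly 1.0 for every n
print(max(vals_bad))  # grows like n -> limsup is infinite
```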

For random variables, there are different notions of convergence (topologies), e.g., convergence in probability and $L_p$-convergence. The latter topology is induced by the $L_p$-norm (for $L_2$, we even have an inner product). As far as I know, there is no norm that induces convergence in probability, but there are several metrics that induce this kind of convergence:
For random variables $X,Y$ with values in $\mathbb R$ define

1. $d_1(X,Y) := \mathbb E\left[\min(|X-Y|,1)\right]$
2. $d_2(X,Y) := \mathbb E\left[\frac{|X-Y|}{|X-Y|+1}\right]$
3. $d_3(X,Y) := \inf\left\lbrace\epsilon>0: \mathbb P(|X-Y|>\epsilon)<\epsilon\right\rbrace$ (Ky Fan metric)
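As a sanity check, here is a Monte Carlo sketch (my own toy example, not part of the question) estimating all three metrics for $X_n = n\cdot\mathbf 1_{A_n}$ with $\mathbb P(A_n)=1/n$ and $Y=0$. This $X_n$ converges to $0$ in probability but not in $L_1$ (since $\mathbb E[X_n]=1$), and indeed all three metric distances behave like $1/n$:

```python
import random

random.seed(0)

def sample_X(n):
    # X_n = n with probability 1/n, else 0: converges to 0 in
    # probability but not in L^1 (E[X_n] = 1 for all n).
    return float(n) if random.random() < 1.0 / n else 0.0

def d1(samples):                      # E[min(|X - Y|, 1)] with Y = 0
    return sum(min(abs(x), 1.0) for x in samples) / len(samples)

def d2(samples):                      # E[|X - Y| / (|X - Y| + 1)]
    return sum(abs(x) / (abs(x) + 1.0) for x in samples) / len(samples)

def d3(samples, grid_size=1000):      # Ky Fan, on an eps-grid of width 1/grid_size
    m = len(samples)
    for k in range(1, grid_size + 1):
        eps = k / grid_size
        if sum(1 for x in samples if abs(x) > eps) / m <= eps:
            return eps
    return 1.0

for n in (10, 100, 1000):
    samples = [sample_X(n) for _ in range(100_000)]
    print(n, d1(samples), d2(samples), d3(samples))  # all roughly 1/n
```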

All three are metrics that induce convergence in probability. Next, we state the usual definition for rate of convergence in probability (see, e.g., https://en.wikipedia.org/wiki/Big_O_in_probability_notation).

Definition 2 (Rate of Convergence in Probability):
Let $(X_n)_{n\in\mathbb N}$ be a sequence of $\mathbb R$-valued random variables that converges to $X$ in probability, and let $(r_n)_{n\in\mathbb N} \subset (0,\infty)$ be a sequence with $r_n\to0$. We say that the sequence $(X_n)_{n\in\mathbb N}$ converges in probability with rate $(r_n)_{n\in\mathbb N}$, often denoted by $|X_n-X|\in O_p(r_n)$, iff for all $\epsilon>0$ there is a $K_\epsilon>0$ such that $\mathbb P(r_n^{-1}|X_n-X|> K_\epsilon)<\epsilon$ for all $n \in \mathbb N$.
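To illustrate Definition 2, here is a Monte Carlo sketch (the example and the constant $K$ are my own choices): for $\bar X_n$ the mean of $n$ i.i.d. Uniform$(0,1)$ draws, the CLT suggests $|\bar X_n - 1/2| \in O_p(1/\sqrt n)$, i.e., one constant $K_\epsilon$ should bound the scaled deviation with high probability uniformly in $n$:

```python
import math
import random

random.seed(1)

def sample_mean_error(n):
    # |Xbar_n - 0.5| for the mean of n Uniform(0,1) draws;
    # this should be O_p(1 / sqrt(n)).
    return abs(sum(random.random() for _ in range(n)) / n - 0.5)

K = 2.0         # candidate constant K_eps (std. dev. of Uniform(0,1) is ~0.29)
reps = 5000

fractions = []  # empirical P( sqrt(n) * |Xbar_n - 0.5| > K ) for each n
for n in (10, 100, 1000):
    exceed = sum(1 for _ in range(reps)
                 if math.sqrt(n) * sample_mean_error(n) > K) / reps
    fractions.append(exceed)
    print(n, exceed)  # stays small uniformly in n
```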

Now I can state my question (it has two parts):

1. Is there a metric on the space of real-valued random variables that induces convergence in probability and induces (by Definition 1) the rate of convergence in probability (Definition 2)?
2. Why do we usually prove rates in the sense of Definition 2? What makes this definition appropriate when talking about rates of convergence in probability? Why do we not use Definition 1 with, e.g., $d_1$, $d_2$, or $d_3$?