## Probability of tail event using Kolmogorov’s 0-1 law

If $X_1,X_2,\dots$ are independent random variables and $X=\sup_n X_n$, then $P(X<\infty)$ is either 0 or 1. I think that if we prove the event to be a tail event then the result will follow. But I just don’t know how to prove it to be a tail event. Answer Observe that for each $n$, $\{X<\infty\}$ … Read more
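One way the tail-event argument can go (a sketch of the standard route, which may differ in detail from the truncated answer): for any $n$, the finitely many variables $X_1,\dots,X_{n-1}$ always have a finite supremum, so they cannot affect whether $X$ is finite:

```latex
\{X<\infty\}
=\Big\{\sup_{k\ge n}X_k<\infty\Big\}
\in\sigma(X_n,X_{n+1},\dots)
\qquad\text{for every } n,
```

hence $\{X<\infty\}\in\bigcap_n\sigma(X_n,X_{n+1},\dots)$, the tail $\sigma$-algebra, and Kolmogorov’s 0–1 law applies.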

## Probability of 26 letters

A monkey at a typewriter types each of the 26 letters of the alphabet exactly once, the order being random. A. What is the probability that the word HAMLET appears somewhere in the string of letters? B. How many independent monkey typists would you need in order that the probability that the word appears is … Read more
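For part A, treating HAMLET as a single block among the remaining 20 letters gives $21!/26!$. A quick computation, which for part B assumes the cut-off question asks for success probability at least $1/2$ (the teaser is truncated, so that target is a guess):

```python
from math import factorial, log, ceil

# Treat the block "HAMLET" as one unit among the other 20 letters:
# 21! of the 26! equally likely orders contain HAMLET consecutively.
p = factorial(21) / factorial(26)   # = 1/(26*25*24*23*22) = 1/7893600

# Part B (assumed target: probability >= 1/2 with independent monkeys):
# 1 - (1 - p)**k >= 1/2  <=>  k >= log(2) / (-log(1 - p)) ~ log(2)/p
k = ceil(log(2) / -log(1 - p))
print(p, k)   # roughly 1.27e-7 and about 5.5 million monkeys
```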

## Why isn’t $\mathcal D$ a sigma-algebra?

I came across the statement that if $(\Omega,\mathcal F,P)$ is a probability space and $E\in\mathcal F$, then $\mathcal D:=\{A\in\mathcal F\mid A\text{ and }E\text{ are independent}\}$ is a Dynkin system. I guess that $\mathcal D$ is not a sigma-algebra, yet I can’t find a counterexample. Thus, we’d need a sequence $(A_n)$ such that each $A_n$ is independent of $E$ yet their union isn’t. I’ve tried … Read more
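A small counterexample of the kind the asker is looking for (my own, on a four-point space, not taken from the post): two events each independent of $E$ whose union is not. This does not contradict $\mathcal D$ being a Dynkin system, because Dynkin systems are only closed under countable *disjoint* unions, and these two events overlap:

```python
from fractions import Fraction

# Uniform probability on Omega = {0, 1, 2, 3} (hypothetical example).
omega = {0, 1, 2, 3}

def P(A):
    return Fraction(len(A), len(omega))

def indep(A, B):
    return P(A & B) == P(A) * P(B)

E  = {0, 1}
A1 = {0, 2}   # P(A1 & E) = 1/4 = P(A1) * P(E)
A2 = {0, 3}   # P(A2 & E) = 1/4 = P(A2) * P(E)

print(indep(A1, E), indep(A2, E))   # both True: A1, A2 lie in D
print(indep(A1 | A2, E))            # False: P = 1/4 but P(A1|A2)*P(E) = 3/8
```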

## Is there accepted notation for the pushforward measure that doesn’t mention $\mathbf{P}$?

Let $(\Omega,\mathcal F,\mathbf P)$ denote a probability space, $(S,\mathcal M)$ denote a measurable space, and $X:(\Omega,\mathcal F,\mathbf P)\to(S,\mathcal M)$ denote a measurable function (thought of as a random variable). Then there is a pushforward measure induced on $(S,\mathcal M)$ (thought of as the probability distribution of $X$), which we could denote $X_*(\mathbf P)$, following Wikipedia. However, I like to imagine that $X$ “knows” … Read more

## Is convergence in probability sometimes equivalent to almost sure convergence?

I was reading about necessary and sufficient conditions for the strong law of large numbers on this Encyclopedia of Math page, and I came across the following curious passage: The existence of such examples is not at all obvious at first sight. The reason is that even though, in general, convergence in probability is weaker … Read more
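A standard example behind that kind of passage (my own illustration, not taken from the linked page): independent $X_m$ with $\Pr\{X_m=1\}=1/m$ and $\Pr\{X_m=0\}=1-1/m$ converge to 0 in probability, but by the second Borel–Cantelli lemma equal 1 infinitely often almost surely. The probability of seeing a 1 somewhere in $(M,2M]$ telescopes to exactly $1/2$ for every $M$, so the failures never die out:

```python
from fractions import Fraction

# Independent X_m with P(X_m = 1) = 1/m, else 0.
# X_m -> 0 in probability since P(X_m > 0) = 1/m -> 0, yet by the second
# Borel-Cantelli lemma X_m = 1 infinitely often a.s. (sum 1/m diverges).
def prob_no_one(M):
    """P(X_m = 0 for all m in (M, 2M]) = prod_{m=M+1}^{2M} (1 - 1/m)."""
    p = Fraction(1)
    for m in range(M + 1, 2 * M + 1):
        p *= Fraction(m - 1, m)
    return p   # telescopes to M / (2M) = 1/2, independently of M

print(prob_no_one(10), prob_no_one(1000))   # both exactly 1/2
```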

## If $B(t)$ is Brownian motion then prove $W(t)$ defined as follows is also Brownian motion

Let $B(t)$ be standard Brownian motion on $[0,1]$. Define $W(t)$ as follows: $W(t)=B(t)-\int_0^t\frac{B(1)-B(s)}{1-s}\,ds$. Prove $W(t)$ is also Brownian motion. So I’m not sure how to deal with the integral here. In order to show it, too, is Brownian motion, I think I would need to make an argument that the transformation is linear and hence … Read more
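Before attempting a proof, the claim can be sanity-checked numerically. A minimal Monte Carlo sketch (my own illustration; grid size and sample count are arbitrary) that simulates $B$ on a grid, forms $W$ by a left Riemann sum, and checks that $W(1/2)$ has mean $\approx 0$ and variance $\approx 1/2$, as Brownian motion requires:

```python
import numpy as np

# Simulate B on a grid of [0, 1], then form
#   W(t) = B(t) - int_0^t (B(1) - B(s)) / (1 - s) ds
# by a left Riemann sum and check the moments of W(1/2).
rng = np.random.default_rng(0)
n_paths, n_steps = 5_000, 500
dt = 1.0 / n_steps
dB = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
B = np.cumsum(dB, axis=1)                  # B at grid points dt, 2dt, ..., 1
B1 = B[:, -1][:, None]                     # B(1) for each path

t_idx = n_steps // 2                       # t = 1/2
s = dt * np.arange(t_idx)                  # left endpoints 0, dt, ..., t - dt
# B at the left endpoints: B(0) = 0, then B(dt), ..., B(t - dt)
B_left = np.concatenate([np.zeros((n_paths, 1)), B[:, :t_idx - 1]], axis=1)
integral = np.sum((B1 - B_left) / (1.0 - s) * dt, axis=1)
W_half = B[:, t_idx - 1] - integral

print(W_half.mean(), W_half.var())         # should be near 0 and 0.5
```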

## Prove the convergence of a random variable

Given $x_n\sim N\left(0,\frac{1}{n}\right)$, does $x_n$ converge almost surely, in probability, or in distribution? How to prove it? Answer By the Borel–Cantelli lemma, the sequence $X_n\sim N(0,n^{-1})$ converges to 0 almost surely as $n\to\infty$ if $\sum_{n=1}^{\infty}\Pr\{|X_n|>\varepsilon\}<\infty$ for each $\varepsilon>0$. Using Markov’s inequality, we can obtain the following bound: $\Pr\{|X_n|>\varepsilon\}\le\frac{\operatorname E|X_n|^4}{\varepsilon^4}$. Since $X_n\sim N(0,n^{-1})$, we know … Read more
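The truncated bound can be finished with the Gaussian fourth moment $\operatorname E X^4=3\sigma^4$ (a sketch of the standard completion, which may differ in detail from the original answer):

```latex
\operatorname E|X_n|^4=3n^{-2}
\quad\Longrightarrow\quad
\sum_{n=1}^{\infty}\Pr\{|X_n|>\varepsilon\}
\le\frac{3}{\varepsilon^4}\sum_{n=1}^{\infty}\frac{1}{n^2}<\infty,
```

so $X_n\to 0$ almost surely, which in turn implies convergence in probability and in distribution.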

## Show $E((X-Y)Y)=0$

If $EX^2<\infty$ and $E(X\mid\mathcal G)$ is $\mathcal F$-measurable, then $E(X\mid\mathcal G)=EX$. There is one step in the proof which I don’t understand: set $Y=E(X\mid\mathcal G)$; then why is $E((X-Y)Y)=0$? From a theorem I know that $Y\in\mathcal F$ is such that $E(X-Y)^2$ is minimal. Answer $E[(X-Y)Y]=E[E((X-Y)Y\mid\mathcal G)]=E[Y\,E(X-Y\mid\mathcal G)]=E[Y(E(X\mid\mathcal G)-E(X\mid\mathcal G))]=0$. The first equality uses the Law of Iterated Expectations, the second “factoring out what … Read more

## Example of using Delta Method

Let $\hat p$ be the proportion of successes in $n$ independent Bernoulli trials each having probability $p$ of success. (a) Compute the expectation of $\hat p(1-\hat p)$. (b) Compute the approximate mean and variance of $\hat p(1-\hat p)$ using the Delta Method. For part (a), I can calculate the expectation of $\hat p$ but got stuck on the expectation of $\hat p^2$, … Read more
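For part (a), the missing ingredient is $\operatorname E\hat p^2=\operatorname{Var}(\hat p)+p^2=p(1-p)/n+p^2$, which gives $\operatorname E[\hat p(1-\hat p)]=\frac{n-1}{n}p(1-p)$. A short sketch (my own, with illustrative values $n=100$, $p=0.3$) comparing the exact mean and the Delta Method approximation against simulation:

```python
import numpy as np

# g(p_hat) = p_hat * (1 - p_hat).
# Exact mean: E p_hat^2 = p(1-p)/n + p^2, so E g = (n-1)/n * p(1-p).
# Delta Method with g'(p) = 1 - 2p:
#   mean ~ g(p) = p(1-p),   variance ~ g'(p)^2 * p(1-p)/n.
def exact_mean(n, p):
    return (n - 1) / n * p * (1 - p)

def delta_mean_var(n, p):
    return p * (1 - p), (1 - 2 * p) ** 2 * p * (1 - p) / n

# Monte Carlo sanity check at n = 100, p = 0.3
rng = np.random.default_rng(1)
p_hat = rng.binomial(100, 0.3, size=200_000) / 100
g = p_hat * (1 - p_hat)
print(g.mean(), exact_mean(100, 0.3))            # both near 0.2079
print(g.var(), delta_mean_var(100, 0.3)[1])      # both near 0.000336
```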

## What is the probability that at least two of the $n^{\rm th}$ biggest elements of $\mathbf{A}$ share the same column?

I have a random matrix $\mathbf A=[a_{ij}]$ for all $i,j\in\{1,\dots,n\}$. Every entry $a_{ij}$ of the matrix $\mathbf A$ is generated randomly with an exponential distribution. The $a_{ij}$ are i.i.d. and have the same parameter $\lambda$. Now, for each row $i$ of $\mathbf A$, I select the argument of the maximum element. That is, $x_i=\arg\max$ Let $X_{ij}$ be the binary … Read more
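Since the entries are i.i.d. and continuous, each row’s maximizing column is uniform on $\{1,\dots,n\}$ and independent across rows, so on my reading of the truncated setup the question reduces to a birthday problem:

```python
from math import factorial, isclose

# n rows each pick a column uniformly and independently; all n picks are
# distinct in n!/n^n of the n^n equally likely outcomes, so
#   P(at least two rows share a column) = 1 - n!/n^n.
def p_collision(n):
    return 1 - factorial(n) / n ** n

print(p_collision(4))   # 1 - 24/256 = 0.90625
```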