Does a non-exchangeable empirical reverse-martingale exist?

Consider a possibly finite sequence $\xi_1,\xi_2,\ldots$ of random variables and consider the measure-valued empirical process $\eta_n=\frac{1}{n}\sum_{i=1}^{n}\delta_{\xi_i}$, $n=1,2,\ldots$ Assume $(\eta_n)$ is a reverse martingale, in the sense that $\left(\int f\,d\eta_n\right)$ is a reverse martingale for every $f\geq 0$. Does it automatically hold that $(\eta_n)$ is exchangeable? I.e., does it hold that for any $f_1,\ldots,f_n$ and permutation $p$ of $\{1,\ldots,n\}$, $$E\left(\int f_1\,d\eta_1\cdots\int f_n\,d\eta_n\right)=E\left(\int f_1\,d\eta_{p(1)}\cdots\int f_n\,d\eta_{p(n)}\right)?$$ … Read more

Sum of Binomial random variable CDFs

Suppose there are two independent Binomial random variables $$X\sim\operatorname{Binomial}(n,p),\qquad Y\sim\operatorname{Binomial}(n,p+\delta),$$ where $\delta$ is considered to be fixed and $p$ can vary in $(0,1-\delta)$. Now consider the following value (the sum of two probabilities): $$P\!\left(X\geq \left(p+\tfrac{\delta}{2}\right)\cdot n\right)+P\!\left(Y\leq \left(p+\tfrac{\delta}{2}\right)\cdot n\right).$$ My question is: which value of $p$ maximizes the above quantity? My intuition and simulations both say that when $p+\frac{\delta}{2}=1/2$, the value takes … Read more
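A minimal numeric sketch of the claim in this excerpt: scan $p$ on a grid for fixed $n$ and $\delta$ and locate the maximizer. The values $n=100$ and $\delta=0.2$ are illustrative assumptions, not from the question.

```python
import numpy as np
from scipy.stats import binom

# Illustrative parameters (assumptions, not from the question)
n, delta = 100, 0.2
ps = np.linspace(0.001, 1 - delta - 0.001, 2000)

t = (ps + delta / 2) * n                              # common threshold
# P(X >= t) = sf(ceil(t) - 1);  P(Y <= t) = cdf(floor(t))
val = binom.sf(np.ceil(t) - 1, n, ps) + binom.cdf(np.floor(t), n, ps + delta)

best = ps[np.argmax(val)]
print("argmax p:", best, "  p + delta/2:", best + delta / 2)  # near 1/2?
```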

What is the entropy of binomial decay?

Let’s play a game. I start with $N$ indistinguishable tokens, and I wait $T$ turns. Every turn, each token has probability $p$ of disappearing. I want an analytic formula for the entropy of this process, as a function of $N$, $T$, and $p$. The calculation is straightforward for $N=1$ and $T=\infty$. The probability $p_i$ that … Read more
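One natural reading (an assumption on my part; the excerpt is truncated before the general case) is that each token survives $T$ turns independently with probability $(1-p)^T$, so the surviving count is $\operatorname{Binomial}(N,(1-p)^T)$ and its entropy can be evaluated directly:

```python
import numpy as np
from scipy.stats import binom

def decay_entropy(N, T, p):
    # Assumed reading: survivors after T turns ~ Binomial(N, (1-p)^T)
    q = (1 - p) ** T                      # per-token survival probability
    pmf = binom.pmf(np.arange(N + 1), N, q)
    pmf = pmf[pmf > 0]                    # drop zero terms before taking logs
    return -np.sum(pmf * np.log2(pmf))    # entropy in bits

print(decay_entropy(N=20, T=5, p=0.1))    # illustrative values
```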

Minimization over a convex function of equal vs unequal success probabilities of Bernoulli random variables

Let $U_1,U_2,\ldots,U_n$ be $n\geq 2$ mutually independent Bernoulli random variables. There are two cases of interest. 1. The random variables $U_1,U_2,\ldots,U_n$ are identically distributed: $U_1\sim \operatorname{Bernoulli}(p)$ with probability $1/2$; let $P_0$ be the associated probability measure, i.e. $$P_0(U_1,\ldots,U_n)=P(U_1,\ldots,U_n\mid U_1\sim \operatorname{Bernoulli}(p))$$ $U_1\sim \operatorname{Bernoulli}(1-q)$ with probability $1/2$; let $P_1$ be the associated probability measure, i.e. $$P_1(U_1,\ldots,U_n)=P(U_1,\ldots,U_n\mid U_1\sim … Read more

Asymptotics of the joint pdf of two sums of powers of independent $\mathcal{U}(0,1)$ random variables

As a warm-up in words: the sum of twelve uniform random variables is a classic approximation to a normal distribution. What is the joint pdf for the sum of their cubes and the sum of their fourth powers? This is the $d=12$, $p=3$ case of the asymptotic question that follows. Let $f_d=f_{p;d}$ be the joint … Read more
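A quick simulation sketch of the $d=12$, $p=3$ warm-up (sample size and the normal-approximation framing are my choices): compare the empirical mean and covariance of $(S_3,S_4)=(\sum U_i^3,\sum U_i^4)$ with the exact moments, $E[U^k]=1/(k+1)$, that a bivariate-normal approximation of the joint pdf would be built from.

```python
import numpy as np

rng = np.random.default_rng(1)
d, trials = 12, 200_000
u = rng.random((trials, d))                       # twelve U(0,1) per trial
s3, s4 = (u ** 3).sum(axis=1), (u ** 4).sum(axis=1)

print("means  :", s3.mean(), s4.mean(), " exact:", d / 4, d / 5)
print("cov    :\n", np.cov(s3, s4))
# exact: Var S3 = d(1/7 - 1/16), Cov = d(1/8 - 1/20), Var S4 = d(1/9 - 1/25)
print("exact  :", d * (1/7 - 1/16), d * (1/8 - 1/20), d * (1/9 - 1/25))
```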

A lower bound on the expected sum of Bernoulli random variables given a constraint on its distribution

Given a set of Bernoulli random variables $x_1, \dots, x_n$ (not necessarily identical) with $X= \sum_{0<i\leq n} x_i$, I am interested in finding a lower bound for $\frac{\mathbb{E} [ \min (X,k) ]}{\mathbb{E} [X]}$ in terms of $k$ and $\alpha$, where $\alpha > \Pr[X>k]$. For example, I want to show that this ratio is a large enough … Read more
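For concreteness, the ratio can be computed exactly for any parameter vector via the Poisson-binomial pmf; the parameters and $k$ below are arbitrary illustrations, not from the question.

```python
import numpy as np

def pmf_poisson_binomial(probs):
    # Build the pmf of X = sum of independent Bernoulli(p_i) by convolution
    pmf = np.array([1.0])
    for p in probs:
        pmf = np.convolve(pmf, [1 - p, p])
    return pmf

probs = [0.9, 0.5, 0.5, 0.2, 0.2, 0.1]    # illustrative, non-identical
k = 3
pmf = pmf_poisson_binomial(probs)
xs = np.arange(len(pmf))

ratio = (pmf * np.minimum(xs, k)).sum() / (pmf * xs).sum()
tail = pmf[xs > k].sum()                  # Pr[X > k], to compare with alpha
print("E[min(X,k)]/E[X] =", ratio, "  Pr[X>k] =", tail)
```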

Max / Argmax of a function which includes sums of Gaussian CDFs and PDFs can surprisingly be approximated by a power law

Given $N\in\mathbb{N}$, I have been trying to calculate $m_N=\max_{x\in\mathbb{R}^{+}}\chi_N(x)$ and $d_N=\operatorname{argmax}_{x\in\mathbb{R}^{+}}\chi_N(x)$ for the function $$\chi_N(x)=\frac{\sum_{i\in K_N}\sum_{j\in K_N}\phi(ix)\,\phi(jx)}{\sum_{i\in K_N}\sum_{j\in K_N}\Phi\left(-\max\left(i,j\right)x\right)\Phi\left(\min\left(i,j\right)x\right)},$$ where $K_N=\left\{k-\frac{N}{2}\;\middle|\;k\in\mathbb{N},\,k<N\right\}$ and $\phi,\Phi$ are the Gaussian PDF and CDF. Solving those equations numerically over values of $N$ in the range $[3..1024]$, it appears that the following expressions, which I got using linear regression over the … Read more
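A sketch of the numerical side, reading $K_N$ as $\{k-N/2 : k=0,1,\ldots,N-1\}$ and scanning an ad hoc grid for the maximum:

```python
import numpy as np
from scipy.stats import norm

def chi(N, x):
    k = np.arange(N) - N / 2                      # K_N = {k - N/2 : k < N}
    i, j = np.meshgrid(k, k)
    num = (norm.pdf(i * x) * norm.pdf(j * x)).sum()
    den = (norm.cdf(-np.maximum(i, j) * x) * norm.cdf(np.minimum(i, j) * x)).sum()
    return num / den

N = 16                                            # illustrative choice
xs = np.linspace(1e-3, 2.0, 4000)                 # ad hoc grid bounds
vals = np.array([chi(N, x) for x in xs])
print("m_N ~", vals.max(), "  d_N ~", xs[np.argmax(vals)])
```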

Strange inequality relating Binomial pmf and cdf

I’m encountering a strange inequality I need to prove, relating the Binomial pmf and cdf. Suppose we have $n$ coin flips, and fix an arbitrary $k\leq n/2$ heads. Suppose further that we have some small constant $\delta$ (say less than $1/3$, though I suspect the result would be true for $\delta$ as big as $1/2$). Let … Read more

A possible generalization of Solomonoff’s theorem

Assume that $P$ and $Q$ are probability distributions on the binary tree, i.e. $P$ and $Q$ are functions $\{0,1\}^{*}\to\mathbb{R}$ such that for every $x$: $P(x)=P(x0)+P(x1)$, and $P(\text{empty word})=1$. Theorem (Solomonoff). Assume that for some $c>0$ it holds that $Q(x)\geq cP(x)$ for every $x$. Then $$\sum_{x\in\{0,1\}^{*}}P(x)\bigl(P(0\mid x)-Q(0\mid x)\bigr)^{2}=O(-\log c).$$ (The same is true for $\bigl(P(1\mid x)-Q(1\mid x)\bigr)^{2}$, since it is equal to … Read more
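A numeric sanity check under assumptions of mine: take $Q=cP+(1-c)R$, which guarantees $Q(x)\geq cP(x)$ for every $x$, truncate the infinite tree sum at a finite depth, and compare with $-\log c$. The biases, $c$, and the depth are arbitrary.

```python
import itertools
import math

def product_measure(bias, x):
    # i.i.d.-coin measure on strings: P(x) = prod over bits of x
    return math.prod(bias if b == "1" else 1 - bias for b in x)

c, depth = 0.05, 12                         # arbitrary choices
P = lambda x: product_measure(0.5, x)
R = lambda x: product_measure(0.8, x)
Q = lambda x: c * P(x) + (1 - c) * R(x)     # dominates P: Q >= c*P

total = 0.0
for n in range(depth):                      # truncated sum over {0,1}^*
    for bits in itertools.product("01", repeat=n):
        x = "".join(bits)
        p0 = P(x + "0") / P(x)              # P(0 | x)
        q0 = Q(x + "0") / Q(x)              # Q(0 | x)
        total += P(x) * (p0 - q0) ** 2

print(total, "vs  -log c =", -math.log(c))
```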

Moments of the prime counting function given the moments of the second Chebyshev function

I have read this article (Montgomery and Soundararajan: Primes in short intervals, http://arxiv.org/abs/math/0409258). On the second page of the article, it is stated that the mean and variance of $\psi(n+h)-\psi(n)$, in a given range of $h$ and $n$, tend to $h$ and $h\log\frac{n}{h}$ respectively, where $\psi(n)$ is the second Chebyshev function. On the other … Read more
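A small empirical sketch (the window and $h$ below are illustrative choices, and the theorem is an asymptotic statement, so only rough agreement should be expected): compute $\psi(n+h)-\psi(n)$ over a window via the von Mangoldt function and compare mean and variance with $h$ and $h\log\frac{n}{h}$.

```python
import math
import numpy as np

def mangoldt(limit):
    # Lambda(p^k) = log p on prime powers, 0 elsewhere, via a sieve
    lam = np.zeros(limit + 1)
    is_comp = np.zeros(limit + 1, dtype=bool)
    for p in range(2, limit + 1):
        if not is_comp[p]:
            is_comp[2 * p :: p] = True      # mark proper multiples composite
            pk = p
            while pk <= limit:
                lam[pk] = math.log(p)
                pk *= p
    return lam

H, N0, N1 = 50, 100_000, 110_000            # illustrative window
psi = np.cumsum(mangoldt(N1 + H))            # psi[m] = sum_{j<=m} Lambda(j)
diffs = psi[N0 + H : N1 + H] - psi[N0:N1]    # psi(n+h) - psi(n), n in [N0, N1)
print(diffs.mean(), diffs.var(), "vs", H, H * math.log(N0 / H))
```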