Let X = sum x_i, Y = sum y_i (over all i)

where ∀i : x_i, y_i, X, Y are random variables

If ∀i : E[x_i] > E[y_i], we can say E[X] > E[Y]

This follows from linearity of expectation: E[X] = Σ E[x_i] > Σ E[y_i] = E[Y].
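As a quick sanity check of that step (a minimal sketch; the dice are my own illustrative choice, not from the question):

```python
import random

random.seed(0)
N = 100_000

# x_i: fair 6-sided die (E[x_i] = 3.5); y_i: fair 4-sided die (E[y_i] = 2.5).
# E[x_i] > E[y_i] for each i, so by linearity E[X] > E[Y] for the sums.
def sum_of_dice(sides, n_dice):
    return sum(random.randint(1, sides) for _ in range(n_dice))

mean_X = sum(sum_of_dice(6, 3) for _ in range(N)) / N  # E[X] = 10.5
mean_Y = sum(sum_of_dice(4, 3) for _ in range(N)) / N  # E[Y] = 7.5
print(mean_X, mean_Y)
```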

I'm looking for a theorem with a similar "structure" which I can use to show this:

If ∀i,k : P(x_i > k) > P(y_i > k), then ∀k : P(X > k) > P(Y > k)

I first had the idea of trying to show this with linearity of expectation before realizing it didn't quite fit, but there still seems to be some common structure running through both.

I'm not sure whether this is a meaningful question, but can we say anything in general about which functions f of random variables satisfy this criterion?

∀i : f(x_i) > f(y_i) ⇒ f(X) > f(Y)

This question is inspired by the biased coin example from wikipedia: https://en.wikipedia.org/wiki/Coupling_(probability)#Biased_coins
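For independent flips, the coupling argument from that biased-coin example can be sketched directly (the probabilities 0.6 and 0.5 are my own illustrative values):

```python
import random

random.seed(1)

# Two biased coins with P(heads) = p and q, where p > q.
p, q = 0.6, 0.5

# Coupling: drive both coins with the SAME uniform draw per flip.
# Since q < p, the q-coin shows heads only when the p-coin does,
# so the q-coin's head count is <= the p-coin's on every sample path,
# which immediately gives P(sum_q > k) <= P(sum_p > k) for all k.
flips = [(int(u < p), int(u < q)) for u in (random.random() for _ in range(10_000))]
total_p = sum(fp for fp, _ in flips)
total_q = sum(fq for _, fq in flips)

assert all(fq <= fp for fp, fq in flips)  # pointwise dominance under the coupling
print(total_p, total_q)
```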

## looking for a probability theorem


### Re: looking for a probability theorem

I don't know of a general theorem, but you do need to assume that the x_i are independent of one another, and likewise the y_i; otherwise there are counterexamples.

### Re: looking for a probability theorem

Can you give a counterexample? I can't seem to think of any off the top of my head.

### Re: looking for a probability theorem

Let x_1 and x_2 be independent standard normal random variables (mean 0, variance 1).

Let y_1 be another normal random variable with mean -0.1, variance 1, and let y_2 = y_1 (so the y_i are not independent).

Then for all i,k, P(x_i > k) > P(y_i > k).

But X = x_1+x_2 is a normal random variable with mean 0, variance 2, while Y = y_1+y_2 = 2*y_1 is a normal random variable with mean -0.2, variance 4. Go sufficiently far to the right, and the greater variance of Y swamps the slightly greater mean of X. For example, with k=2, P(X > 2) ≈ 0.0786 while P(Y > 2) ≈ 0.1357.
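Those two tail probabilities are easy to verify numerically (a standard-library sketch; the `tail` helper is mine, using the usual normal-tail formula via erfc):

```python
from math import erfc, sqrt

def tail(k, mean, var):
    """P(N(mean, var) > k), computed via the complementary error function."""
    return 0.5 * erfc((k - mean) / sqrt(2 * var))

# X ~ N(0, 2) and Y ~ N(-0.2, 4) from the counterexample, evaluated at k = 2.
p_x = tail(2, 0.0, 2.0)   # ≈ 0.0786
p_y = tail(2, -0.2, 4.0)  # ≈ 0.1357
print(p_x, p_y, p_y > p_x)
```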

