10# Posted on 2011-7-23 14:49
The bound on Var(x) is [E(|x|)^2 - E(x)^2, inf).
Regarding the variance bounds for this type of problem, the following is true regardless of symmetry.
Var(|x|) = E(|x|^2) – E(|x|)^2 = E(x^2) – E(|x|)^2
Var(x) = E(x^2) – E(x)^2
So, Var(|x|) + E(|x|)^2 = Var(x) + E(x)^2
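As a sanity check, here is a minimal NumPy sketch of that identity (the normal distribution and its parameters below are arbitrary; both sides are just estimates of E(x^2)):

```python
import numpy as np

# Check Var(|x|) + E(|x|)^2 == Var(x) + E(x)^2 on an arbitrary sample;
# both sides equal E(x^2), so they should agree up to sampling noise.
rng = np.random.default_rng(0)
x = rng.normal(loc=0.3, scale=2.0, size=1_000_000)  # any distribution works here

lhs = np.var(np.abs(x)) + np.mean(np.abs(x)) ** 2
rhs = np.var(x) + np.mean(x) ** 2
print(lhs, rhs, np.mean(x ** 2))  # all three are close to E(x^2)
```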
In this case, E(|x|) = 1 and E(x) = 0. So,
Var(|x|) + 1 = Var(x)
Since Var(|x|) >= 0, the minimum value of Var(x) is 0 + 1 = 1. Seemorr already produced a distribution showing Var(|x|) = 0 is attainable. (In general the minimum is E(|x|)^2 - E(x)^2; note that E(|x|)^2 >= E(x)^2 because |E(x)| <= E(|x|).)
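Seemorr's exact distribution isn't quoted in this post, but a symmetric two-point distribution is one way to attain the bound; a minimal sketch:

```python
import numpy as np

# One distribution attaining the lower bound: x = +1 or -1 with probability
# 1/2 each, so |x| is constant and Var(|x|) = 0.
rng = np.random.default_rng(1)
x = rng.choice([-1.0, 1.0], size=1_000_000)

print(np.mean(np.abs(x)))  # E(|x|) = 1 exactly
print(np.mean(x))          # E(x) ~ 0
print(np.var(np.abs(x)))   # Var(|x|) = 0
print(np.var(x))           # Var(x) ~ 1, the lower bound
```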
So what is the max? Well, suppose the distribution of |x| is lognormal. Within that family we can fix E(|x|) = 1 (or any constant) while making Var(|x|) arbitrarily large, so Var(x) = Var(|x|) + 1 is unbounded above.
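To make that concrete, the lognormal moments are available in closed form (no simulation needed): with parameters (mu, sigma) and mu = -sigma^2/2 we get E(|x|) = 1 for every sigma, while Var(|x|) = exp(sigma^2) - 1 grows without bound; attaching a random sign gives E(x) = 0 and Var(x) = Var(|x|) + 1.

```python
import numpy as np

# |x| ~ lognormal(mu, sigma) with mu = -sigma^2/2, so E(|x|) = exp(mu + sigma^2/2) = 1.
# A random sign makes E(x) = 0, and Var(x) = Var(|x|) + 1 = exp(sigma^2).
for sigma in (0.5, 1.0, 2.0, 3.0):
    var_abs = np.exp(sigma ** 2) - 1.0
    print(f"sigma={sigma}: Var(|x|)={var_abs:.2f}, Var(x)={var_abs + 1.0:.2f}")
```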
This is far from a rigorous proof, but it should also be clear from the above that mixtures of seemorr's distribution and a lognormal distribution for |x| can produce a distribution with E(|x|) = 1, E(x) = 0, and Var(x) equal to whatever value >= 1 we want.
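For instance, a mixture of the two pieces above can be tuned to any target Var(x) = v >= 1; a rough sketch (the function name and the sigma = 1.5 default are mine):

```python
import numpy as np

# Mix the two-point distribution (x = +/-1) with the signed lognormal above.
# Both pieces have E(|x|) = 1 and E(x) = 0, so the mixture does too, and
# E(x^2) = (1 - p)*1 + p*exp(sigma^2) hits any v >= 1 with
# p = (v - 1) / (exp(sigma^2) - 1).
def sample_target_variance(v, sigma=1.5, n=4_000_000, seed=3):
    rng = np.random.default_rng(seed)
    p = (v - 1.0) / (np.exp(sigma ** 2) - 1.0)  # weight on the lognormal piece
    assert 0.0 <= p <= 1.0, "increase sigma to reach this target variance"
    mag = np.where(rng.random(n) < p,
                   rng.lognormal(mean=-sigma ** 2 / 2, sigma=sigma, size=n),
                   1.0)
    return rng.choice([-1.0, 1.0], size=n) * mag

x = sample_target_variance(v=7.0)
# The lognormal tails make the variance estimate somewhat noisy.
print(np.mean(np.abs(x)), np.mean(x), np.var(x))  # roughly 1, 0, 7
```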