
> Note that the points are no longer independent of each other, hence they are no longer uniformly distributed.

This is false. Random variables can be uniform even if they are not independent.

For example, let X be uniform(0, 1), and let Y = (X + 0.1) mod 1. Clearly they are not independent. Clearly X is uniform. When you realize that you could redefine the variables as Y = uniform(0, 1) and X = (Y - 0.1) mod 1, then it becomes obvious that Y is also uniform.

It is a common problem to know a variable's distribution without knowing whether it is independent.
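A quick simulation sketch (my own, not from the thread) makes the point concrete: both marginals pass a crude uniformity check, yet Y is a deterministic function of X, so the pair is maximally dependent.

```python
import random

random.seed(0)
n = 100_000
xs = [random.random() for _ in range(n)]          # X ~ uniform(0, 1)
ys = [(x + 0.1) % 1 for x in xs]                  # Y = (X + 0.1) mod 1

def decile_counts(vals):
    # Count how many values fall in each of the ten bins [0, 0.1), ..., [0.9, 1).
    counts = [0] * 10
    for v in vals:
        counts[min(int(v * 10), 9)] += 1
    return counts

# Both marginals look uniform: means near 0.5, each decile holds roughly 10%.
print(sum(xs) / n, sum(ys) / n)
print(decile_counts(ys))

# But the pair is completely dependent: Y is determined by X.
assert all(abs(((x + 0.1) % 1) - y) < 1e-12 for x, y in zip(xs, ys))
```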



Good catch. Probably what the article meant to say is that the lack of independence means the points are not conditionally uniform. That is, although X and Y follow uniform distributions individually, the dependence between them means that once you observe X, your resulting belief about Y is no longer a uniform distribution. Compare this to the independent case where knowing X would tell you nothing about Y, so your belief about Y conditioned on X would still just be uniform.
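To illustrate the conditional-vs-marginal distinction numerically (a sketch under my own setup, not from the article): restrict to samples where X lands in a narrow bin and look at how spread out Y is. In the dependent case Y is pinned to one value; in the independent case it still ranges over nearly all of (0, 1).

```python
import random

random.seed(1)
n = 200_000
# Dependent pair: Y = (X + 0.1) mod 1.
pairs_dep = [(x, (x + 0.1) % 1) for x in (random.random() for _ in range(n))]
# Independent pair: X and Y drawn separately.
pairs_ind = [(random.random(), random.random()) for _ in range(n)]

def conditional_spread(pairs, lo=0.30, hi=0.31):
    # Range of Y among samples whose X falls in the small window [lo, hi).
    ys = [y for x, y in pairs if lo <= x < hi]
    return max(ys) - min(ys)

print(conditional_spread(pairs_dep))   # tiny: Y is pinned near 0.4
print(conditional_spread(pairs_ind))   # close to 1: Y is still uniform
```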


You could use a much simpler example:

Let X be uniform(0,1) and let Y=X. Clearly they are not independent. Clearly X is uniform. And it's obvious that Y is also uniform.

The comment you quote means "the points are no longer independent of each other, hence each point is no longer uniformly distributed given the others". In your example, Y doesn't have a uniform distribution once X is fixed.


I noticed this too. Using the contrapositive, he is arguing that all uniformly distributed variables are independent!



