Uncorrelatedness (probability theory)
Concept in probability theory
In probability theory and statistics, two real-valued random variables, X, Y, are said to be uncorrelated if their covariance, \operatorname{cov}[X,Y] = \operatorname{E}[XY] - \operatorname{E}[X] \operatorname{E}[Y], is zero. If two variables are uncorrelated, there is no linear relationship between them.
Uncorrelated random variables have a Pearson correlation coefficient of zero whenever that coefficient exists; in the trivial case where either variable has zero variance (is a constant), the correlation coefficient is undefined.
In general, uncorrelatedness is not the same as orthogonality, except in the special case where at least one of the two random variables has an expected value of 0. In this case, the covariance is the expectation of the product, and X and Y are uncorrelated if and only if \operatorname{E}[XY] = 0.
If X and Y are independent, with finite second moments, then they are uncorrelated. However, not all uncorrelated variables are independent.
Definition
Definition for two real random variables
Two random variables X,Y are called uncorrelated if their covariance \operatorname{Cov}[X,Y]=\operatorname{E}[(X-\operatorname{E}[X]) (Y-\operatorname{E}[Y])] is zero. Formally:
X,Y \text{ uncorrelated} \quad \iff \quad \operatorname{cov}[X,Y] = 0 \quad \iff \quad \operatorname{E}[XY] = \operatorname{E}[X] \cdot \operatorname{E}[Y]
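As a numerical illustration, the sketch below estimates this covariance from samples with NumPy; the normal distributions, sample size, and seed are arbitrary choices made for the example, not part of the definition.

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw two independent (hence uncorrelated) samples; 100_000 is an
# arbitrary size chosen so the estimate is reasonably stable.
x = rng.normal(size=100_000)
y = rng.normal(size=100_000)

# Sample version of cov[X, Y] = E[XY] - E[X]E[Y]; only approximately
# zero for finitely many samples.
cov_xy = np.mean(x * y) - np.mean(x) * np.mean(y)
print(cov_xy)  # close to 0
```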
Definition for two complex random variables
Two complex random variables Z,W are called uncorrelated if both their covariance \operatorname{K}_{ZW}=\operatorname{E}[(Z-\operatorname{E}[Z])\overline{(W-\operatorname{E}[W])}] and their pseudo-covariance \operatorname{J}_{ZW}=\operatorname{E}[(Z-\operatorname{E}[Z]) (W-\operatorname{E}[W])] are zero, i.e.
Z,W \text{ uncorrelated} \quad \iff \quad \operatorname{E}[Z\overline{W}] = \operatorname{E}[Z] \cdot \operatorname{E}[\overline{W}] \text{ and } \operatorname{E}[ZW] = \operatorname{E}[Z] \cdot \operatorname{E}[W]
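The same numerical check extends to the complex case. In this illustrative NumPy sketch the two variables are constructed independently (an assumption of the example), so the estimates of both the covariance and the pseudo-covariance should be near zero.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Two independent complex random variables, so both conditions hold.
z = rng.normal(size=n) + 1j * rng.normal(size=n)
w = rng.normal(size=n) + 1j * rng.normal(size=n)

# The covariance conjugates W; the pseudo-covariance does not.
k_zw = np.mean(z * np.conj(w)) - np.mean(z) * np.mean(np.conj(w))
j_zw = np.mean(z * w) - np.mean(z) * np.mean(w)
print(abs(k_zw), abs(j_zw))  # both near 0
```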
Definition for more than two random variables
A set of two or more random variables X_1,\ldots,X_n is called uncorrelated if each pair of them is uncorrelated. This is equivalent to the requirement that the non-diagonal elements of the autocovariance matrix \operatorname{K}_{\mathbf{X}\mathbf{X}} of the random vector \mathbf{X} = [X_1 \ldots X_n]^\mathrm{T} are all zero. The autocovariance matrix is defined as: :\operatorname{K}_{\mathbf{X}\mathbf{X}} = \operatorname{cov}[\mathbf{X},\mathbf{X}] = \operatorname{E}[(\mathbf{X}-\operatorname{E}[\mathbf{X}])(\mathbf{X}-\operatorname{E}[\mathbf{X}])^{\rm T}] = \operatorname{E}[\mathbf{X} \mathbf{X}^{\rm T}] - \operatorname{E}[\mathbf{X}]\operatorname{E}[\mathbf{X}]^{\rm T}
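A small NumPy sketch makes the matrix condition concrete; the three independent normal components below are an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000

# Rows are observations of X = (X_1, X_2, X_3) with independent
# components, so every pair is uncorrelated.
X = rng.normal(size=(n, 3))

centered = X - X.mean(axis=0)
K = centered.T @ centered / n   # estimate of E[(X - EX)(X - EX)^T]
print(np.round(K, 3))           # non-diagonal entries near 0
```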
Examples of dependence without correlation
Main article: Correlation and dependence
Example 1
- Let X be a random variable that takes the value 0 with probability 1/2, and takes the value 1 with probability 1/2.
- Let Y be a random variable, independent of X, that takes the value −1 with probability 1/2, and takes the value 1 with probability 1/2.
- Let U be a random variable constructed as U=XY. The claim is that U and X have zero covariance (and thus are uncorrelated), but are not independent.
Proof:
Taking into account that :\operatorname{E}[U] = \operatorname{E}[XY] = \operatorname{E}[X] \operatorname{E}[Y] = \operatorname{E}[X] \cdot 0 = 0, where the second equality holds because X and Y are independent, one gets : \begin{align} \operatorname{cov}[U,X] & = \operatorname{E}[(U-\operatorname{E}[U])(X-\operatorname{E}[X])] = \operatorname{E}[U(X-\tfrac12)] \\ & = \operatorname{E}[X^2 Y - \tfrac12 XY] = \operatorname{E}[(X^2-\tfrac12 X)Y] = \operatorname{E}[X^2-\tfrac12 X] \operatorname{E}[Y] = 0 \end{align}
Therefore, U and X are uncorrelated.
Independence of U and X means that for all a and b, \Pr(U=a\mid X=b) = \Pr(U=a). This is not true, in particular, for a=1 and b=0.
- \Pr(U=1\mid X=0) = \Pr(XY=1\mid X=0) = 0
- \Pr(U=1) = \Pr(XY=1) = 1/4
Thus \Pr(U=1\mid X=0) \ne \Pr(U=1), so U and X are not independent.
Q.E.D.
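A Monte Carlo sketch of this example (the sample size and seed are arbitrary) reproduces both halves of the argument numerically.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

x = rng.integers(0, 2, size=n)   # 0 or 1, each with probability 1/2
y = rng.choice([-1, 1], size=n)  # -1 or 1, each with probability 1/2
u = x * y

# Near-zero sample covariance, matching cov[U, X] = 0 ...
print(np.mean(u * x) - np.mean(u) * np.mean(x))

# ... yet clearly dependent: P(U=1 | X=0) = 0 while P(U=1) = 1/4.
print(np.mean(u[x == 0] == 1), np.mean(u == 1))
```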
Example 2
If X is a continuous random variable uniformly distributed on [-1,1] and Y = X^2, then X and Y are uncorrelated even though X determines Y and a particular value of Y can be produced by only one or two values of X:
f_X(t)= {1 \over 2} I_{[-1,1]}(t) ; \qquad f_Y(t)= {1 \over {2 \sqrt{t}}} I_{(0,1]}(t)
On the other hand, f_{X,Y} is 0 on the triangle defined by 0 < X < Y < 1, although f_X \times f_Y is not null on this domain. Therefore f_{X,Y}(X,Y) \neq f_X(X) \times f_Y(Y) and the variables are not independent.
\operatorname{E}[X] = \frac{1-1}{4} = 0 ; \qquad \operatorname{E}[Y] = \frac{1^3 - (-1)^3}{3 \times 2} = \frac{1}{3}
\operatorname{cov}[X,Y] = \operatorname{E}\left[(X-\operatorname{E}[X])(Y-\operatorname{E}[Y])\right] = \operatorname{E}\left[X^3 - \frac{X}{3}\right] = \operatorname{E}[X^3] - \frac{\operatorname{E}[X]}{3} = 0
Therefore the variables are uncorrelated.
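The same conclusion can be checked numerically; in this illustrative NumPy sketch Y is built directly as X^2.

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.uniform(-1, 1, size=100_000)
y = x ** 2                        # Y is a deterministic function of X

# Sample estimate of cov[X, Y] = E[X^3] - E[X] E[X^2]; near 0 by the
# symmetry of the distribution, despite Y being fully determined by X.
print(np.mean(x * y) - np.mean(x) * np.mean(y))
```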
Generalizations
Uncorrelated random vectors
Two random vectors \mathbf{X}=(X_1,\ldots,X_m)^T and \mathbf{Y}=(Y_1,\ldots,Y_n)^T are called uncorrelated if :\operatorname{E}[\mathbf{X} \mathbf{Y}^T] = \operatorname{E}[\mathbf{X}]\operatorname{E}[\mathbf{Y}]^T.
They are uncorrelated if and only if their cross-covariance matrix \operatorname{K}_{\mathbf{X}\mathbf{Y}} is zero.
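As an illustration, the following NumPy sketch estimates the cross-covariance matrix of two independently generated vectors; the dimensions 2 and 3 are arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 100_000

X = rng.normal(size=(n, 2))  # observations of a 2-dimensional vector
Y = rng.normal(size=(n, 3))  # observations of an independent 3-dim vector

# Estimate of K_XY = E[X Y^T] - E[X] E[Y]^T (a 2x3 matrix here).
K_xy = X.T @ Y / n - np.outer(X.mean(axis=0), Y.mean(axis=0))
print(np.round(K_xy, 3))     # all entries near 0
```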
Two complex random vectors \mathbf{Z} and \mathbf{W} are called uncorrelated if their cross-covariance matrix and their pseudo-cross-covariance matrix are zero, i.e. if :\operatorname{K}_{\mathbf{Z}\mathbf{W}}=\operatorname{J}_{\mathbf{Z}\mathbf{W}}=0 where : \operatorname{K}_{\mathbf{Z}\mathbf{W}} =\operatorname{E}[(\mathbf{Z}-\operatorname{E}[\mathbf{Z}]){(\mathbf{W}-\operatorname{E}[\mathbf{W}])}^{\mathrm H}] and : \operatorname{J}_{\mathbf{Z}\mathbf{W}} =\operatorname{E}[(\mathbf{Z}-\operatorname{E}[\mathbf{Z}]){(\mathbf{W}-\operatorname{E}[\mathbf{W}])}^{\mathrm T}].
Uncorrelated stochastic processes
Two stochastic processes \left\{X_t\right\} and \left\{Y_t\right\} are called uncorrelated if their cross-covariance \operatorname{K}_{\mathbf{X}\mathbf{Y}}(t_1,t_2) = \operatorname{E} \left[ \left( X(t_1)- \mu_X(t_1) \right) \left( Y(t_2)- \mu_Y(t_2) \right) \right] is zero for all times. Formally:
:\left\{X_t\right\},\left\{Y_t\right\} \text{ uncorrelated} \quad \iff \quad \forall t_1,t_2 \colon \operatorname{K}_{\mathbf{X}\mathbf{Y}}(t_1,t_2) = 0.
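A final illustrative sketch treats each row of an array as one realization of a discrete-time process; the white-noise construction and the particular pair of times are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(6)
reps, T = 50_000, 10

# Many independent realizations; each row is one sample path of length T.
X = rng.normal(size=(reps, T))   # e.g. white noise
Y = rng.normal(size=(reps, T))   # a second process, independent of X

t1, t2 = 2, 7                    # an arbitrary pair of times
k = np.mean((X[:, t1] - X[:, t1].mean()) * (Y[:, t2] - Y[:, t2].mean()))
print(k)                         # near 0; the same holds for every (t1, t2)
```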