Conditional Mean, Variance, and MMSE
In this blog post, we’ll explore the connections among the conditional mean, the conditional variance, and conditional probability density functions (PDFs), and their relationship to the Minimum Mean Square Error (MMSE) estimator.
Conditional Probability Density Function
Consider two random variables, $X$ and $Y$, with joint PDF $f_{X,Y}(x,y)$ and marginal PDFs $f_X(x)$ and $f_Y(y)$.
Conditioning on a Realization
When conditioned on a specific realization $Y = y$, the conditional PDF of $X$ is
$$f_{X\mid Y}(x\mid y) = \frac{f_{X,Y}(x,y)}{f_Y(y)},$$
and the corresponding conditional mean and conditional variance are
$$E[X\mid Y=y] = \int x\, f_{X\mid Y}(x\mid y)\,dx, \qquad \operatorname{var}(X\mid Y=y) = E\big[\big(X - E[X\mid Y=y]\big)^2 \,\big|\, Y=y\big].$$
Both are ordinary (deterministic) numbers that depend on the particular value $y$.
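As a concrete example, if $X$ and $Y$ are jointly Gaussian with means $\mu_X$, $\mu_Y$, variances $\sigma_X^2$, $\sigma_Y^2$, and correlation coefficient $\rho$, then
$$E[X\mid Y=y] = \mu_X + \rho\,\frac{\sigma_X}{\sigma_Y}(y - \mu_Y), \qquad \operatorname{var}(X\mid Y=y) = \sigma_X^2\,(1-\rho^2).$$
The conditional mean varies linearly with $y$, while in this special case the conditional variance does not depend on $y$ at all.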
Conditioning on a Random Variable
When conditioned on the random variable $Y$ itself (rather than on a fixed value $y$), the conditional mean $E[X\mid Y]$ and the conditional variance $\operatorname{var}(X\mid Y)$ are themselves random variables: they are the functions of $Y$ obtained by evaluating the expressions above at the random outcome of $Y$.
MMSE Estimator
The MMSE estimator is defined as the estimator $\hat{X}(Y)$ that minimizes the mean square error $E\big[(X - \hat{X}(Y))^2\big]$, and it is given by the conditional mean:
$$\hat{X}_{\mathrm{MMSE}}(Y) = E[X \mid Y].$$
For a specific realization $Y = y$, the estimate is simply the number $E[X \mid Y = y]$.
The MSE for a particular realization $Y = y$ is the conditional variance,
$$E\big[\big(X - E[X\mid Y=y]\big)^2 \,\big|\, Y = y\big] = \operatorname{var}(X \mid Y = y).$$
However, our primary interest lies in the MSE of the estimator averaged across all realizations of $Y$:
$$\mathrm{MSE} = E\big[\big(X - E[X\mid Y]\big)^2\big] = E\big[\operatorname{var}(X\mid Y)\big],$$
where the outer expectation is taken over $Y$.
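To make this concrete, here is a small Monte Carlo sketch in NumPy; the joint model used below is an illustrative assumption, not something from the derivation above. It estimates the MSE of the conditional-mean estimator and compares it with the best affine estimator of $X$ from $Y$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Illustrative model (an assumption for this sketch):
#   Y ~ Uniform(0, 2*pi),  X | Y = y ~ N(sin(4y), 0.5**2)
# so E[X|Y] = sin(4Y) and var(X|Y) = 0.25 for every realization.
y = rng.uniform(0.0, 2.0 * np.pi, size=n)
x = np.sin(4.0 * y) + rng.normal(0.0, 0.5, size=n)

# MSE of the conditional-mean (MMSE) estimator.
mse_mmse = np.mean((x - np.sin(4.0 * y)) ** 2)

# Best affine competitor a + b*Y, fitted by least squares.
b, a = np.polyfit(y, x, 1)
mse_affine = np.mean((x - (a + b * y)) ** 2)

print(f"MSE of E[X|Y]      : {mse_mmse:.3f}")    # ~ 0.25 = E[var(X|Y)]
print(f"MSE of best affine : {mse_affine:.3f}")  # noticeably larger (~0.73)
```

The conditional mean attains the smallest possible MSE, and that MSE is exactly $E[\operatorname{var}(X\mid Y)]$.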
An intriguing result emerges when exploring the relationship between the expectation of the conditional variance and the variance of the conditional expectation, known as the law of total variance:
$$\operatorname{var}(X) = E\big[\operatorname{var}(X\mid Y)\big] + \operatorname{var}\big(E[X\mid Y]\big).$$
Proof: By definition, $\operatorname{var}(X\mid Y) = E[X^2\mid Y] - \big(E[X\mid Y]\big)^2$. Taking the expectation over $Y$ and using the law of total expectation, $E\big[E[X^2\mid Y]\big] = E[X^2]$, gives
$$E\big[\operatorname{var}(X\mid Y)\big] = E[X^2] - E\Big[\big(E[X\mid Y]\big)^2\Big].$$
Similarly, since $E\big[E[X\mid Y]\big] = E[X]$,
$$\operatorname{var}\big(E[X\mid Y]\big) = E\Big[\big(E[X\mid Y]\big)^2\Big] - \big(E[X]\big)^2.$$
Adding the two expressions yields $E[X^2] - \big(E[X]\big)^2 = \operatorname{var}(X)$, which completes the proof. In particular, the MSE of the MMSE estimator can be written as $E\big[\operatorname{var}(X\mid Y)\big] = \operatorname{var}(X) - \operatorname{var}\big(E[X\mid Y]\big)$.
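As a quick numerical check of the decomposition, here is a short NumPy sketch; the hierarchical model it simulates is an illustrative assumption chosen so that $E[X\mid Y]$ and $\operatorname{var}(X\mid Y)$ have simple closed forms:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2_000_000

# Illustrative hierarchical model (an assumption for this sketch):
#   Y ~ Exponential(1),  X | Y = y ~ N(mean=y, var=y)
# so E[X|Y] = Y and var(X|Y) = Y by construction.
y = rng.exponential(1.0, size=n)
x = rng.normal(loc=y, scale=np.sqrt(y))

total_var = np.var(x)                 # var(X)
decomposed = np.mean(y) + np.var(y)   # E[var(X|Y)] + var(E[X|Y])

print(f"var(X)                    : {total_var:.3f}")   # ~ 2.0
print(f"E[var(X|Y)] + var(E[X|Y]) : {decomposed:.3f}")  # ~ 2.0
```

Both quantities come out close to the exact value $\operatorname{var}(X) = E[Y] + \operatorname{var}(Y) = 2$.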
MMSE Estimator for Linear Model
As a supplementary note, when dealing with random vectors (denoted by lowercase bold letters), consider the linear model
$$\mathbf{y} = \mathbf{H}\mathbf{x} + \mathbf{n},$$
with a Gaussian prior on the signal vector, $\mathbf{x} \sim \mathcal{N}(\boldsymbol{\mu}_x, \mathbf{C}_x)$, a known observation matrix $\mathbf{H}$, and independent additive Gaussian noise $\mathbf{n} \sim \mathcal{N}(\mathbf{0}, \mathbf{C}_n)$.
The posterior density of $\mathbf{x}$ given the observation $\mathbf{y}$ is also Gaussian, so it is fully described by its conditional mean and conditional covariance.
Here,
$$E[\mathbf{x}\mid\mathbf{y}] = \boldsymbol{\mu}_x + \mathbf{C}_x\mathbf{H}^T\big(\mathbf{H}\mathbf{C}_x\mathbf{H}^T + \mathbf{C}_n\big)^{-1}(\mathbf{y} - \mathbf{H}\boldsymbol{\mu}_x) = \boldsymbol{\mu}_x + \big(\mathbf{C}_x^{-1} + \mathbf{H}^T\mathbf{C}_n^{-1}\mathbf{H}\big)^{-1}\mathbf{H}^T\mathbf{C}_n^{-1}(\mathbf{y} - \mathbf{H}\boldsymbol{\mu}_x)$$
and
$$\mathbf{C}_{x\mid y} = \mathbf{C}_x - \mathbf{C}_x\mathbf{H}^T\big(\mathbf{H}\mathbf{C}_x\mathbf{H}^T + \mathbf{C}_n\big)^{-1}\mathbf{H}\mathbf{C}_x = \big(\mathbf{C}_x^{-1} + \mathbf{H}^T\mathbf{C}_n^{-1}\mathbf{H}\big)^{-1}.$$
Note that in both expressions, the alternative (second) forms are applicable only when the covariance matrices of the information signal vector and of the noise, $\mathbf{C}_x$ and $\mathbf{C}_n$, are invertible. Furthermore, we opted to use the notation $\mathbf{C}_{x\mid y}$ for the conditional (posterior) covariance $\operatorname{cov}(\mathbf{x}\mid\mathbf{y})$.
Remarkably, the conditional covariance is independent of the observation $\mathbf{y}$: it depends only on $\mathbf{H}$, $\mathbf{C}_x$, and $\mathbf{C}_n$. As a consequence, the MSE of the MMSE estimator, $E\big[\lVert \mathbf{x} - E[\mathbf{x}\mid\mathbf{y}] \rVert^2\big] = \operatorname{tr}\big(\mathbf{C}_{x\mid y}\big)$, can be evaluated without averaging over $\mathbf{y}$.
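As a sanity check, here is a short NumPy sketch; the dimensions and parameter values are arbitrary choices made for illustration. It draws one realization of the linear model, computes the posterior mean and covariance using both forms above, and confirms that the two covariance expressions agree and never involve the observed $\mathbf{y}$:

```python
import numpy as np

rng = np.random.default_rng(2)

# Arbitrary illustrative dimensions and parameters (assumptions for this sketch):
# x ~ N(mu_x, C_x) in R^3, n ~ N(0, C_n) in R^5, and y = H x + n.
p, m = 3, 5
mu_x = np.zeros(p)
C_x = np.diag([2.0, 1.0, 0.5])
C_n = 0.1 * np.eye(m)
H = rng.standard_normal((m, p))

# Draw one realization of the signal and the observation.
x = rng.multivariate_normal(mu_x, C_x)
y = H @ x + rng.multivariate_normal(np.zeros(m), C_n)

# First form of the posterior mean and covariance.
G = C_x @ H.T @ np.linalg.inv(H @ C_x @ H.T + C_n)
post_mean = mu_x + G @ (y - H @ mu_x)
post_cov = C_x - G @ H @ C_x

# Alternative (information) form, valid here since C_x and C_n are invertible.
post_cov_alt = np.linalg.inv(np.linalg.inv(C_x) + H.T @ np.linalg.inv(C_n) @ H)

print("MMSE estimate :", np.round(post_mean, 3))
print("true signal x :", np.round(x, 3))
print("covariance forms agree:", np.allclose(post_cov, post_cov_alt))
# Note: post_cov depends only on H, C_x, C_n -- it never touches y.
```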
In conclusion, understanding the interplay between the conditional mean, the conditional variance, and conditional PDFs provides valuable insight into the MMSE estimator. Exploring these concepts, particularly in scenarios like the MMSE estimator for a linear model with a Gaussian prior and independent additive Gaussian noise, sheds light on the relationship among the posterior density, the conditional mean and covariance, and the MSE of the MMSE estimator.