The characteristic function of a real random variable X is a function that encodes all the information about the probability distribution of X. In other words, the characteristic function
uniquely determines the distribution of X.
The characteristic function φ_X(u) of a random variable X is defined as:
φ_X(u) = E[exp(i * u * X)],
where:
- E[.] represents the expectation (or mean) with respect to the distribution of X.
- u is any real number, acting as a "frequency" parameter in the sense of Fourier analysis.
- i is the imaginary unit (i² = -1).
For each value of u, the characteristic function therefore computes the mean of the complex exponential exp(i * u * X) associated with the random variable X.
In the characteristic function, u acts as a parameter that "probes" or "transforms" the distribution of X.
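As a quick illustration, here is a minimal Python sketch (assuming NumPy is available; the sample size and grid of u values are arbitrary choices) that estimates φ_X(u) by averaging exp(i * u * X) over Monte Carlo samples of a standard normal variable, and compares the estimate with the known closed form exp(-u²/2):

```python
import numpy as np

# Minimal sketch: estimate phi_X(u) = E[exp(i*u*X)] by Monte Carlo for a
# standard normal X and compare with the known closed form exp(-u**2 / 2).
rng = np.random.default_rng(0)
samples = rng.standard_normal(100_000)   # draws of X ~ N(0, 1)

for u in (-2.0, -1.0, 0.0, 1.0, 2.0):
    phi_empirical = np.mean(np.exp(1j * u * samples))   # estimated E[exp(i*u*X)]
    phi_exact = np.exp(-u**2 / 2)                        # closed form for N(0, 1)
    print(f"u = {u:+.1f}   empirical = {phi_empirical:.4f}   exact = {phi_exact:.4f}")
```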
The characteristic function has the remarkable property of uniquely determining the probability distribution of X. This means that if we know the characteristic function, we know the complete
distribution of the random variable.
If the random variable X has a probability density function f_X(x) [1], then the characteristic function is linked to this density via the Fourier transform:
φ_X(u) = ∫ from -∞ to +∞ [exp(i * u * x) * f_X(x) dx].
And we can invert this relation: the density f_X(x) can be recovered from the characteristic function using the inverse Fourier transform:
f_X(x) = (1 / 2π) * ∫ from -∞ to +∞ [exp(-i * u * x) * φ_X(u) du].
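To make the inversion concrete, the following sketch (again assuming NumPy; the truncation range and grid size are arbitrary) recovers the standard normal density at a few points from its characteristic function exp(-u²/2), by discretizing the inverse Fourier integral with a simple Riemann sum:

```python
import numpy as np

# Minimal sketch: recover the standard normal density from its characteristic
# function phi(u) = exp(-u**2 / 2) via the inversion formula
#   f_X(x) = (1 / 2π) * ∫ exp(-i*u*x) * phi(u) du,
# approximated by a Riemann sum on a truncated grid of u values.
u = np.linspace(-10.0, 10.0, 2001)     # phi is negligible outside this range
du = u[1] - u[0]
phi = np.exp(-u**2 / 2)                # characteristic function of N(0, 1)

for x in (-1.0, 0.0, 1.0):
    integrand = np.exp(-1j * u * x) * phi
    f_x = (np.sum(integrand) * du).real / (2 * np.pi)
    f_exact = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)   # standard normal pdf
    print(f"x = {x:+.1f}   recovered = {f_x:.5f}   exact = {f_exact:.5f}")
```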
If X is a random variable, the k-th moment, if it exists, is defined as the real number E[X^k]. For example, the expectation (or mean) of a random variable is its first moment, and its variance is its second central moment.
If X is a finite discrete random variable, its k-th moment is calculated using the formula:
m_k = Σ (x_i^k * P(X = x_i)), where the summation is over all possible values x_i of X.
If X has a probability density function f, the k-th moment is given by:
m_k = ∫ from -∞ to +∞ (x^k * f(x) dx), where the integral is taken over the entire real line.
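As a small worked example (the die and the normal variable are arbitrary choices, assuming NumPy), the sketch below applies the discrete formula to a fair six-sided die and approximates the continuous formula by Monte Carlo for a standard normal variable:

```python
import numpy as np

# Minimal sketch: k-th moments of a fair six-sided die via the discrete formula,
# and a Monte Carlo approximation of the continuous formula for X ~ N(0, 1).
values = np.arange(1, 7)        # faces of the die: 1, ..., 6
probs = np.full(6, 1 / 6)       # P(X = x_i) = 1/6 for each face

for k in (1, 2):
    m_k = np.sum(values**k * probs)      # m_k = sum of x_i**k * P(X = x_i)
    print(f"die: m_{k} = {m_k:.4f}")     # m_1 = 3.5, m_2 ≈ 15.1667

# Continuous case, approximated by averaging x**k over samples instead of
# integrating x**k * f(x): the exact values are m_1 = 0 and m_2 = 1 for N(0, 1).
rng = np.random.default_rng(2)
samples = rng.standard_normal(100_000)
for k in (1, 2):
    print(f"N(0, 1): m_{k} ≈ {np.mean(samples**k):.4f}")
```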
By differentiating the characteristic function at u = 0, we can systematically retrieve all the moments (mean, variance, skewness, etc.) without integrating against the original probability density: if the k-th moment exists, then m_k = E[X^k] = (1 / i^k) * φ_X^(k)(0), where φ_X^(k) denotes the k-th derivative. This makes the distribution easier to analyze, as illustrated in the sketch below.
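For instance, the following sketch (assuming NumPy; μ = 2, σ = 3, and the step size h are arbitrary choices, and the closed-form characteristic function of a normal variable is used) approximates the first two derivatives of φ_X at u = 0 by finite differences and reads off the mean, the second raw moment, and the variance:

```python
import numpy as np

# Minimal sketch: recover moments of X ~ N(mu, sigma^2) from derivatives of its
# characteristic function at u = 0, using m_k = E[X^k] = phi^(k)(0) / i**k,
# with the derivatives approximated by central finite differences.
mu, sigma = 2.0, 3.0            # arbitrary example parameters

def phi(u):
    # Closed-form characteristic function of N(mu, sigma^2).
    return np.exp(1j * mu * u - 0.5 * sigma**2 * u**2)

h = 1e-4
m1 = (phi(h) - phi(-h)) / (2 * h) / 1j                  # first moment (the mean)
m2 = (phi(h) - 2 * phi(0.0) + phi(-h)) / h**2 / 1j**2   # second raw moment
print("m1       ≈", round(m1.real, 4), " expected:", mu)
print("m2       ≈", round(m2.real, 4), " expected:", mu**2 + sigma**2)
print("variance ≈", round((m2 - m1**2).real, 4), " expected:", sigma**2)
```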
The exponential, exp(i * u * X), is fundamental in Fourier analysis, allowing the distribution of X to be transformed into the frequency domain.
The exponential function is also easy to differentiate and integrate, and it simplifies calculations involving sums of independent random variables: if X and Y are independent, then φ_{X+Y}(u) = φ_X(u) * φ_Y(u).
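As a sketch of this last point (assuming NumPy; the choice of a normal X and an exponential Y, and the sample size, are arbitrary), one can check numerically that the empirical characteristic function of X + Y matches the product of the individual ones when X and Y are independent:

```python
import numpy as np

# Minimal sketch: for independent X and Y, phi_{X+Y}(u) = phi_X(u) * phi_Y(u).
# We check this with Monte Carlo samples, X ~ N(0, 1) and Y ~ Exp(1).
rng = np.random.default_rng(1)
x = rng.standard_normal(200_000)
y = rng.exponential(1.0, 200_000)

def ecf(samples, u):
    # Empirical characteristic function: sample mean of exp(i*u*sample).
    return np.mean(np.exp(1j * u * samples))

for u in (0.5, 1.0, 2.0):
    lhs = ecf(x + y, u)           # characteristic function of the sum
    rhs = ecf(x, u) * ecf(y, u)   # product of the individual characteristic functions
    print(f"u = {u}:  phi_sum = {lhs:.4f}   phi_X * phi_Y = {rhs:.4f}")
```

The two quantities agree up to Monte Carlo error, which illustrates why characteristic functions are convenient for studying sums of independent random variables.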
[1] A probability density function (PDF), denoted f_X(x), is a function that describes how likely a continuous random variable X is to take values near a given point x.