Stochastic Processes — Time-Indexed Randomness

DhvaniAI

Status: placeholder. This page will become the dedicated treatment of stochastic processes, covering everything that random experiments (the subject of part0_what_is_a_distribution.md) deliberately leaves out: time indexing, autocorrelation, stationarity, power spectral density, and ergodicity.

The main probability series (parts 1–6 + applied_sensors) only needs single random variables, so this material is not on the critical path for those parts. Read it after part 0 once the question “what about a whole window of samples, not just one?” starts to matter — typically when you reach Part II (Signals and Measurement) or any time-series work.


Why This Page Exists

Part 0 introduces the random process (single trial, no time index) and notes in passing that a stochastic process is the time-indexed version. That distinction is enough to keep parts 1–6 honest, but it leaves real questions on the table: Are successive samples correlated? Do a signal's statistics drift over time? How is noise power distributed across frequency?

Each of these questions needs the language of stochastic processes. This page builds that language from first principles.


Intended Outline

The sections below are placeholders. Each will follow the same applied-first pattern as the rest of the probability series: one concrete sensor scenario → the math needed to describe it → numbers → cross-domain examples.

1. From Random Variable to Stochastic Process

2. Mean and Variance Over Time

3. Autocorrelation — Are Successive Samples Independent?

4. Stationarity

5. Power Spectral Density (PSD)

6. Ergodicity

7. Common Stochastic Processes

A short tour of the named processes that show up everywhere in signal processing and ML.
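As a preview of that tour, the simplest named processes can be simulated in a few lines of NumPy. This is an illustrative sketch only; the seed, AR coefficient, and sample count are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# White noise: i.i.d. Gaussian samples, no temporal correlation.
white = rng.normal(0.0, 1.0, size=n)

# AR(1): each sample is a damped copy of the previous one plus fresh noise,
# so successive samples ARE correlated.
phi = 0.9
ar1 = np.zeros(n)
for t in range(1, n):
    ar1[t] = phi * ar1[t - 1] + rng.normal(0.0, 1.0)

# Random walk (a discrete-time Wiener approximation): cumulative sum of noise.
# Its variance grows with time, so it is not stationary.
walk = np.cumsum(rng.normal(0.0, 1.0, size=n))

print(white.var(), ar1.var(), walk.var())
```

Note how the same recipe (draw noise, apply a rule) yields processes with very different correlation and stationarity behaviour, which is exactly what the sections above will formalize.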

8. Where Stochastic Processes Show Up Downstream

| Area | What stochastic processes give you |
| --- | --- |
| Vibration / condition monitoring | PSD-based fault detection; stationarity checks for “is the machine state changing?” |
| Audio | PSD, spectrograms, noise modelling for speech enhancement |
| Image sensors | Fixed-pattern noise vs read noise vs shot noise — different temporal correlation structure |
| Time series forecasting | Stationarity is a prerequisite for ARIMA; autocorrelation drives model order |
| Reinforcement learning | Markov decision processes, return distributions |
| Diffusion generative models | The forward process is a Wiener process; the reverse process is a learned SDE |
| Kalman filtering | Linear-Gaussian stochastic processes are the entire substrate |
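To make the PSD rows of the table concrete, here is a minimal periodogram sketch in plain NumPy: a 50 Hz tone buried in white noise shows up as a clear spectral peak. The sampling rate and tone frequency are made-up illustrative values, not from any particular sensor:

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 1000.0                      # hypothetical sampling rate, Hz
t = np.arange(2000) / fs
# A 50 Hz sinusoid plus unit-variance Gaussian noise.
x = np.sin(2 * np.pi * 50.0 * t) + rng.normal(0.0, 1.0, size=t.size)

# Periodogram estimate of the PSD: squared FFT magnitude, normalized.
X = np.fft.rfft(x)
psd = (np.abs(X) ** 2) / (fs * x.size)
freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)

peak = freqs[np.argmax(psd)]
print(f"PSD peak at {peak:.1f} Hz")
```

The tone is invisible in any single time-domain sample but dominates the PSD, which is why PSD-based fault detection works on vibration and audio data.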

Until This Page Is Filled In

The vocabulary table below is the minimum useful summary. It’s the same one in part0_what_is_a_distribution.md §6 — repeated here so this file is self-contained.

| Term | What it means | Time involved? |
| --- | --- | --- |
| Random process / experiment | Any procedure with an uncertain outcome | No |
| Stochastic process | A collection of random variables indexed by time: $\{X(t) : t \in T\}$ | Yes |
| Time series | Observed data from a stochastic process | Yes |
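The “collection of random variables indexed by time” definition can be made tangible with a small simulation, which also previews the ensemble-vs-time-average distinction at the heart of ergodicity (§6). This sketch assumes the simplest ergodic process, zero-mean white noise; the array sizes are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(2)

# A stochastic process as a collection of random variables X(t):
# 500 independent realizations (rows), each 2000 time steps long (columns).
ensemble = rng.normal(0.0, 1.0, size=(500, 2000))

# Ensemble average: fix a time index, average across realizations.
ensemble_mean_at_t0 = ensemble[:, 0].mean()

# Time average: fix one realization (one observed time series), average over time.
time_mean_of_run0 = ensemble[0, :].mean()

# For an ergodic process the two estimates agree (both near 0 here).
print(ensemble_mean_at_t0, time_mean_of_run0)
```

Each column of `ensemble` is one random variable $X(t)$; each row is one time series. Ergodicity is the license to estimate ensemble statistics from a single long row, which is what every real sensor log forces you to do.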