# Common Random Variables

Polynomial chaos expansion (PCE) is a Hilbert space technique for random variables with finite variance. Mathematically, it is akin to Fourier series expansions for periodic signals: PCE characterizes a random variable in terms of its PCE coefficients (also known as Fourier coefficients). That is, the PCE of a random variable $\mathsf{x}$ is given by

$$\mathsf{x} = \sum_{i=0}^L x_i \phi_i,$$

where $x_i$ are the so-called PCE coefficients, and $\phi_i$ are polynomials that are orthogonal relative to the probability density function of $\mathsf{x}$.
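Here, orthogonality is with respect to the inner product induced by the density: writing $\rho$ for the probability density function, the basis satisfies

$$\langle \phi_i, \phi_j \rangle = \int \phi_i(t) \, \phi_j(t) \, \rho(t) \, \mathrm{d}t = \delta_{ij} \, \langle \phi_i, \phi_i \rangle,$$

where $\delta_{ij}$ is the Kronecker delta. The polynomials need not be normalized, so $\langle \phi_i, \phi_i \rangle$ need not equal one.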

This tutorial walks you through the PCE of common random variables, namely Gaussian (`gaussian`), Beta (`beta01`), Uniform (`uniform01`), and Logistic (`logistic`), and shows how they are implemented in PolyChaos.

## Construction of Basis

```julia
using PolyChaos
```

The orthogonal polynomials are constructed using the `OrthoPoly` type (here of degree at most `d`). For canonical measures, special constructors are implemented:

```julia
d = 6

myops = Dict()
polynames = ["gaussian", "beta01", "uniform01", "logistic"]

# gaussian
gaussian = GaussOrthoPoly(d);
myops["gaussian"] = gaussian

# beta01
α, β = 1.3, 2.2
beta01 = Beta01OrthoPoly(d, α, β);
myops["beta01"] = beta01

# uniform01
uniform01 = Uniform01OrthoPoly(d);
myops["uniform01"] = uniform01

# logistic
logistic = LogisticOrthoPoly(d);
myops["logistic"] = logistic;

myops
```
```
Dict{Any, Any} with 4 entries:
"beta01"    => Beta01OrthoPoly{Vector{Float64}, Beta01Measure, Quad{Float64, …
"gaussian"  => GaussOrthoPoly{Vector{Float64}, GaussMeasure, Quad{Float64, Ve…
```

For example, let's evaluate the Gaussian basis polynomials at some points:

```julia
points, degrees = randn(10), 0:2:d

[ evaluate(degree, points, gaussian) for degree in degrees ]
```
```
4-element Vector{Vector{Float64}}:
[1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0]
[-0.9601239662521455, -0.8699125601541067, -0.15625124409256885, 1.1179764158636636, -0.41499317450587436, 0.7031083975118742, -0.05819187191864861, -0.45531948758712126, 0.07101847446015652, 2.4693039653329616]
[2.762333895580333, 2.2363981029302993, -1.3505805723492492, -5.222034397027291, -0.16780796709003942, -4.31807217139578, -1.763846218368009, 0.02859378612508401, -2.279030274125978, -5.779753788122758]
[-13.229366545554248, -9.397704900836997, 12.890689402581645, 22.522446887149027, 6.39202099176088, 23.071203796253048, 14.911713723191319, 5.222064253998134, 17.218167246169006, 2.3344137733267907]
```
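The first row is all ones because the degree-0 polynomial is the constant one. More generally, PolyChaos constructs monic orthogonal polynomials, so the values above can be checked against the closed forms of the monic (probabilists') Hermite polynomials of even degree:

$$\phi_0(\xi) = 1, \quad \phi_2(\xi) = \xi^2 - 1, \quad \phi_4(\xi) = \xi^4 - 6 \xi^2 + 3, \quad \phi_6(\xi) = \xi^6 - 15 \xi^4 + 45 \xi^2 - 15.$$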

## Finding PCE Coefficients

Having constructed the orthogonal bases, the question remains how to find the PCE coefficients for the common random variables. Each of the random variables above can be characterized exactly by just two PCE coefficients. For a Gaussian random variable, this is familiar: the mean and the variance suffice to describe it entirely. The same is true for any other random variable of finite variance, given the right basis. The function `convert2affinePCE` provides the first two PCE coefficients (hence the name *affine*) for the common random variables.
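Spelled out, an affine PCE keeps only the first two terms of the expansion,

$$\mathsf{x} = x_0 \phi_0(\xi) + x_1 \phi_1(\xi) = x_0 + x_1 \phi_1(\xi),$$

where $\xi$ is the germ associated with the basis, and $\phi_0 \equiv 1$.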

### Gaussian

Given the Gaussian random variable $\mathsf{x} \sim \mathcal{N}(\mu, \sigma^2)$ with $\sigma > 0$, the affine PCE coefficients are

```julia
# Gaussian
μ, σ = 2., 0.2
pce_gaussian = convert2affinePCE(μ, σ, gaussian)
```
```
2-element Vector{Float64}:
2.0
0.2
```
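For the monic Hermite basis, $\phi_1(\xi) = \xi$ with $\xi \sim \mathcal{N}(0,1)$, so the two coefficients reproduce the mean and the standard deviation directly:

$$\mathsf{x} = \mu + \sigma \xi, \qquad x_0 = \mu, \quad x_1 = \sigma.$$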

### Uniform

Given the uniform random variable $\mathsf{x} \sim \mathcal{U}(a, b)$ with finite support $a<b$, the affine PCE coefficients are

```julia
a, b = -0.3, 1.2
convert2affinePCE(a, b, uniform01)
```
```
2-element Vector{Float64}:
0.45
1.5
```
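These values follow from writing $\mathsf{x} = a + (b-a)\,\xi$ with $\xi \sim \mathcal{U}(0,1)$ and the monic first-degree polynomial $\phi_1(\xi) = \xi - \tfrac{1}{2}$:

$$x_0 = \frac{a+b}{2} = 0.45, \qquad x_1 = b - a = 1.5.$$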

If, instead, the expected value and standard deviation are known, the affine PCE coefficients of the uniform random variable are

```julia
pce_uniform = convert2affinePCE(μ, σ, uniform01; kind="μσ")
# notice that the zero-order coefficient is equal to the expected value μ
```
```
2-element Vector{Float64}:
2.0
0.6928203230275509
```

### Beta

Given the Beta random variable $\mathsf{x} \sim \mathcal{B}(a, b, \alpha, \beta)$ with finite support $a<b$ and shape parameters $\alpha, \beta > 0$, the affine PCE coefficients are

```julia
convert2affinePCE(a, b, beta01)
```
```
2-element Vector{Float64}:
0.2571428571428572
1.5
```
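These values follow from $\mathsf{x} = a + (b-a)\,\xi$ with $\xi \sim \mathcal{B}(\alpha,\beta)$ on $[0,1]$, whose mean is $\alpha/(\alpha+\beta)$; the monic first-degree polynomial is $\phi_1(\xi) = \xi - \alpha/(\alpha+\beta)$, so

$$x_0 = a + (b-a)\,\frac{\alpha}{\alpha+\beta} \approx 0.2571, \qquad x_1 = b - a = 1.5.$$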

If, instead, the expected value and standard deviation are known, the affine PCE coefficients of the Beta random variable are

```julia
pce_beta = convert2affinePCE(μ, σ, beta01; kind="μσ")
```
```
2-element Vector{Float64}:
2.0
0.8780541105074454
```

### Logistic

Given the logistic random variable $\mathsf{x} \sim \mathcal{L}(a_1,a_2)$ with $a_2>0$ and probability density function

$$\rho(t) = \frac{1}{4 a_2} \, \operatorname{sech}^2 \left(\frac{t-a_1}{2a_2}\right),$$

the affine PCE coefficients of the logistic random variable are

```julia
a1, a2 = μ, sqrt(3)*σ/pi
pce_logistic = convert2affinePCE(a1, a2, logistic)
```
```
2-element Vector{Float64}:
2.0
0.11026577908435842
```
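The choice of $a_1$ and $a_2$ above follows from the moments of the logistic distribution,

$$\mathbb{E}[\mathsf{x}] = a_1, \qquad \operatorname{std}[\mathsf{x}] = \frac{\pi a_2}{\sqrt{3}},$$

so matching the mean $\mu$ and standard deviation $\sigma$ gives $a_1 = \mu$ and $a_2 = \sqrt{3}\,\sigma/\pi$.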

## Moments

It is a key feature of PCE to compute moments from the PCE coefficients alone; no sampling is required.
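Concretely, orthogonality gives the mean and variance directly in terms of the coefficients:

$$\mathbb{E}[\mathsf{x}] = x_0, \qquad \operatorname{Var}[\mathsf{x}] = \sum_{i=1}^{L} x_i^2 \, \langle \phi_i, \phi_i \rangle.$$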

### Gaussian

```julia
mean(pce_gaussian, myops["gaussian"]), std(pce_gaussian, myops["gaussian"])
```
```
(2.0, 0.2)
```

### Uniform

```julia
mean(pce_uniform, myops["uniform01"]), std(pce_uniform, myops["uniform01"])
```
```
(2.0, 0.19999999999999998)
```

### Beta

```julia
mean(pce_beta, myops["beta01"]), std(pce_beta, myops["beta01"])
```
```
(2.0, 0.20000000000000004)
```

### Logistic

```julia
mean(pce_logistic, myops["logistic"]), std(pce_logistic, myops["logistic"])
```
```
(2.0, 0.2)
```

## Sampling

Having found the PCE coefficients, it may be useful to sample the random variables. That is, find $N$ realizations of the random variable that obey the random variable's probability density function. This is done in two steps:

1. Draw $N$ samples from the measure (`sampleMeasure()`), and then
2. evaluate the basis polynomials and multiply by the PCE coefficients, i.e. $\sum_{i=0}^L x_i \phi_i(\xi_j)$, where $\xi_j$ is the $j$-th sample from the measure (`evaluatePCE()`).

Both steps are combined in the function `samplePCE()`.

### Gaussian

```julia
using Statistics
N = 1000
ξ_gaussian = sampleMeasure(N, myops["gaussian"])
samples_gaussian = evaluatePCE(pce_gaussian, ξ_gaussian, myops["gaussian"])
# samplePCE(N, pce_gaussian, myops["gaussian"])
```
```
1000-element Vector{Float64}:
1.7960418210346882
1.615078421636618
1.7599367197442775
2.290587417689159
2.1146091502016877
1.973940989117719
1.8835359556079259
1.888624060814861
1.8200171157338452
2.095642833326146
⋮
1.705037131820813
2.122829319132395
2.1459060494892856
1.8080823191947988
1.7648788461476501
2.085769268254475
1.7380606084129966
2.1633634217330795
1.9378849679691723
```

### Uniform

```julia
ξ_uniform = sampleMeasure(N, myops["uniform01"])
samples_uniform = evaluatePCE(pce_uniform, ξ_uniform, myops["uniform01"])
# samples_uniform = samplePCE(N, pce_uniform, myops["uniform01"])
```
```
1000-element Vector{Float64}:
2.067015922627985
2.211347063267529
1.7599259507648688
1.655387824535454
1.7951527954420956
1.6899548067036345
1.762633777103371
2.1169801446618988
1.7573307732529735
2.168284217018228
⋮
2.1845287382692615
2.1102028928376027
1.6598105525266418
2.2080975649251045
1.6826425685619388
2.188788768026447
2.0114114381112507
2.2629019074269663
1.7856439524849248
```

### Beta

```julia
ξ_beta = sampleMeasure(N, myops["beta01"])
samples_beta = evaluatePCE(pce_beta, ξ_beta, myops["beta01"])
# samples_beta = samplePCE(N, pce_beta, myops["beta01"])
```
```
1000-element Vector{Float64}:
1.8122355699913795
1.7920636855504835
2.077577632994394
2.028410533141458
1.8423299787474212
2.2151561771232724
2.1601219143057624
1.9575198021075237
2.1959615888814694
2.340581618120236
⋮
1.9520017963032068
2.0334040554161326
2.142535205612148
1.9145993994817565
1.893807443698718
2.2014674947168644
2.0106881010334274
1.8261436973598455
1.790383522936499
```

### Logistic

```julia
ξ_logistic = sampleMeasure(N, myops["logistic"])
samples_logistic = evaluatePCE(pce_logistic, ξ_logistic, myops["logistic"])
# samples_logistic = samplePCE(N, pce_logistic, myops["logistic"])
```
```
1000-element Vector{Float64}:
2.4277119954005375
2.6563840261119056
2.1150679846802447
1.7189226162119997
2.1619471597734083
1.8885400905283245
1.9780205382938547
2.2629454508852103
1.9983839293642227
2.1313183654306793
⋮
1.987486106367159
2.0826107947686703
1.7740664542811901
2.0706548561089977
2.179930719911253
2.0402809114877103
1.8790863125413741
1.9868928903023897
2.321375334479815
```