## Note on probability and statistics

### Some definitions

Variance $Var(X)=D(X)=E((X-E(X))^2)=E(X^2)-E^2(X)$

Covariance $Cov(X,Y)=E((X-E(X))(Y-E(Y)))=E(XY)-E(X)E(Y)$
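Both identities are easy to sanity-check numerically. A minimal sketch, using a small joint distribution made up purely for illustration:

```python
# A made-up joint pmf over (X, Y) to check the two identities numerically.
pmf = {(0, 0): 0.1, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.4}

def E(f):
    """Expectation of f(x, y) under the joint pmf."""
    return sum(p * f(x, y) for (x, y), p in pmf.items())

EX = E(lambda x, y: x)
EY = E(lambda x, y: y)

var_def = E(lambda x, y: (x - EX) ** 2)        # E((X - E(X))^2)
var_alt = E(lambda x, y: x * x) - EX ** 2      # E(X^2) - E^2(X)

cov_def = E(lambda x, y: (x - EX) * (y - EY))  # E((X - E(X))(Y - E(Y)))
cov_alt = E(lambda x, y: x * y) - EX * EY      # E(XY) - E(X)E(Y)

print(abs(var_def - var_alt) < 1e-12, abs(cov_def - cov_alt) < 1e-12)
```

The two forms agree up to floating-point error, as the algebra predicts.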

Team Aftermath is off to compete!

## Learn Tarjan the hard way

#### Conventions and notation

$G=(V,E)$ is a directed graph

The spanning tree produced by a DFS of $G$ is denoted $T$

An edge $(v,w)$ of $T$ is written $v\rightarrow w$

A path from $v$ to $w$ in $T$ is written $v\overset{*}{\rightarrow}w$

$T_{v}$ is the subtree of $T$ rooted at $v$, containing $v$ and all of its descendants (including $v$ itself)
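With this notation in place, Tarjan's strongly-connected-components algorithm can be sketched. This is a minimal recursive Python version for illustration, not the original memory-tight formulation; the function name `tarjan_scc` and the adjacency-dict input format are my own choices:

```python
def tarjan_scc(graph):
    """Tarjan's SCC on a directed graph given as {v: [w, ...]}.
    Returns a list of components, each a list of vertices."""
    index = {}        # DFS discovery order of each vertex
    low = {}          # low[v]: smallest index reachable from T_v using at most one non-tree edge
    on_stack = set()
    stack, sccs = [], []
    counter = 0

    def dfs(v):
        nonlocal counter
        index[v] = low[v] = counter
        counter += 1
        stack.append(v)
        on_stack.add(v)
        for w in graph.get(v, []):
            if w not in index:           # tree edge v -> w
                dfs(w)
                low[v] = min(low[v], low[w])
            elif w in on_stack:          # back/cross edge into the current stack
                low[v] = min(low[v], index[w])
        if low[v] == index[v]:           # v is the root of an SCC
            comp = []
            while True:
                w = stack.pop()
                on_stack.discard(w)
                comp.append(w)
                if w == v:
                    break
            sccs.append(comp)

    for v in list(graph):
        if v not in index:
            dfs(v)
    return sccs
```

The invariant to keep in mind: a vertex $v$ is the root of an SCC exactly when no vertex in $T_v$ can reach anything discovered before $v$ that is still on the stack.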

## What is KMP?

#### Prelude

• prefix[S][l] is the prefix of $S$ of length $l$
• suffix[S][l] is the suffix of $S$ of length $l$

#### Kong Yiji: do you know the four ways to write the next array?

• What does next[i] mean?
• $nx[i]=\max(l)$, where $l$ satisfies $prefix[S][l]=suffix[prefix[S][i]][l]$ and $l<i$
• So how do we write it? Let me think it over…
• Or just grab a template off the internet; as long as the problem passes, who cares QAQ
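For the record, the template everyone copies computes exactly the $nx$ array defined above. A minimal sketch (the name `build_next` is my own):

```python
def build_next(s):
    """nx[i] = length of the longest l < i with
    prefix[s][l] == suffix[prefix[s][i]][l], i.e. the longest
    proper border of the prefix of s of length i."""
    nx = [0] * (len(s) + 1)
    k = 0  # current border length of s[:i]
    for i in range(1, len(s)):
        while k > 0 and s[i] != s[k]:
            k = nx[k]          # fall back to the next-shorter border
        if s[i] == s[k]:
            k += 1
        nx[i + 1] = k
    return nx

print(build_next("ababc"))
```

For `"ababc"` this gives `[0, 0, 0, 1, 2, 0]`: the prefix `abab` (length 4) has border `ab` of length 2, and so on.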

## Number Theory Review

$\text{Axioms:}\\ \left.\begin{matrix} g\mid x\\ g\mid y\end{matrix}\right\}\Rightarrow \left\{\begin{matrix} g\mid x\pm y\\ g\mid xy \end{matrix}\right.\qquad \left.\begin{matrix} g\mid x\\ h\mid x\\ \gcd(g,h)=1 \end{matrix}\right\}\Rightarrow gh\mid x$
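Both axioms are easy to spot-check by brute force; the ranges below are arbitrary:

```python
from math import gcd

def divides(a, b):
    """True iff a | b."""
    return b % a == 0

# g|x and g|y imply g|(x ± y) and g|xy: check over multiples of g.
ok1 = all(
    divides(g, x + y) and divides(g, abs(x - y)) and divides(g, x * y)
    for g in range(1, 10)
    for x in range(g, 60, g)
    for y in range(g, 60, g)
)

# g|x, h|x, gcd(g, h) = 1 imply gh|x.
ok2 = all(
    divides(g * h, x)
    for g in range(1, 12)
    for h in range(1, 12) if gcd(g, h) == 1
    for x in range(1, 200) if divides(g, x) and divides(h, x)
)

print(ok1 and ok2)
```

The coprimality condition in the second axiom is essential: $2\mid 4$ and $4\mid 4$, but $8\nmid 4$.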

## Properties of Entropy

#### Shannon’s theory

• Theorem: $H(X,Y)\leq H(X)+H(Y)$. Consider
$\begin{aligned} H(X)+H(Y)-H(X,Y) &=\sum_{x_{i}}Pr(x_{i})\,lb\frac{1}{Pr(x_{i})}+\sum_{y_{j}}Pr(y_{j})\,lb\frac{1}{Pr(y_{j})}-H(X,Y)\\ &=\sum_{x_{i}}lb\frac{1}{Pr(x_{i})}\sum_{y_{j}}Pr(x_{i},y_{j})+\sum_{y_{j}}lb\frac{1}{Pr(y_{j})}\sum_{x_{i}}Pr(x_{i},y_{j})-H(X,Y)\\ &=\sum_{x_{i},y_{j}}Pr(x_{i},y_{j})\,lb\frac{1}{Pr(x_{i})Pr(y_{j})}-\sum_{x_{i},y_{j}}Pr(x_{i},y_{j})\,lb\frac{1}{Pr(x_{i},y_{j})}\\ &=\sum_{x_{i},y_{j}}Pr(x_{i},y_{j})\,lb\frac{Pr(x_{i},y_{j})}{Pr(x_{i})Pr(y_{j})}\\ &=-\sum_{x_{i},y_{j}}Pr(x_{i},y_{j})\,lb\frac{Pr(x_{i})Pr(y_{j})}{Pr(x_{i},y_{j})}\\ &\geq -lb\sum_{x_{i},y_{j}}Pr(x_{i})Pr(y_{j})=-lb\,1=0, \end{aligned}$
where the last step applies Jensen's inequality to the concave function $lb$.
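The theorem can be verified numerically on a small joint distribution (the pmf below is made up for illustration; $lb$ is the base-2 logarithm):

```python
from math import log2

# A made-up joint pmf over (X, Y); X and Y are deliberately dependent.
pmf = {('a', 0): 0.25, ('a', 1): 0.25, ('b', 0): 0.4, ('b', 1): 0.1}

def H(dist):
    """Shannon entropy in bits of a pmf given as {outcome: probability}."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Marginal distributions of X and Y.
px, py = {}, {}
for (x, y), p in pmf.items():
    px[x] = px.get(x, 0.0) + p
    py[y] = py.get(y, 0.0) + p

Hxy = H(pmf)
print(Hxy, H(px) + H(py))
```

Since $X$ and $Y$ are not independent here, the inequality is strict: the gap $H(X)+H(Y)-H(X,Y)$ is exactly the mutual information.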

## Good Will Hunting

As the Chinese edition of the title (心灵捕手, "mind catcher") suggests, Sean, a senior psychology professor, not only hunts for the good will but also unties the knots in both Will's heart and his own. After watching the movie for the third time, some fantastic lines and dialogues have lingered in my mind for so long that I have chewed them over and over without weariness.

Will's marvelous talents in history, mathematics, literature, and even organic chemistry can certainly never be denied. He defeats MIT professors at mathematical proofs without any strenuous effort. He can quote large quantities of authentic literature to argue with Harvard students and ridicule them with the line, "The education you got for fifteen thousand is accessible to anyone who pays one point five to go to the public library."

## FFT

$f(x)=a_{0}x^{0}+a_{1}x^{1}+...+a_{n-1}x^{n-1}$

$g(x)=b_{0}x^{0}+b_{1}x^{1}+...+b_{n-1}x^{n-1}$

Goal: compute the coefficients of the product $f(x)g(x)$.
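The FFT evaluates both polynomials at the $n$-th roots of unity, multiplies pointwise, and interpolates back, giving the product in $O(n\log n)$ instead of the naive $O(n^2)$. A minimal recursive radix-2 sketch (function names are my own):

```python
import cmath

def fft(a, invert=False):
    """Recursive radix-2 FFT; len(a) must be a power of two.
    invert=True computes the un-normalized inverse transform."""
    n = len(a)
    if n == 1:
        return a[:]
    even = fft(a[0::2], invert)
    odd = fft(a[1::2], invert)
    sign = -1 if invert else 1
    out = [0j] * n
    for k in range(n // 2):
        w = cmath.exp(sign * 2j * cmath.pi * k / n)  # n-th root of unity
        out[k] = even[k] + w * odd[k]
        out[k + n // 2] = even[k] - w * odd[k]
    return out

def poly_multiply(f, g):
    """Coefficients of f(x) * g(x), lowest degree first."""
    n = 1
    while n < len(f) + len(g) - 1:   # pad to a power of two
        n *= 2
    fa = fft([complex(c) for c in f] + [0j] * (n - len(f)))
    fb = fft([complex(c) for c in g] + [0j] * (n - len(g)))
    prod = fft([x * y for x, y in zip(fa, fb)], invert=True)
    # Divide by n to finish the inverse transform; round off float noise.
    return [round(c.real / n) for c in prod[:len(f) + len(g) - 1]]

print(poly_multiply([1, 2], [3, 4]))
```

For example, $(1+2x)(3+4x)=3+10x+8x^2$, so the call above yields `[3, 10, 8]`.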

## Möbius inversion

$\mu(d)=\left\{\begin{matrix}1 & d=1\\ (-1)^{k} & d=p_{1}p_{2}\cdots p_{k},\ p_{i}\text{ distinct primes}\\ 0 & \text{otherwise} \end{matrix}\right.$
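All values $\mu(d)$ for $d\le n$ can be computed with a linear sieve; the sketch below (function name is my own) also checks the defining identity $\sum_{d\mid n}\mu(d)=[n=1]$:

```python
def mobius_sieve(n):
    """mu[d] for d in 0..n via a linear (Euler) sieve."""
    mu = [0] * (n + 1)
    mu[1] = 1
    primes = []
    is_comp = [False] * (n + 1)
    for i in range(2, n + 1):
        if not is_comp[i]:
            primes.append(i)
            mu[i] = -1                 # a single prime: (-1)^1
        for p in primes:
            if i * p > n:
                break
            is_comp[i * p] = True
            if i % p == 0:
                mu[i * p] = 0          # p^2 divides i*p, so mu = 0
                break
            mu[i * p] = -mu[i]         # one more distinct prime factor
    return mu

mu = mobius_sieve(100)
# Check: sum over divisors d of n of mu(d) is 1 for n = 1, else 0.
for n in range(1, 101):
    s = sum(mu[d] for d in range(1, n + 1) if n % d == 0)
    assert s == (1 if n == 1 else 0)
print("identity holds up to 100")
```

This divisor-sum identity is exactly what makes Möbius inversion work.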