Introduction to Machine Learning (张志华): Principal Component Analysis
Preface
These are study notes for Professor 张志华's course at Peking University. The concepts are explained in plain terms, which makes the notes a good foundation for mastering the basics and, from there, the related techniques.
Basic Concepts
$\exp(-t z^{1/2}) = \int \exp(-tuz)\, dF(u)$
Let $z = \|x\|^2$; then $\exp(-t\|x\|)$ is a valid kernel.
The product of positive definite (P.D.) kernels is again P.D.
The Euclidean distance carries over to the feature space: we can compute
$\|\phi(x)-\phi(y)\|_2^2$
without evaluating $\phi$ explicitly, using only the kernel.
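A minimal numerical sketch of this point (the Gaussian/RBF kernel and the test vectors are made-up examples): the squared feature-space distance expands to $k(x,x) - 2k(x,y) + k(y,y)$, so it needs only kernel evaluations.

```python
import numpy as np

def rbf_kernel(a, b, t=1.0):
    """Gaussian (RBF) kernel k(a, b) = exp(-t * ||a - b||^2), a P.D. kernel."""
    return np.exp(-t * np.sum((a - b) ** 2))

def feature_space_dist_sq(x, y, t=1.0):
    """||phi(x) - phi(y)||^2 via the kernel trick: k(x,x) - 2 k(x,y) + k(y,y)."""
    return rbf_kernel(x, x, t) - 2.0 * rbf_kernel(x, y, t) + rbf_kernel(y, y, t)

x = np.array([1.0, 2.0])
y = np.array([2.0, 0.0])
print(feature_space_dist_sq(x, y))  # lies in [0, 2] for the RBF kernel
```

Since $k(x,x)=1$ for the RBF kernel, the distance here is $2 - 2\exp(-t\|x-y\|^2)$.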
Part 2: Unsupervised Learning
Dimensionality reduction.
PCA (Principal Component Analysis)
Population PCA
Def. If $\bar x \in \mathbb{R}^p$ is a random vector with mean $\mu$ and covariance matrix $\Sigma$,
then the PCA transformation is
$\bar x \to \bar y = U^T(\bar x - \mu)$
where $U$ is orthogonal.
Spectral Decomposition: $\Sigma = U \Lambda U^T$, with $\Lambda = \mathrm{diag}(\lambda_1, \dots, \lambda_p)$.
Thm. If $x \sim N(\mu, \Sigma)$, then:
(1) $y \sim N(0, \Lambda)$
(2) $E(y_i) = 0$
(3) $\mathrm{Cov}(y_i, y_j) = 0$ for $i \neq j$
(4) $y$ is an orthogonal transform of $x$ whose components are uncorrelated
(5) $\mathrm{Var}(y_i) = \lambda_i$
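A minimal sketch of the theorem (the mean, covariance, and sample size are made-up values): transforming draws of $x \sim N(\mu, \Sigma)$ by the eigenvectors of $\Sigma$ yields components that are uncorrelated, with variances equal to the eigenvalues.

```python
import numpy as np

# If x ~ N(mu, Sigma) and Sigma = U diag(lambda) U^T, then y = U^T (x - mu)
# has uncorrelated components with Var(y_i) = lambda_i.
rng = np.random.default_rng(0)
mu = np.array([1.0, -2.0])
Sigma = np.array([[4.0, 1.5],
                  [1.5, 1.0]])

lam, U = np.linalg.eigh(Sigma)              # spectral decomposition of Sigma
X = rng.multivariate_normal(mu, Sigma, size=200_000)
Y = (X - mu) @ U                            # y = U^T (x - mu), applied row-wise

C = np.cov(Y, rowvar=False)
print(np.round(C, 2))    # approximately diag(lambda): off-diagonals near 0
print(np.round(lam, 2))  # the eigenvalues of Sigma
```

The sample covariance of `Y` is nearly diagonal, and its diagonal matches `lam` up to sampling noise.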
Sample Principal Component
Let $X = [\bar x_1, \dots, \bar x_n]^T$ be an $n \times p$ sample data matrix, with sample mean
$\bar x = \frac{1}{n} \sum_{i=1}^n \bar x_i$
and sample covariance
$S = \frac{1}{n} X^T H X$
where $H = I_n - \frac{1}{n} \mathbf{1}_n \mathbf{1}_n^T$ is the centering matrix.
To reduce the data to $k$ dimensions, take the first $k$ principal components, i.e. the eigenvectors of $S$ with the largest eigenvalues; this keeps most of the information (variance), which is the point of PCA.
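A minimal sketch of sample PCA via the centering matrix (the data and dimensions are made-up): $S = \frac{1}{n} X^T H X$ agrees with the usual sample covariance, and projecting onto the top-$k$ eigenvectors gives the reduced data.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, k = 100, 5, 2
X = rng.normal(size=(n, p))

H = np.eye(n) - np.ones((n, n)) / n      # centering matrix, H H = H
S = X.T @ H @ X / n                      # sample covariance (1/n convention)

assert np.allclose(H @ H, H)             # H is idempotent
assert np.allclose(S, np.cov(X, rowvar=False, bias=True))

# First k principal components: top-k eigenvectors of S.
lam, U = np.linalg.eigh(S)               # eigh returns ascending eigenvalues
U_k = U[:, ::-1][:, :k]                  # largest-eigenvalue vectors first
Y = (X - X.mean(axis=0)) @ U_k           # n x k reduced data
print(Y.shape)  # (100, 2)
```

`bias=True` in `np.cov` selects the $1/n$ normalization used here; with `bias=False` the factor would be $1/(n-1)$.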
SVD
For $A = U D V^T$:
$U$ = eigenvectors of $A A^T$
$D$ = diagonal matrix of singular values, the square roots of the non-zero eigenvalues shared by $A A^T$ and $A^T A$
$V$ = eigenvectors of $A^T A$
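A quick numerical check of these relations (the matrix is a made-up random example): the squared singular values of $A$ equal the non-zero eigenvalues of both $A^T A$ and $A A^T$.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.normal(size=(4, 3))

# Thin SVD: A = U diag(d) V^T, with d sorted in descending order.
U, d, Vt = np.linalg.svd(A, full_matrices=False)

eig_AAT = np.sort(np.linalg.eigvalsh(A @ A.T))[::-1]  # eigenvalues of A A^T
eig_ATA = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]  # eigenvalues of A^T A

print(np.allclose(d ** 2, eig_ATA))      # d_i^2 matches eigenvalues of A^T A
print(np.allclose(d ** 2, eig_AAT[:3]))  # and the non-zero eigenvalues of A A^T
```

$A A^T$ is $4 \times 4$ but has rank 3 here, so its fourth eigenvalue is (numerically) zero; the other three match $d^2$.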
PCO (Principal Coordinate Analysis)
$S = X^T H X$ (the $p \times p$ scatter matrix)
$H$ is idempotent: $HH = H$
$B = H X X^T H$ (the $n \times n$ centered Gram matrix)
Fact: for conformable matrices $M$ and $N$, $MN$ and $NM$ have the same non-zero eigenvalues.
Taking $M = X^T H$ and $N = H X$, it follows that the non-zero eigenvalues of $S$ and $B$ are equal.
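A minimal numerical sketch of this fact (the data and dimensions are made-up): the non-zero eigenvalues of $S = X^T H X$ and $B = H X X^T H$ coincide, which is what lets PCO work with the $n \times n$ Gram matrix instead of the $p \times p$ scatter matrix.

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 6, 3
X = rng.normal(size=(n, p))

H = np.eye(n) - np.ones((n, n)) / n      # centering matrix
S = X.T @ H @ X                          # p x p scatter matrix
B = H @ X @ X.T @ H                      # n x n centered Gram matrix

eS = np.sort(np.linalg.eigvalsh(S))[::-1]
eB = np.sort(np.linalg.eigvalsh(B))[::-1]

# The top p eigenvalues agree; B's remaining eigenvalues are (numerically) zero.
print(np.allclose(eS[:p], eB[:p]))
```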