
Probabilistic Principal Component Analysis

Michael E. Tipping and Christopher M. Bishop

Abstract
Principal component analysis (PCA) is a ubiquitous technique for data analysis and processing, but one which is not based upon a probability model. In this paper we demonstrate how the principal axes of a set of observed data vectors may be determined through maximum-likelihood estimation of parameters in a latent variable model closely related to factor analysis. We consider the properties of the associated likelihood function, giving an EM algorithm for estimating the principal subspace iteratively, and discuss, with illustrative examples, the advantages conveyed by this probabilistic approach to PCA.
Keywords: Principal component analysis; probability model; density estimation; maximum-likelihood; EM algorithm; Gaussian mixtures.
Published as: “Probabilistic Principal Component Analysis”, Journal of the Royal Statistical Society, Series B, 61, Part 3, pp. 611–622.
The sample covariance matrix is $S = \frac{1}{N}\sum_{n=1}^{N}(t_n - \bar{t})(t_n - \bar{t})^T$, where $\bar{t}$ is the data sample mean, such that $S w_j = \lambda_j w_j$. The $q$ principal components of the observed vector $t_n$ are given by the vector $x_n = W^T(t_n - \bar{t})$, where $W = (w_1, \ldots, w_q)$. The principal component projection minimises the squared reconstruction error $\sum_n \|t_n - \hat{t}_n\|^2$, where the optimal linear reconstruction of $t_n$ is given by $\hat{t}_n = W x_n + \bar{t}$.
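To make this projection and reconstruction concrete, here is a minimal NumPy sketch (ours, not the paper's; the synthetic data, the choice $q = 3$, and all variable names are our own):

```python
import numpy as np

rng = np.random.default_rng(0)
T = rng.normal(size=(100, 5))      # N = 100 synthetic observations t_n in d = 5 dimensions
q = 3                              # number of retained principal components (arbitrary)

t_bar = T.mean(axis=0)             # data sample mean
Tc = T - t_bar
S = Tc.T @ Tc / len(T)             # sample covariance matrix S

# The columns of W are the eigenvectors w_j of S with the q largest eigenvalues.
evals, evecs = np.linalg.eigh(S)
W = evecs[:, np.argsort(evals)[::-1][:q]]

X = Tc @ W                         # principal components x_n = W^T (t_n - t_bar)
T_hat = X @ W.T + t_bar            # optimal linear reconstruction t_hat_n = W x_n + t_bar
err = np.sum((T - T_hat) ** 2)     # squared reconstruction error, minimised by this choice of W
```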
The log-likelihood of observing the data under this model is

$$\mathcal{L} = -\frac{N}{2}\left\{ d\ln(2\pi) + \ln|C| + \mathrm{tr}\left(C^{-1}S\right) \right\} \qquad (4)$$

where $C = WW^T + \sigma^2 I$ is the model covariance.
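As a small illustration (our own sketch, not code from the paper), the log-likelihood (4) can be evaluated directly for candidate parameters $W$ and $\sigma^2$:

```python
import numpy as np

def ppca_log_likelihood(S, W, sigma2, N):
    """Evaluate the log-likelihood (4) with C = W W^T + sigma^2 I."""
    d = S.shape[0]
    C = W @ W.T + sigma2 * np.eye(d)
    _, logdet_C = np.linalg.slogdet(C)            # ln|C|, computed stably
    trace_term = np.trace(np.linalg.solve(C, S))  # tr(C^{-1} S) without forming C^{-1}
    return -0.5 * N * (d * np.log(2 * np.pi) + logdet_C + trace_term)
```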
The maximum-likelihood estimate of the noise variance, in terms of the $d - q$ smallest eigenvalues $\lambda_{q+1}, \ldots, \lambda_d$ of $S$, is

$$\sigma^2_{\mathrm{ML}} = \frac{1}{d-q}\sum_{j=q+1}^{d}\lambda_j \qquad (8)$$

which has a clear interpretation as the variance ‘lost’ in the projection, averaged over the lost dimensions.
In practice, to find the most likely model given $S$, we would first estimate $\sigma^2_{\mathrm{ML}}$ from (8), and then $W_{\mathrm{ML}}$ from (7), where for simplicity we would effectively ignore $R$ (i.e. choose $R = I$). Alternatively, we might employ the EM algorithm detailed in Appendix B, where $R$ at convergence can be considered arbitrary.
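This closed-form route can be sketched as follows (our own code; equation (7) is not reproduced in this excerpt, so we take it to be the paper's $W_{\mathrm{ML}} = U_q(\Lambda_q - \sigma^2_{\mathrm{ML}} I)^{1/2} R$, with $U_q$ the leading eigenvectors of $S$, $\Lambda_q$ the corresponding eigenvalues, and $R = I$ as the text suggests):

```python
import numpy as np

def ppca_ml_fit(T, q):
    """Closed-form ML fit: sigma^2_ML from (8), then W_ML from (7) with R = I."""
    N, d = T.shape
    mu = T.mean(axis=0)
    Tc = T - mu
    S = Tc.T @ Tc / N                      # sample covariance matrix S

    evals, evecs = np.linalg.eigh(S)       # eigenvalues in ascending order
    order = np.argsort(evals)[::-1]        # re-sort to descending
    evals, evecs = evals[order], evecs[:, order]

    sigma2_ml = evals[q:].mean()           # (8): average of the d - q discarded eigenvalues
    Uq, Lq = evecs[:, :q], evals[:q]
    W_ml = Uq * np.sqrt(np.maximum(Lq - sigma2_ml, 0.0))  # (7) with R = I
    return W_ml, sigma2_ml, mu
```

The EM alternative would instead iterate expectations over the latent variables; at convergence it spans the same principal subspace, with $R$ arbitrary.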
3.3 Factor Analysis Revisited
Although the above estimators result from application of a simple constraint to the standard factor analysis model, we note that an important distinction resulting from the use of the isotropic noise covariance $\sigma^2 I$ is that PPCA is covariant under rotation of the original data axes, as is standard PCA, while factor analysis is covariant under component-wise rescaling. Another point of contrast is that in factor analysis, neither of the factors found by a two-factor model is necessarily the same as that found by a single-factor model. In probabilistic PCA, we see above that the principal axes may be found incrementally.
3.4 Dimensionality Reduction
The general motivation for PCA is to transform the data into some reduced-dimensionality representation, and with some minor algebraic manipulation of $W_{\mathrm{ML}}$, we may indeed obtain the standard projection onto the principal axes if desired. However, it is more natural from a probabilistic perspective to consider the dimensionality-reduction process in terms of the distribution of the latent variables, conditioned on the observation. From (6), this distribution may be conveniently summarised by its mean:
$$\langle x_n \rangle = M^{-1} W_{\mathrm{ML}}^T (t_n - \mu) \qquad (9)$$

where $M = W_{\mathrm{ML}}^T W_{\mathrm{ML}} + \sigma^2 I$. (Note, also from (6), that the corresponding conditional covariance is given by $\sigma^2_{\mathrm{ML}} M^{-1}$ and is thus independent of $n$.) It can be seen that when $\sigma^2 \to 0$, $M^{-1} \to (W_{\mathrm{ML}}^T W_{\mathrm{ML}})^{-1}$ and (9) then represents an orthogonal projection into latent space, and so standard PCA is recovered. However, the density model then becomes singular, and thus undefined. In practice, with $\sigma^2 > 0$ as determined by (8), the latent projection becomes skewed towards the origin as a result of the Gaussian marginal distribution for $x$. Because of this, the reconstruction $W_{\mathrm{ML}}\langle x_n \rangle + \mu$ is not an orthogonal projection of $t_n$, and is therefore not optimal (in the squared reconstruction-error sense). Nevertheless, optimal reconstruction of the observed data from the conditional latent mean may still be obtained in the case of $\sigma^2 > 0$, and is given by $\hat{t}_n = W_{\mathrm{ML}}(W_{\mathrm{ML}}^T W_{\mathrm{ML}})^{-1} M \langle x_n \rangle + \mu$.
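The latent projection (9) and the subsequent optimal reconstruction can be sketched as below (again our own code, with `W`, `sigma2`, and `mu` assumed to come from a maximum-likelihood fit such as the one sketched earlier):

```python
import numpy as np

def posterior_latent_mean(T, W, sigma2, mu):
    """(9): <x_n> = M^{-1} W^T (t_n - mu), with M = W^T W + sigma^2 I."""
    q = W.shape[1]
    M = W.T @ W + sigma2 * np.eye(q)
    return np.linalg.solve(M, W.T @ (T - mu).T).T      # one row per <x_n>

def optimal_reconstruction(X, W, sigma2, mu):
    """Optimal reconstruction t_hat_n = W (W^T W)^{-1} M <x_n> + mu (sigma^2 > 0)."""
    q = W.shape[1]
    M = W.T @ W + sigma2 * np.eye(q)
    A = W @ np.linalg.solve(W.T @ W, M)                # W (W^T W)^{-1} M
    return X @ A.T + mu
```

Note that for $\sigma^2 > 0$ this reconstruction is not simply $W\langle x_n \rangle + \mu$: the extra factor $(W^T W)^{-1} M$ exactly undoes the shrinkage of the latent posterior mean towards the origin, recovering the orthogonal projection onto the column space of $W$.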
