Published date: 2014-07-16    Provided by: School of Science
 
Title: Convex Optimization Learning of Faithful Euclidean Distances in Nonlinear Dimensionality Reduction
Guest Speaker: Professor Qi Houduo, University of Southampton
Time: 2014-07-04, 9:30 – 10:30
Location: SY102
Content & Introduction
It is known that classical multidimensional scaling, as a dimensionality reduction method, works well only when the noisy distances observed in a high-dimensional space can be faithfully represented by Euclidean distances in a low-dimensional space. Advanced models such as Maximum Variance Unfolding (MVU) and Minimum Volume Embedding (MVE) take advantage of Semi-Definite Programming (SDP) to reconstruct such faithful representations. While those SDP models are capable of producing high-quality configurations numerically, they suffer from two major drawbacks. One is that there exist no theoretically guaranteed bounds on the quality of the configuration. The other is that they are slow in computation when the number of data points exceeds a thousand. In this paper, we propose a convex optimization model over the Euclidean distance matrix instead of the positive semidefinite matrix used in SDP models. We establish a non-asymptotic reconstruction error bound for the random graph model with sub-exponential noise. We prove that, up to a logarithmic factor, our estimator achieves the optimal rate with respect to the reconstruction error. Moreover, we develop a fast inexact accelerated proximal point method for the model. Numerical experiments show that the model can produce high-quality configurations on large datasets with which the SDP approach would struggle to cope.
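For readers unfamiliar with the baseline the talk starts from, the classical multidimensional scaling procedure (double-centering the squared-distance matrix and taking the top eigenpairs) can be sketched as follows. This is a minimal illustrative implementation, not the speaker's proposed method; the function name `classical_mds` and the toy data are assumptions for the example.

```python
import numpy as np

def classical_mds(D, dim):
    """Classical MDS sketch: embed n points in `dim` dimensions from an
    (n, n) matrix D of squared Euclidean distances."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    B = -0.5 * J @ D @ J                     # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(B)           # eigenvalues in ascending order
    idx = np.argsort(vals)[::-1][:dim]       # keep the top-`dim` eigenpairs
    scale = np.sqrt(np.maximum(vals[idx], 0.0))  # clip tiny negatives (noise)
    return vecs[:, idx] * scale              # configuration X with B ~ X X^T

# Noise-free sanity check: distances of a 2-D configuration are recovered
# exactly (up to rotation and translation).
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
D = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # squared distances
Y = classical_mds(D, 2)
D_rec = ((Y[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
print(np.allclose(D, D_rec))  # True: faithful Euclidean data is reproduced
```

When the observed distances are noisy or non-Euclidean, this exact recovery fails, which is precisely the situation the SDP models and the proposed EDM-based convex model address.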