Topic: Randomized Algorithms in Dimensionality Reduction
Speaker: Prof. Jianzhong Wang (Sam Houston State University)
Time: 2011-06-03 14:00-15:00
Venue: Room 1114, Science Building No. 1 (Institute of Mathematics event)
  
Dimensionality reduction (DR) is a useful tool in machine learning and compressive sensing. In data processing, when the dimension of the data is very high, we encounter the so-called curse of dimensionality, so that most existing data processing systems cannot handle the data. The role of dimensionality reduction is to transform high-dimensional data into low-dimensional data.
  
The geometric (or spectral) approach to DR is based on manifold learning, in which high-dimensional data are modeled as a sample set on a low-dimensional manifold. A DR kernel is then constructed to characterize the geometry of the underlying manifold, and DR is realized by the spectral decomposition of the kernel. Many existing DR methods, such as Isomap, Maximum Variance Unfolding (MVU), Locally Linear Embedding (LLE), Local Tangent Space Alignment (LTSA), Hessian Locally Linear Embedding (HLLE), Laplacian Eigenmaps (LEigs), and Diffusion Maps (DMaps), adopt this approach.
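To make the spectral recipe concrete, here is a minimal NumPy sketch of one method from the list above, Laplacian Eigenmaps: build a neighborhood kernel, form the graph Laplacian, and take its bottom eigenvectors as the embedding. This is an illustrative simplification, not code from the talk; the bandwidth choice and brute-force neighbor search are assumptions for brevity.

```python
import numpy as np

def laplacian_eigenmaps(X, n_components=2, n_neighbors=5):
    """Embed rows of X via spectral decomposition of a graph-Laplacian kernel."""
    n = X.shape[0]
    # Pairwise squared distances (brute force; fine for a small demo).
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    # Heat-kernel weights; median distance as a crude bandwidth (assumption).
    W = np.exp(-d2 / np.median(d2))
    # Keep only each point's k nearest neighbors, then symmetrize.
    far = np.argsort(d2, axis=1)[:, n_neighbors + 1:]
    for i in range(n):
        W[i, far[i]] = 0.0
    W = np.maximum(W, W.T)
    L = np.diag(W.sum(axis=1)) - W        # unnormalized graph Laplacian
    # Smallest eigenvectors of L (skipping the constant one) give the embedding.
    vals, vecs = np.linalg.eigh(L)
    return vecs[:, 1:n_components + 1]

# Noisy circle embedded in 100 dimensions: intrinsic dimension is 1.
rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 60, endpoint=False)
X = np.zeros((60, 100))
X[:, 0], X[:, 1] = np.cos(t), np.sin(t)
X += 0.01 * rng.standard_normal(X.shape)
Y = laplacian_eigenmaps(X, n_components=2)
print(Y.shape)  # (60, 2)
```

Note that the eigendecomposition here acts on the full n-by-n Laplacian, which is exactly the cost that the randomized techniques discussed below aim to reduce.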
  
However, these methods are limited by the cardinality of the sample set. In this presentation, we introduce randomized algorithms to deal with large-size high-dimensional data. Random projection and the randomized Nyström extension are applied to significantly reduce the size of the kernel, so that fast randomized algorithms can be developed.
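The two randomized ingredients can be sketched as follows: a Gaussian random projection shrinks the ambient dimension while nearly preserving pairwise distances (Johnson-Lindenstrauss), and the Nyström extension approximates the n-by-n kernel from a few randomly sampled landmark columns, leaving only a small eigenproblem. The kernel, bandwidth, and sizes below are illustrative assumptions, not the speaker's specific construction.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data: n points in ambient dimension D.
n, D = 400, 1000
X = rng.standard_normal((n, D))

# 1) Random projection: a scaled Gaussian map to d << D approximately
#    preserves pairwise distances, cutting the cost of kernel evaluation.
d = 50
R = rng.standard_normal((D, d)) / np.sqrt(d)
Xp = X @ R                                   # n x d projected data

# 2) Randomized Nystrom extension: approximate the n x n Gaussian kernel
#    from m randomly sampled landmark columns.
def gauss_kernel(A, B, s2):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / s2)

s2 = 2.0 * d                                 # crude bandwidth (assumption)
m = 40
idx = rng.choice(n, size=m, replace=False)
C = gauss_kernel(Xp, Xp[idx], s2)            # n x m sampled columns
W = C[idx]                                   # m x m landmark block
K_approx = C @ np.linalg.pinv(W) @ C.T       # rank <= m kernel approximation

# The spectral step now costs an m x m eigenproblem instead of n x n.
vals, U = np.linalg.eigh(W)
print(C.shape, W.shape)  # (400, 40) (40, 40)
```

The payoff is in the sizes: the eigendecomposition is performed on the m-by-m landmark block rather than the full n-by-n kernel, which is what makes the fast algorithms of the abstract feasible for large sample sets.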