Metric Learning: A Knowledge Overview


```mermaid
graph RL
  subgraph 0
    a1[Metric learning] --> |also known as the Mahalanobis metric learning problem| b1[Linear transformation]
    a1[Metric learning] --> b2[Nonlinear transformation]
  end
  subgraph 1
    b1 --> c1[Supervised]
    c1 --> |these algorithms fully exploit the label information of the data| d1[Global]
    c1 --> |these algorithms use both the labels and the geometric relationships between data points| d2[Local]
  end
  subgraph 2
    b1 --> c2[Unsupervised]
  end
  subgraph 3
    d1 --> f1[ITML]
    d1 --> f2[MMC]
    d1 --> f3[MCML]
  end
  subgraph 4
    d2 --> g1[NCA]
    d2 --> g2[LMNN]
    d2 --> g3[RCA]
    d2 --> g4[Local LDA]
  end
  subgraph 5
    c2 --> e1[PCA]
    c2 --> e2[MDS]
    c2 --> e3[NMF]
    c2 --> e4[ICA]
    c2 --> e5[NPE]
    c2 --> e6[LPP]
  end
  subgraph 6
    b2 --> b3[Nonlinear dimensionality reduction]
    b2 --> b4[Kernel methods]
  end
  subgraph 7
    b3 --> h1[ISOMAP]
    b3 --> h2[LLE]
    b3 --> h3[LE]
  end
  subgraph 8
    b4 --> t1[Non-Mahalanobis Local Distance Functions]
    b4 --> t2[Mahalanobis Local Distance Functions]
    b4 --> t3[Metric Learning with Neural Networks]
  end
```
  • ITML: Information-theoretic metric learning
  • MMC: Mahalanobis Metric Learning for Clustering
  • MCML: Maximally Collapsing Metric Learning
  • NCA: Neighbourhood Components Analysis
  • LMNN: Large-Margin Nearest Neighbors
  • RCA: Relevant Component Analysis
  • Local LDA: Local Linear Discriminant Analysis
  • PCA: Principal Component Analysis
  • MDS: Multi-dimensional Scaling
  • NMF: Non-negative Matrix Factorization
  • ICA: Independent Component Analysis
  • NPE: Neighborhood Preserving Embedding
  • LPP: Locality Preserving Projections
  • ISOMAP: Isometric Mapping
  • LLE: Locally Linear Embedding
  • LE: Laplacian Eigenmap
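
For reference, the linear branch in the diagram above is usually phrased as learning a Mahalanobis distance: a positive semi-definite matrix $M = L^\top L$ defines

$$
d_M(x, y) = \sqrt{(x - y)^\top M (x - y)} = \lVert Lx - Ly \rVert_2,
$$

so learning the metric $M$ is equivalent to learning a linear transformation $L$ and then measuring ordinary Euclidean distance in the transformed space.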

A few classic papers

  • Distance metric learning with application to clustering with side-information
  • Information-theoretic metric learning (on ITML)
  • Distance metric learning for large margin nearest neighbor classification (on LMNN)
  • Learning the parts of objects by non-negative matrix factorization (the Nature paper on NMF)
  • Neighbourhood components analysis (on NCA)
  • Metric Learning by Collapsing Classes (on MCML)
  • Distance metric learning: a comprehensive survey (a classic survey)

The Python package metric-learn provides implementations of several of these metric learning methods.
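
As a minimal sketch of how the package is used (assuming a reasonably recent metric-learn release; constructor arguments have changed somewhat across versions), supervised NCA can be fit on labeled data and then used to map points into the learned metric space:

```python
import numpy as np
from metric_learn import NCA
from sklearn.datasets import load_iris

# A small labeled dataset: 150 samples, 4 features, 3 classes.
X, y = load_iris(return_X_y=True)

# Neighbourhood Components Analysis learns a linear transformation L
# such that nearest-neighbor classification works well after the transform.
nca = NCA(max_iter=1000)
nca.fit(X, y)

# Distances between transformed points correspond to the learned
# Mahalanobis distance d_M with M = L^T L.
X_embedded = nca.transform(X)
print(X_embedded.shape)  # (150, 4) unless n_components is reduced
```

Other estimators in the package (e.g. LMNN, ITML_Supervised, RCA_Supervised) follow the same scikit-learn-style fit/transform interface.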

Original post: https://www.cnblogs.com/q735613050/p/9369023.html