Visualizing the MNIST Dataset with Various Dimensionality Reduction Techniques (Part 2)

Posted by Leo on May 28, 2019

Continuing from the previous post.
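All of the snippets below reuse `X`, `y`, and the `plot_digits` helper defined in the previous post. As a rough reference, a minimal setup along those lines might look like the sketch below; the 10,000-sample subsample, the random seed, and the exact plotting style are assumptions here, not necessarily what the previous post used.

import numpy as np
import matplotlib.pyplot as plt
from sklearn.datasets import fetch_openml

# Load MNIST and take a random subsample so the slower methods (MDS, t-SNE) stay tractable
mnist = fetch_openml('mnist_784', version=1, as_frame=False)
np.random.seed(42)
idx = np.random.permutation(len(mnist.data))[:10000]
X = mnist.data[idx] / 255.0
y = mnist.target[idx].astype(int)

def plot_digits(X_2d, y):
    # Scatter the 2-D embedding, colored by digit class
    plt.figure(figsize=(13, 10))
    plt.scatter(X_2d[:, 0], X_2d[:, 1], c=y, cmap="jet", s=5)
    plt.axis('off')
    plt.colorbar()
    plt.show()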

PCA

from sklearn.decomposition import PCA
X_pca_reduced = PCA(n_components=2, random_state=42).fit_transform(X)
plot_digits(X_pca_reduced, y)

Kernel PCA (linear kernel)

from sklearn.decomposition import KernelPCA
# With a linear kernel, Kernel PCA is equivalent to ordinary PCA
lin_pca = KernelPCA(n_components=2, kernel="linear").fit_transform(X)
plot_digits(lin_pca, y)

Kernel PCA (Gaussian/RBF kernel)

# gamma controls the width of the Gaussian (RBF) kernel
rbf_pca = KernelPCA(n_components=2, kernel="rbf", gamma=0.0433).fit_transform(X)
plot_digits(rbf_pca, y)

LLE

from sklearn.manifold import LocallyLinearEmbedding
X_lle_reduced = LocallyLinearEmbedding(n_components=2, random_state=42).fit_transform(X)
plot_digits(X_lle_reduced, y)

LDA

from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
# LDA is supervised, so the class labels y are passed to fit_transform
X_lda_reduced = LinearDiscriminantAnalysis(n_components=2).fit_transform(X, y)
plot_digits(X_lda_reduced, y)

MDS

from sklearn.manifold import MDS
# MDS scales poorly with the number of samples, so this step can be slow
X_mds_reduced = MDS(n_components=2, random_state=42).fit_transform(X)
plot_digits(X_mds_reduced, y)

Isomap

from sklearn.manifold import Isomap
X_reduced_isomap = Isomap(n_components=2).fit_transform(X)
plot_digits(X_reduced_isomap, y)

ICA

from sklearn.decomposition import FastICA
X_reduced_ICA = FastICA(n_components=2).fit_transform(X)
plot_digits(X_reduced_ICA, y)

t-SNE

import matplotlib.pyplot as plt
from sklearn.manifold import TSNE
tsne = TSNE(n_components=2, random_state=42)
X_reduced_tsne = tsne.fit_transform(X)
# Plot the embedding directly instead of using plot_digits
plt.figure(figsize=(13, 10))
plt.scatter(X_reduced_tsne[:, 0], X_reduced_tsne[:, 1], c=y, cmap="jet")
plt.axis('off')
plt.colorbar()
plt.show()

UMAP

import umap.umap_ as umap
# UMAP comes from the separate umap-learn package (pip install umap-learn)
umap_data = umap.UMAP(n_neighbors=5, min_dist=0.3, n_components=2).fit_transform(X)
plot_digits(umap_data, y)

This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License (CC BY-NC-ND 4.0). Please credit the source when reposting, and do not use it for any commercial purpose.
