When the dataset is nonlinear, the kernel trick (a nonlinear mapping from the original space to an inner product space, in which the observations gain a meaningful linear structure) can be combined with PCA to achieve nonlinear dimensionality reduction; the result is called Kernel PCA.
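As a minimal sketch of this idea, assuming scikit-learn is available, the snippet below applies Kernel PCA to a toy nonlinear dataset; the RBF kernel, the two-moons data, and the gamma value are illustrative assumptions, not choices prescribed by the text.

```python
from sklearn.datasets import make_moons
from sklearn.decomposition import PCA, KernelPCA

# Toy nonlinear dataset: two interleaving half-circles.
X, _ = make_moons(n_samples=200, noise=0.05, random_state=0)

# Linear PCA only rotates/projects the data; it cannot "unfold" the moons.
linear_pca = PCA(n_components=2)
X_linear = linear_pca.fit_transform(X)

# Kernel PCA implicitly maps the data into an inner product space via the
# RBF kernel, where the two moons acquire a more linear structure.
kernel_pca = KernelPCA(n_components=2, kernel="rbf", gamma=15)
X_kernel = kernel_pca.fit_transform(X)

print(X_linear.shape, X_kernel.shape)  # (200, 2) (200, 2)
```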
In some cases, we want a sparse set of coefficients, meaning that many of the weights in the linear combination of variables are exactly zero; Sparse PCA can give us that.
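The following sketch, again assuming scikit-learn, shows Sparse PCA producing components with many zero loadings; the digits dataset and the alpha (sparsity penalty) value are illustrative assumptions.

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.decomposition import SparsePCA

X = load_digits().data  # shape (1797, 64)

# A larger alpha drives more of the component weights to exactly zero.
spca = SparsePCA(n_components=5, alpha=1.0, random_state=0)
X_reduced = spca.fit_transform(X)

# Fraction of loadings that are exactly zero across the 5 components.
sparsity = np.mean(spca.components_ == 0)
print(f"fraction of zero loadings: {sparsity:.2f}")
```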
In other cases, we want to find clusters in which rows and columns are grouped together (think bi-clustering); two-way PCA can be helpful there, as illustrated below.
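Two-way PCA itself is not available in scikit-learn; as a loosely related illustration of grouping rows and columns simultaneously, this sketch uses scikit-learn's SpectralCoclustering, which is a different bi-clustering algorithm. The synthetic data generator and its parameters are illustrative assumptions.

```python
import numpy as np
from sklearn.cluster import SpectralCoclustering
from sklearn.datasets import make_biclusters

# Synthetic matrix with a planted block (bi-cluster) structure.
data, rows, cols = make_biclusters(shape=(100, 50), n_clusters=4,
                                   noise=5, random_state=0)

model = SpectralCoclustering(n_clusters=4, random_state=0)
model.fit(data)

# Every row and every column is assigned to one of the 4 bi-clusters.
print(np.bincount(model.row_labels_))
print(np.bincount(model.column_labels_))
```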