Feature transformation techniques reduce the dimensionality of the data by transforming it into new features. Feature selection techniques are preferable when transformation of variables is not possible, e.g., when there are categorical variables in the data. For a feature selection technique that is specifically suitable for least-squares fitting, see Stepwise Regression.
Perform feature selection that is robust to outliers using a custom robust loss function in NCA.
Neighborhood component analysis (NCA) is a non-parametric and embedded method for selecting features with the goal of maximizing prediction accuracy of regression and classification algorithms.
This example shows a complete workflow for feature extraction from image data.
This example shows how to disentangle mixed audio signals.
Feature extraction is a set of methods that derive new, high-level features from raw data.
This example shows how t-SNE creates a useful low-dimensional embedding of high-dimensional data.
This example shows the effects of various t-SNE settings.
t-SNE is a method for visualizing high-dimensional data by nonlinear reduction to two or three dimensions, while preserving some features of the original data.
Output function description and example for t-SNE.
Perform a weighted principal components analysis and interpret the results.
This example shows how to apply Partial Least Squares Regression (PLSR) and Principal Components Regression (PCR), and discusses the effectiveness of the two methods.
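The core idea of PCR can be sketched in a few lines: regress the response on the leading principal component scores of the predictors, then map the coefficients back to the original variable space. The following is an illustrative NumPy sketch (not the toolbox's PLSR/PCR workflow), with hypothetical data and an assumed choice of 3 components.

```python
import numpy as np

# Principal components regression (PCR) sketch, NumPy only.
# Hypothetical data: 50 observations, 6 predictors, noisy linear response.
rng = np.random.default_rng(4)
X = rng.normal(size=(50, 6))
y = X @ np.array([1.0, -2.0, 0.5, 0.0, 0.0, 0.0]) + 0.01 * rng.normal(size=50)

Xc = X - X.mean(axis=0)                  # center the predictors
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
T = Xc @ Vt[:3].T                        # scores on the first 3 components

# Least-squares fit of the centered response on the PC scores
beta_pc, *_ = np.linalg.lstsq(T, y - y.mean(), rcond=None)
beta = Vt[:3].T @ beta_pc                # coefficients in original variable space
y_hat = Xc @ beta + y.mean()
```

Because the components are chosen to explain predictor variance, not response variance, PCR can discard directions that matter for prediction; PLSR addresses this by choosing components that account for covariance with the response.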
Principal Component Analysis reduces the dimensionality of data by replacing several correlated variables with a new set of variables that are linear combinations of the original variables.
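The mechanics of PCA can be illustrated with a short NumPy sketch (not the toolbox's implementation): center the data, take the singular value decomposition, and project onto the leading right singular vectors. The data here is synthetic and the choice of 2 components is an assumption for illustration.

```python
import numpy as np

# PCA via SVD, illustrative sketch on synthetic data.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
X[:, 1] = 0.9 * X[:, 0] + 0.1 * X[:, 1]   # make two variables correlated

Xc = X - X.mean(axis=0)                    # center each variable
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T                     # coordinates in the first 2 PCs
explained = s**2 / np.sum(s**2)            # fraction of variance per component
```

The rows of `Vt` are the principal component loadings: each new variable is a linear combination of the original (correlated) variables, and the components are ordered by the variance they explain.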
Use factor analysis to investigate whether companies within the same sector experience similar week-to-week changes in stock prices.
Factor analysis is a way to fit a model to multivariate data to estimate interdependence of measured variables on a smaller number of unobserved (latent) factors.
Perform nonnegative matrix factorization using the multiplicative and alternating least-squares algorithms.
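The multiplicative-update algorithm mentioned above can be sketched compactly. This is an illustrative NumPy version of the classic Lee–Seung updates for the Frobenius-norm objective, not the toolbox's `nnmf` function; the data, rank, and iteration count are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
V = rng.random((20, 10))          # nonnegative data matrix
k = 3                             # assumed factorization rank
W = rng.random((20, k)) + 0.1     # positive initial factors
H = rng.random((k, 10)) + 0.1

eps = 1e-10                       # guard against division by zero
for _ in range(200):
    # Lee-Seung multiplicative updates; factors stay nonnegative
    H *= (W.T @ V) / (W.T @ W @ H + eps)
    W *= (V @ H.T) / (W @ H @ H.T + eps)

err = np.linalg.norm(V - W @ H, 'fro')
```

Because the updates only multiply by nonnegative ratios, nonnegativity of `W` and `H` is preserved automatically; alternating least squares instead solves each factor exactly and clips negative entries, which often converges in fewer iterations.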
Perform classical (metric) multidimensional scaling, also known as principal coordinates analysis.
Multidimensional scaling allows you to visualize how near points are to each other for many kinds of distance or dissimilarity metrics and can produce a representation of data in a small number of dimensions.
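For the classical (metric) case, the reconstruction from a distance matrix can be sketched directly: double-center the squared distances and take the leading eigenvectors. The NumPy sketch below (not the toolbox's `cmdscale`) uses synthetic 2-D points, so the two-dimensional embedding recovers the distances exactly up to rotation and reflection.

```python
import numpy as np

# Classical MDS sketch: recover a point configuration from distances.
rng = np.random.default_rng(2)
P = rng.normal(size=(8, 2))                            # hidden 2-D points
D = np.linalg.norm(P[:, None, :] - P[None, :, :], axis=-1)

n = D.shape[0]
J = np.eye(n) - np.ones((n, n)) / n                    # centering matrix
B = -0.5 * J @ (D**2) @ J                              # double-centered squared distances
w, V = np.linalg.eigh(B)
idx = np.argsort(w)[::-1][:2]                          # two largest eigenvalues
Y = V[:, idx] * np.sqrt(np.maximum(w[idx], 0))         # embedded coordinates

# Distances among the embedded points match the originals
D2 = np.linalg.norm(Y[:, None, :] - Y[None, :, :], axis=-1)
```

For dissimilarities that are not Euclidean distances, some eigenvalues of `B` can be negative; nonclassical and nonmetric MDS handle such data by minimizing a stress criterion instead.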
Perform nonclassical multidimensional scaling.
Use Procrustes analysis to compare two handwritten numerals.
Procrustes analysis minimizes the differences in location between compared landmark data by using the best shape-preserving Euclidean transformations.
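The optimal transformation has a closed form, sketched below in NumPy (an illustration, not the toolbox's `procrustes` function): after centering both configurations, the best rotation comes from an SVD and the best uniform scale from the singular values. The landmark data here is synthetic by construction, so the alignment is exact.

```python
import numpy as np

# Procrustes sketch: align Y onto X with translation, rotation/reflection,
# and uniform scaling (shape-preserving similarity transformations).
rng = np.random.default_rng(3)
X = rng.normal(size=(6, 2))                       # reference landmarks
theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
Y = 2.0 * X @ R.T + np.array([1.0, -3.0])         # rotated, scaled, shifted copy

Xc = X - X.mean(axis=0)                           # remove location
Yc = Y - Y.mean(axis=0)
U, s, Vt = np.linalg.svd(Yc.T @ Xc)
Q = U @ Vt                                        # optimal rotation/reflection
scale = s.sum() / np.sum(Yc**2)                   # optimal uniform scale
Z = scale * Yc @ Q + X.mean(axis=0)               # Y aligned onto X
residual = np.sum((X - Z)**2)                     # Procrustes dissimilarity
```

The residual sum of squares is the Procrustes statistic: zero when the two configurations have identical shape, as in this constructed example.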