Trait opencv::core::PCATrait

pub trait PCATrait {
    fn as_raw_PCA(&self) -> *const c_void;
    fn as_raw_mut_PCA(&mut self) -> *mut c_void;

    fn eigenvectors(&mut self) -> Mat { ... }
    fn set_eigenvectors(&mut self, mut val: Mat) { ... }
    fn eigenvalues(&mut self) -> Mat { ... }
    fn set_eigenvalues(&mut self, mut val: Mat) { ... }
    fn mean(&mut self) -> Mat { ... }
    fn set_mean(&mut self, mut val: Mat) { ... }
    fn project(&self, vec: &dyn ToInputArray) -> Result<Mat> { ... }
    fn project_to(
        &self,
        vec: &dyn ToInputArray,
        result: &mut dyn ToOutputArray
    ) -> Result<()> { ... }
    fn back_project(&self, vec: &dyn ToInputArray) -> Result<Mat> { ... }
    fn back_project_to(
        &self,
        vec: &dyn ToInputArray,
        result: &mut dyn ToOutputArray
    ) -> Result<()> { ... }
    fn write(&self, fs: &mut FileStorage) -> Result<()> { ... }
    fn read(&mut self, fn_: &FileNode) -> Result<()> { ... }
}

Principal Component Analysis

The class is used to calculate a special basis for a set of vectors. The basis consists of eigenvectors of the covariance matrix computed from the input set of vectors. The class PCA can also transform vectors to/from the new coordinate space defined by the basis. Usually, in this new coordinate system, each vector from the original set (and any linear combination of such vectors) can be quite accurately approximated by taking only its first few components, corresponding to the eigenvectors of the largest eigenvalues of the covariance matrix. Geometrically, this means projecting the vector onto a subspace formed by a few eigenvectors corresponding to the dominant eigenvalues of the covariance matrix, and usually such a projection is very close to the original vector. So, you can represent the original vector from a high-dimensional space with a much shorter vector consisting of the projected vector's coordinates in the subspace. Such a transformation is also known as the Karhunen-Loeve Transform, or KLT. See http://en.wikipedia.org/wiki/Principal_component_analysis

The sample below is a function that takes two matrices. The first matrix stores a set of vectors (one vector per row) that is used to compute the PCA. The second matrix stores another "test" set of vectors (one vector per row). These test vectors are first compressed with PCA, then reconstructed back, and the reconstruction error norm is computed and printed for each vector:

#include <opencv2/core.hpp>
#include <cstdio>

using namespace cv;

PCA compressPCA(const Mat& pcaset, int maxComponents,
                const Mat& testset, Mat& compressed)
{
    PCA pca(pcaset,           // pass the data
            Mat(),            // we do not have a pre-computed mean vector,
                              // so let the PCA engine compute it
            PCA::DATA_AS_ROW, // indicate that the vectors are stored as matrix rows
                              // (use PCA::DATA_AS_COL if the vectors are
                              // the matrix columns)
            maxComponents     // specify how many principal components to retain
            );
    // if there is no test data, just return the computed basis, ready to use
    if( !testset.data )
        return pca;
    CV_Assert( testset.cols == pcaset.cols );

    compressed.create(testset.rows, maxComponents, testset.type());

    // reuse one buffer for the reconstructed vector across iterations
    Mat reconstructed;
    for( int i = 0; i < testset.rows; i++ )
    {
        Mat vec = testset.row(i), coeffs = compressed.row(i);
        // compress the vector; the result is stored
        // in the i-th row of the output matrix
        pca.project(vec, coeffs);
        // and then reconstruct it
        pca.backProject(coeffs, reconstructed);
        // and measure the error
        printf("%d. diff = %g\n", i, norm(vec, reconstructed, NORM_L2));
    }
    return pca;
}
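
For completeness, a minimal C++ sketch of how compressPCA above might be invoked; the matrix sizes, the random fill and the choice of 8 retained components are illustrative assumptions rather than part of the original sample:

#include <opencv2/core.hpp>

using namespace cv;

// compressPCA as defined in the sample above

int main()
{
    // 1000 training vectors and 100 test vectors of dimensionality 64 (illustrative sizes)
    Mat pcaset(1000, 64, CV_32F), testset(100, 64, CV_32F);
    randu(pcaset, Scalar(0), Scalar(1));   // fill both sets with uniform random data
    randu(testset, Scalar(0), Scalar(1));

    Mat compressed;
    PCA pca = compressPCA(pcaset, 8, testset, compressed);   // retain 8 components
    // each row of `compressed` now holds the 8 projection coefficients of one test vector
    return 0;
}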

See also

calcCovarMatrix, mulTransposed, SVD, dft, dct

Required methods

fn as_raw_PCA(&self) -> *const c_void
fn as_raw_mut_PCA(&mut self) -> *mut c_void

Provided methods

fn eigenvectors(&mut self) -> Mat

eigenvectors of the covariance matrix

fn set_eigenvectors(&mut self, mut val: Mat)

eigenvectors of the covariance matrix

fn eigenvalues(&mut self) -> Mat

eigenvalues of the covariance matrix

fn set_eigenvalues(&mut self, mut val: Mat)

eigenvalues of the covariance matrix

fn mean(&mut self) -> Mat

mean value subtracted before the projection and added after the back projection

fn set_mean(&mut self, mut val: Mat)

mean value subtracted before the projection and added after the back projection
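
These getters and setters expose the same state as the public eigenvectors, eigenvalues and mean members of the underlying C++ cv::PCA. A rough C++ sketch of what that state contains (the helper name describePCA and the use of std::cout are illustrative, not part of the API):

#include <opencv2/core.hpp>
#include <iostream>

using namespace cv;

void describePCA(const PCA& pca)
{
    // each row of eigenvectors is one basis vector of the retained subspace
    std::cout << "retained components: " << pca.eigenvectors.rows << "\n";
    // one eigenvalue per retained component, stored in descending order
    std::cout << "eigenvalue count:    " << pca.eigenvalues.total() << "\n";
    // the mean vector is subtracted before projection and added back afterwards
    std::cout << "mean size:           " << pca.mean.size() << "\n";
}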

fn project(&self, vec: &dyn ToInputArray) -> Result<Mat>

Projects vector(s) to the principal component subspace.

The methods project one or more vectors to the principal component subspace, where each vector projection is represented by coefficients in the principal component basis. The first form (project) returns the matrix that the second form (project_to) writes into its result argument. So the first form can be used as part of an expression, while the second form can be more efficient in a processing loop.

Parameters

  • vec: input vector(s); must have the same dimensionality and the same layout as the input data used at the PCA phase, that is, if DATA_AS_ROW was specified, then vec.cols==data.cols (vector dimensionality) and vec.rows is the number of vectors to project; the same is true for the PCA::DATA_AS_COL case.
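
The returning form bound here corresponds to the C++ overload Mat cv::PCA::project(InputArray) const. A short hedged C++ sketch, assuming a PCA computed with PCA::DATA_AS_ROW on CV_32F data; the helper name projectOne and the random sample vector are illustrative:

#include <opencv2/core.hpp>

using namespace cv;

// `pca` is assumed to have been computed with PCA::DATA_AS_ROW
// from training vectors of dimensionality `dims`
Mat projectOne(const PCA& pca, int dims)
{
    Mat sample(1, dims, CV_32F);
    randu(sample, Scalar(0), Scalar(1));   // illustrative input vector
    // the returning form is convenient inside an expression
    Mat coeffs = pca.project(sample);      // 1 x (number of retained components)
    CV_Assert( coeffs.cols == pca.eigenvectors.rows );
    return coeffs;
}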

fn project_to(
    &self,
    vec: &dyn ToInputArray,
    result: &mut dyn ToOutputArray
) -> Result<()>

Projects vector(s) to the principal component subspace.

The methods project one or more vectors to the principal component subspace, where each vector projection is represented by coefficients in the principal component basis. The first form (project) returns the matrix that the second form (project_to) writes into its result argument. So the first form can be used as part of an expression, while the second form can be more efficient in a processing loop.

Parameters

  • vec: input vector(s); must have the same dimensionality and the same layout as the input data used at the PCA phase, that is, if DATA_AS_ROW was specified, then vec.cols==data.cols (vector dimensionality) and vec.rows is the number of vectors to project; the same is true for the PCA::DATA_AS_COL case.

Overloaded parameters

  • vec: input vector(s); must have the same dimensionality and the same layout as the input data used at the PCA phase, that is, if DATA_AS_ROW was specified, then vec.cols==data.cols (vector dimensionality) and vec.rows is the number of vectors to project; the same is true for the PCA::DATA_AS_COL case.
  • result: output vectors; in case of PCA::DATA_AS_COL, the output matrix has as many columns as the number of input vectors; this means that result.cols==vec.cols and the number of rows matches the number of principal components (for example, the maxComponents parameter passed to the constructor).
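
The writing form bound by project_to corresponds to the C++ overload void cv::PCA::project(InputArray, OutputArray) const, which is the one to prefer in a processing loop where the output buffer can be reused. A hedged sketch under those assumptions (projectBatch is an illustrative helper; the data are assumed to be CV_32F rows matching the training layout):

#include <opencv2/core.hpp>

using namespace cv;

// project every row of `batch`; its dimensionality must match the training data
void projectBatch(const PCA& pca, const Mat& batch, Mat& coeffsOut)
{
    coeffsOut.create(batch.rows, pca.eigenvectors.rows, batch.type());
    for( int i = 0; i < batch.rows; i++ )
    {
        Mat dst = coeffsOut.row(i);
        // writing form: coefficients land directly in the i-th output row,
        // so no temporary matrix is allocated per iteration
        pca.project(batch.row(i), dst);
    }
}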

fn back_project(&self, vec: &dyn ToInputArray) -> Result<Mat>

Reconstructs vectors from their PC projections.

The methods are inverse operations to PCA::project. They take PC coordinates of projected vectors and reconstruct the original vectors. Unless all the principal components have been retained, the reconstructed vectors differ from the originals. But typically, the difference is small if the number of components is large enough (yet still much smaller than the original vector dimensionality), which is precisely why PCA is useful for compact representation.

Parameters

  • vec: coordinates of the vectors in the principal component subspace; the layout and size are the same as those of the PCA::project output vectors.
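
In C++ this returning form is Mat cv::PCA::backProject(InputArray) const. A short hedged sketch of the usual project/back-project round trip; the helper name roundTrip and the printed message are illustrative:

#include <opencv2/core.hpp>
#include <cstdio>

using namespace cv;

// measure how much information is lost by keeping only the leading components
void roundTrip(const PCA& pca, const Mat& vec)
{
    Mat coeffs = pca.project(vec);        // coordinates in the principal subspace
    Mat approx = pca.backProject(coeffs); // reconstruction in the original space
    // the residual shrinks as more principal components are retained
    printf("reconstruction error = %g\n", norm(vec, approx, NORM_L2));
}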

fn back_project_to(
    &self,
    vec: &dyn ToInputArray,
    result: &mut dyn ToOutputArray
) -> Result<()>

Reconstructs vectors from their PC projections.

The methods are inverse operations to PCA::project. They take PC coordinates of projected vectors and reconstruct the original vectors. Unless all the principal components have been retained, the reconstructed vectors differ from the originals. But typically, the difference is small if the number of components is large enough (yet still much smaller than the original vector dimensionality), which is precisely why PCA is useful for compact representation.

Parameters

  • vec: coordinates of the vectors in the principal component subspace; the layout and size are the same as those of the PCA::project output vectors.

Overloaded parameters

  • vec: coordinates of the vectors in the principal component subspace; the layout and size are the same as those of the PCA::project output vectors.
  • result: reconstructed vectors; the layout and size are the same as those of the PCA::project input vectors.
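
The writing form bound by back_project_to corresponds to void cv::PCA::backProject(InputArray, OutputArray) const. A hedged sketch that reconstructs a whole batch into a preallocated matrix (reconstructAll is an illustrative helper; the data are assumed to have been projected with PCA::DATA_AS_ROW):

#include <opencv2/core.hpp>

using namespace cv;

// reconstruct every projected row back into the original space
void reconstructAll(const PCA& pca, const Mat& coeffs, Mat& reconstructed)
{
    // for DATA_AS_ROW, eigenvectors.cols is the original dimensionality
    reconstructed.create(coeffs.rows, pca.eigenvectors.cols, coeffs.type());
    for( int i = 0; i < coeffs.rows; i++ )
    {
        Mat dst = reconstructed.row(i);
        // writing form: the i-th reconstructed vector lands directly in `dst`
        pca.backProject(coeffs.row(i), dst);
    }
}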

fn write(&self, fs: &mut FileStorage) -> Result<()>

Write PCA objects

Writes eigenvalues, eigenvectors and mean to the specified FileStorage.
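
In C++ this is void cv::PCA::write(FileStorage&) const. A minimal hedged sketch of persisting a computed basis; the helper name savePCA and the file name pca.yml are arbitrary examples:

#include <opencv2/core.hpp>

using namespace cv;

// persist a computed PCA basis to disk
void savePCA(const PCA& pca)
{
    FileStorage fs("pca.yml", FileStorage::WRITE);
    pca.write(fs);   // stores the eigenvalues, eigenvectors and mean
    fs.release();
}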

fn read(&mut self, fn_: &FileNode) -> Result<()>

Load PCA objects

Loads eigenvalues, eigenvectors and mean from the specified FileNode.
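
In C++ this is void cv::PCA::read(const FileNode&). A matching hedged sketch that restores the basis saved in the previous example, so no recomputation from data is needed; the helper name loadPCA and the file name are again illustrative:

#include <opencv2/core.hpp>

using namespace cv;

// restore a previously saved PCA basis
PCA loadPCA()
{
    PCA pca;
    FileStorage fs("pca.yml", FileStorage::READ);
    pca.read(fs.root());   // reads the eigenvalues, eigenvectors and mean
    fs.release();
    return pca;
}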


Implementors

impl PCATrait for PCA
