Search Results

Pattern recognition via projection-based k-NN rules

We introduce a new procedure for pattern recognition based on the concepts of random projections and nearest neighbors. It can be thought of as an improvement of the classical nearest-neighbor classification rules. Besides the concept of neighbors, we introduce the notion of a district, a larger set which will be projected. We then apply one-dimensional k-NN methods to the data projected onto randomly selected directions. In this way we obtain a method with some robustness properties that handles high-dimensional data more accurately. The procedure is also universally consistent. We test the method on the Isolet data, where we obtain a very high classification score.
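
The abstract describes the procedure only at a high level; the following minimal Python sketch illustrates the general idea of projecting onto random directions and aggregating one-dimensional k-NN votes. The function and parameter names (projection_knn_predict, k, n_directions) and the simple majority-vote aggregation are illustrative assumptions for this sketch, not the paper's exact rule (which also involves the notion of a district).

```python
# Illustrative sketch only: random-direction projections combined with
# one-dimensional k-NN votes.  The paper's actual rule (districts, choice
# of k, aggregation scheme) may differ.
import numpy as np

def projection_knn_predict(X_train, y_train, x_new, k=5, n_directions=50, rng=None):
    """Classify x_new by aggregating 1-D k-NN votes over random projections."""
    rng = np.random.default_rng(rng)
    d = X_train.shape[1]
    votes = []
    for _ in range(n_directions):
        u = rng.standard_normal(d)
        u /= np.linalg.norm(u)               # random direction on the unit sphere
        z_train = X_train @ u                # one-dimensional projected sample
        z_new = x_new @ u
        nearest = np.argsort(np.abs(z_train - z_new))[:k]
        labels, counts = np.unique(y_train[nearest], return_counts=True)
        votes.append(labels[np.argmax(counts)])   # 1-D k-NN vote for this direction
    labels, counts = np.unique(votes, return_counts=True)
    return labels[np.argmax(counts)]              # majority vote over directions

# Example on synthetic data: two shifted Gaussian classes in 10 dimensions.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (100, 10)), rng.normal(1.0, 1.0, (100, 10))])
y = np.repeat([0, 1], 100)
print(projection_knn_predict(X, y, rng.normal(1.0, 1.0, 10), rng=1))
```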

Selection of variables for cluster analysis and classification rules

Principal components for multivariate functional data

A principal component method for multivariate functional data is proposed. The data can be arranged in a matrix whose elements are functions, so that for each individual a vector of p functions is observed. This set of p curves is reduced to a small number of transformed functions, retaining as much information as possible. The criterion used to measure the information loss is the integrated variance. Under mild regularity conditions, it is proved that if the original functions are smooth, this property is inherited by the principal components. A numerical procedure to obtain the smooth principal components is proposed, and the goodness of the dimension reduction is assessed by two new measures of the proportion of explained variability. The method performs as expected on various controlled simulated data sets and provides interesting conclusions when applied to real data sets.
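
As a rough numerical illustration of the kind of dimension reduction described above, the sketch below discretizes each of the p curves on a common grid, approximates the integrated variance with quadrature weights, and performs an ordinary PCA in the resulting weighted inner product. The function name multivariate_fpca, the grid-based discretization, and the trapezoid-style weights are assumptions made for this sketch; it does not reproduce the paper's smoothed procedure or its two explained-variability measures.

```python
# Rough sketch, not the paper's algorithm: PCA for p curves per individual,
# all observed on a common grid, with integrals approximated by quadrature.
import numpy as np

def multivariate_fpca(curves, grid, n_components=2):
    """curves: array of shape (n, p, T), p curves per individual on a grid of length T."""
    n, p, T = curves.shape
    # Quadrature weights so that weighted sums approximate integrals over the grid.
    w = np.empty(T)
    w[0], w[-1] = (grid[1] - grid[0]) / 2, (grid[-1] - grid[-2]) / 2
    w[1:-1] = (grid[2:] - grid[:-2]) / 2
    W = np.tile(w, p)                        # one weight per (curve, grid point) pair
    X = curves.reshape(n, p * T)             # stack the p discretized curves
    Xc = X - X.mean(axis=0)                  # center
    # PCA in the integral inner product via an SVD of the weighted data.
    U, s, Vt = np.linalg.svd(Xc * np.sqrt(W), full_matrices=False)
    eigvals = s ** 2 / (n - 1)               # integrated variance of each component
    explained = eigvals[:n_components] / eigvals.sum()
    scores = U[:, :n_components] * s[:n_components]
    weight_fns = (Vt[:n_components] / np.sqrt(W)).reshape(n_components, p, T)
    return scores, weight_fns, explained

# Example: 50 individuals, p = 2 curves each, driven by a single random amplitude.
t = np.linspace(0.0, 1.0, 100)
rng = np.random.default_rng(0)
a = rng.standard_normal((50, 1))
curves = np.stack([a * np.sin(2 * np.pi * t), a * np.cos(2 * np.pi * t)], axis=1)
scores, weight_fns, explained = multivariate_fpca(curves, t)
print(explained)   # nearly all integrated variance lies in the first component
```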