Infinite Feature Selection
@article{Roffo2015InfiniteFS,
  title   = {Infinite Feature Selection},
  author  = {Giorgio Roffo and Simone Melzi and Marco Cristani},
  journal = {2015 IEEE International Conference on Computer Vision (ICCV)},
  year    = {2015},
  pages   = {4202-4210},
  url     = {https://api.semanticscholar.org/CorpusID:3223980}
}
A feature selection method exploiting the convergence properties of power series of matrices and introducing the concept of infinite feature selection (Inf-FS), which permits the investigation of the importance (relevance and redundancy) of a feature when injected into an arbitrary set of cues.
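The ranking at the heart of Inf-FS has a compact closed form: with features as nodes of a weighted graph, a feature's importance is the summed weight of all paths of any length leaving its node, obtained by summing a convergent geometric series of the adjacency matrix. Below is a minimal NumPy sketch of that step, assuming a generic pairwise adjacency matrix A; the toy construction of A from per-feature spread and pairwise correlation, and the 0.9 scaling factor, are illustrative assumptions rather than the paper's exact weighting.

```python
import numpy as np

def inf_fs_scores(A, alpha=0.9):
    """Score features from a pairwise adjacency matrix A (n_features x n_features).

    Sums the geometric series S = sum_{l>=1} (r*A)^l = (I - r*A)^{-1} - I,
    which converges when the spectral radius of r*A is below 1. The score of
    feature i is the i-th row sum of S: the total weight of paths of any
    length starting at node i in the feature graph.
    """
    n = A.shape[0]
    r = alpha / np.max(np.abs(np.linalg.eigvals(A)))  # keep spectral radius of r*A < 1
    S = np.linalg.inv(np.eye(n) - r * A) - np.eye(n)
    return S.sum(axis=1)                              # higher score = more important

# Toy usage: adjacency built from per-feature spread and pairwise redundancy
# (an illustrative choice, not necessarily the paper's exact weighting).
X = np.random.rand(100, 6)                            # 100 samples, 6 features
sigma = X.std(axis=0) / X.std(axis=0).max()           # normalised standard deviations
rho = np.abs(np.corrcoef(X, rowvar=False))            # |pairwise correlations|
A = 0.5 * np.maximum.outer(sigma, sigma) + 0.5 * (1.0 - rho)
ranking = np.argsort(inf_fs_scores(A))[::-1]          # feature indices, best first
```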
279 Citations
Supervised Infinite Feature Selection
- 2017
Computer Science
This paper builds upon the recently proposed Infinite Feature Selection (IFS) method, in which feature subsets of all sizes (including infinity) are considered, and proposes new ways of forming the feature adjacency matrix that perform better for supervised problems.
Watermelon: a Novel Feature Selection Method Based on Bayes Error Rate Estimation and a New Interpretation of Feature Relevance and Redundancy
- 2021
Computer Science, Mathematics
A novel feature selection method is proposed that scores features by estimating the Bayes error rate with kernel density estimation and dynamically updates the scores by quantitatively interpreting the effects of feature relevance and redundancy in a new way.
Infinite Latent Feature Selection: A Probabilistic Latent Graph-Based Ranking Approach
- 2017
Computer Science
A robust probabilistic latent graph-based feature selection algorithm is proposed that performs the ranking step while considering all possible subsets of features as paths on a graph, bypassing the combinatorial problem analytically.
An Optimal Multi-Level Backward Feature Subset Selection for Object Recognition
- 2019
Computer Science
A novel approach to selecting optimal features for object recognition, based on a multi-level backward feature subset selection (MLBFSS) algorithm, is introduced; it outperforms state-of-the-art methods, as verified on benchmark real-world databases.
Unsupervised Feature Ranking and Selection Based on Autoencoders
- 2019
Computer Science
A simple but efficient unsupervised feature ranking and selection method is proposed that exploits the geometry of the original feature space using autoencoders, scoring each feature by the average reconstruction error over training samples when that feature is ignored and by its contribution in the latent space (the autoencoder's bottleneck).
Evolutionary local improvement on genetic algorithms for feature selection
- 2017
Computer Science
A modified version of a genetic algorithm is proposed, introducing a novel local improvement approach based on evolution, which is able to obtain a better dimensionality-accuracy trade-off.
Diverse Online Feature Selection
- 2018
Computer Science
This model aims to provide diverse features that can be composed in either a supervised or unsupervised framework; experiments demonstrate that the approach yields better compactness and is comparable to, and in some instances outperforms, other state-of-the-art online feature selection methods.
Algorithmic stability and generalization of an unsupervised feature selection algorithm
- 2021
Computer Science
This paper proposes an innovative unsupervised feature selection algorithm that attains stability with provable guarantees, presents an algorithmic stability analysis, and shows that the algorithm has a performance guarantee via a generalization error bound.
Infinite Feature Selection: A Graph-based Feature Filtering Approach
- 2021
Computer Science, Mathematics
The results show that Inf-FS behaves better in almost any situation, that is, both when the number of features to keep is fixed a priori and when deciding the subset cardinality is part of the process.
Multi-label feature selection using geometric series of relevance matrix
- 2022
Computer Science, Mathematics
The proposed method generates an adjacency matrix from pairwise feature relevance and redundancy and then uses the geometric series of the matrix as part of the evaluation function, which makes it possible to consider all possible subsets of the feature space when evaluating a single candidate feature.
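The "geometric series of the matrix" invoked here is the same device that underlies Inf-FS: provided the relevance/adjacency matrix is scaled so that its spectral radius is below one, the infinite sum of its powers has a closed form. A sketch of the identity, with $r$ denoting the (assumed) scaling factor and $\rho$ the spectral radius:

\[
\sum_{l=1}^{\infty} (rA)^{l} \;=\; (I - rA)^{-1} - I, \qquad \rho(rA) < 1 .
\]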