Two step feature selection: Approximate functional dependency approach using membership values

2004-07-29
Uncu, O
Turksen, IB
Feature selection is one of the most important issues in fields such as system modelling and pattern recognition. In this study, a new feature selection algorithm that combines feature wrapper and feature filter approaches is proposed in order to identify the significant input variables in systems with continuous domains. The proposed method utilizes the functional dependency concept and the K-Nearest Neighbourhood method to implement the feature filter and the feature wrapper, respectively. Real-life data commonly contain outliers and noise. In order to make the proposed feature selection algorithm resistant to noise and outliers, approximate functional dependencies are used by utilizing membership values that inherently cope with uncertainty in the data.
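As a rough illustration of the two-step (filter-then-wrapper) idea described in the abstract, the sketch below first ranks features with a filter score and then grows a subset greedily while a K-Nearest Neighbour cross-validation score keeps improving. The paper's approximate functional dependency measure based on fuzzy membership values is not reproduced here; estimated mutual information is used only as a stand-in filter score, and scikit-learn, the load_iris dataset, and the helper two_step_selection are illustrative assumptions, not the authors' implementation.

# Minimal sketch of a two-step (filter + wrapper) feature selection scheme.
# Assumption: mutual information replaces the paper's approximate functional
# dependency score; KNN cross-validation accuracy drives the wrapper step.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.feature_selection import mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier


def two_step_selection(X, y, n_filter=4, k=5):
    # Step 1 (filter): keep the n_filter features with the highest dependency score.
    scores = mutual_info_classif(X, y, random_state=0)
    candidates = list(np.argsort(scores)[::-1][:n_filter])

    # Step 2 (wrapper): forward selection guided by KNN cross-validation accuracy.
    selected, best = [], 0.0
    improved = True
    while improved and candidates:
        improved = False
        for f in list(candidates):
            trial = selected + [f]
            acc = cross_val_score(KNeighborsClassifier(n_neighbors=k),
                                  X[:, trial], y, cv=5).mean()
            if acc > best:
                best, best_f, improved = acc, f, True
        if improved:
            selected.append(best_f)
            candidates.remove(best_f)
    return selected, best


if __name__ == "__main__":
    data = load_iris()
    subset, acc = two_step_selection(data.data, data.target)
    print("selected feature indices:", subset, "cv accuracy: %.3f" % acc)

In the paper's formulation, the filter score would instead be the degree of approximate functional dependency computed from membership values, which is what gives the procedure its tolerance to noise and outliers.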

Suggestions

Rule-by-rule input significance analysis in fuzzy system modeling
Uncu, O; Turksen, IB (2004-06-30)
Input or feature selection is one of the most important steps of system modeling. Elimination of irrelevant variables can save time and money and can improve the precision of the model that we are trying to discover. In Fuzzy System Modeling (FSM) approaches, input selection plays an important role too. The input selection algorithms under our investigation did not consider one crucial fact: an input variable may or may not be significant in a specific rule, rather than in the overall system. In this paper, an input sel...
Interactive evolutionary approaches to multiobjective feature selection
ÖZMEN, müberra; Karakaya, Gülşah; KÖKSALAN, MUSTAFA MURAT (Wiley, 2018-05-01)
In feature selection problems, the aim is to select a subset of features to characterize an output of interest. In characterizing an output, we may want to consider multiple objectives, such as maximizing classification performance and minimizing the number of selected features or their cost. We develop a preference-based approach for multiobjective feature selection problems. Finding all Pareto-optimal subsets may turn out to be a computationally demanding problem, and we would still need to select a single solution. There...
Identifying (Quasi) Equally Informative Subsets in Feature Selection Problems for Classification: A Max-Relevance Min-Redundancy Approach
Karakaya, Gülşah; AHİPAŞAOĞLU, Selin Damla; TAORMİNA, Riccardo (2016-06-01)
An emerging trend in feature selection is the development of two-objective algorithms that analyze the tradeoff between the number of features and the classification performance of the model built with these features. Since these two objectives are conflicting, a typical result is a set of Pareto-efficient subsets, each having a different cardinality and a corresponding discriminating power. However, this approach overlooks the fact that, for a given cardinality, there can be several subsets with sim...
A cluster tree based model selection approach for logistic regression classifier
Tanju, Ozge; Kalaylıoğlu Akyıldız, Zeynep Işıl (Informa UK Limited, 2018-01-01)
Model selection methods are important for identifying the best approximating model. To identify the best meaningful model, the purpose of the model should be clearly pre-stated. The focus of this paper is model selection when the modelling purpose is classification. We propose a new model selection approach designed for logistic regression when the main modelling purpose is classification. The method is based on the distance between the two clustering trees. We also question and evaluate the performanc...
Step down logistic regression for feature selection
Baykal, Nazife (1997-08-01)
This paper proposes a methodology for the feature selection problem in pattern classification. Pattern recognition or signal processing involves two major tasks: clustering transformation and, subsequently, feature selection. The concept of clustering reduces the dimensionality of the measurement space and generates a set of features. However, there is so far no comprehensive theory of how to select discriminative and biologically important features from the pool of generated features. This paper d...
Citation Formats
O. Uncu and I. Turksen, “Two step feature selection: Approximate functional dependency approach using membership values,” 2004, p. 1643, Accessed: 00, 2020. [Online]. Available: https://hdl.handle.net/11511/65789.