Campus Event Calendar

Event Entry

What and Who

Feature Selection with Controlled Redundancy

Rudrasis Chakraborty
Indian Statistical Institute
PhD Application Talk
AG 1, AG 2, AG 3, AG 4, AG 5, SWS, RG1, MMCI  
AG Audience
English

Date, Time and Location

Monday, 27 May 2013
11:00
60 Minutes
E1 4
24
Saarbrücken

Abstract

We first present a feature selection method based on a multi-layer perceptron neural network, called Feature Selection MLP (FSMLP). We explain how FSMLP can select essential features and discard derogatory and indifferent ones. Such a method may pick up several useful but dependent (say, correlated) features, not all of which are needed. We then propose a general scheme for dealing with this problem. The proposed scheme can select features with a controlled level of redundancy, both for classification and for function approximation/prediction problems. The idea is general in nature and can be used with other learning schemes as well. We demonstrate the effectiveness of the algorithm on several data sets, including a synthetic one, and show that the selected features are adequate to solve the problem at hand. Here we use a measure of linear dependency to control the redundancy; the extension to nonlinear measures of dependency, such as mutual information, is straightforward. We have also proposed a new, more effective training scheme.

The proposed scheme has several advantages. It does not require explicit evaluation of feature subsets, because feature selection is integrated into the design of the decision-making system; hence it can consider all features together and pick up whatever is necessary. It can account for subtle nonlinear interactions among features, as well as between the features, the learning tool, and the problem being solved. It can also control the level of redundancy in the selected features.
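The abstract does not give the algorithmic details, but the core idea (learnable per-feature attenuators trained jointly with the classifier, plus a penalty that discourages jointly selecting linearly dependent features) can be sketched as follows. This is a minimal illustrative sketch, not the authors' FSMLP: it uses plain logistic regression instead of an MLP, sigmoid gates as the attenuators, and absolute Pearson correlation as the linear-dependency measure; all names and the penalty form are our assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y depends on x0; x1 is a noisy copy of x0 (redundant); x2 is noise.
n = 400
x0 = rng.normal(size=n)
X = np.column_stack([x0, x0 + 0.1 * rng.normal(size=n), rng.normal(size=n)])
y = (x0 > 0).astype(float)

# Absolute Pearson correlation as the (assumed) linear-dependency measure.
R = np.abs(np.corrcoef(X, rowvar=False))
np.fill_diagonal(R, 0.0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

d = X.shape[1]
g = np.zeros(d)                  # gate parameters; attenuator a_i = sigmoid(g_i)
w = rng.normal(scale=0.1, size=d)  # classifier weights (logistic regression here)
b = 0.0
lr, lam = 0.5, 0.05              # lam sets the allowed level of redundancy

for _ in range(500):
    a = sigmoid(g)               # per-feature attenuators in (0, 1)
    Xg = X * a                   # gated inputs fed to the classifier
    p = sigmoid(Xg @ w + b)
    err = p - y                  # gradient of cross-entropy w.r.t. the logit
    gw = Xg.T @ err / n
    gb = err.mean()
    # Gradient w.r.t. the attenuators: data term plus redundancy penalty
    # penalty = lam * sum_{i != j} a_i * a_j * |corr_ij|
    ga = w * (X.T @ err / n) + 2.0 * lam * (R @ a)
    gg = ga * a * (1.0 - a)      # chain rule through the sigmoid gate
    w -= lr * gw
    b -= lr * gb
    g -= lr * gg

a = sigmoid(g)                   # final attenuators: large = selected feature
```

Because selection happens inside the training loop, no explicit enumeration of feature subsets is needed; raising `lam` trades predictive value against redundancy among the gated features.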


Aaron Alsancak, 05/21/2013 14:05
Aaron Alsancak, 05/21/2013 13:59 -- Created document.