Toward Optimal Feature Selection to Improve Classification Accuracy

Omar A. A. Shiba

Abstract

The representation of data in the fields of data mining and machine learning often uses many features, only a few of which may be related to the target concept. Many feature subset selection (FSS) algorithms have been proposed, but not all of them are appropriate for a given feature selection problem. The purpose of this paper is to reduce the number of features by removing irrelevant ones. Choosing a subset of the features may improve classification accuracy. To achieve this, a slicing technique is used to reduce the number of features in several selected datasets. The experimental results indicate that the proposed feature selection scheme performs very well compared with other approaches commonly used in the feature selection task: RELIEF with the base learning algorithm C4.5, RELIEF with K-Nearest Neighbour (K-NN), and RELIEF with the decision tree induction algorithm ID3.
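The abstract does not detail the slicing technique itself, but RELIEF, the baseline it compares against, is a standard feature-weighting algorithm. Below is a minimal sketch of RELIEF-style feature weighting for a numeric, binary-class dataset; the function name, parameters, and the top-k selection step are illustrative, not taken from the paper.

```python
import numpy as np

def relief_weights(X, y, n_iterations=100, random_state=0):
    """RELIEF-style feature weighting (illustrative sketch).

    Assumes X is an (n_samples, n_features) array of numeric features
    scaled to [0, 1] and y holds binary class labels.
    """
    rng = np.random.default_rng(random_state)
    n_samples, n_features = X.shape
    weights = np.zeros(n_features)

    for _ in range(n_iterations):
        i = rng.integers(n_samples)
        target, label = X[i], y[i]

        # Manhattan distance from the sampled instance to all instances.
        dists = np.abs(X - target).sum(axis=1)
        dists[i] = np.inf  # exclude the sampled instance itself

        same = np.where(y == label)[0]
        other = np.where(y != label)[0]
        nearest_hit = same[np.argmin(dists[same])]
        nearest_miss = other[np.argmin(dists[other])]

        # Reward features that separate the classes, penalise those that do not.
        weights -= np.abs(target - X[nearest_hit]) / n_iterations
        weights += np.abs(target - X[nearest_miss]) / n_iterations

    return weights

# Usage sketch: keep the k highest-weighted features before training a classifier.
# X, y = load_dataset()                      # hypothetical loader
# k = 10
# selected = np.argsort(relief_weights(X, y))[::-1][:k]
# X_reduced = X[:, selected]
```

Any feature subset chosen this way would then be passed to the downstream learner (C4.5, K-NN, or ID3 in the paper's comparison) in place of the full feature set.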


Authors

Omar A. A. Shiba

Article Details

How to Cite

Toward Optimal Feature Selection to Improve Classification Accuracy. (2019). Journal of Pure & Applied Sciences, 18(4). https://doi.org/10.51984/jopas.v18i4.405
