Publication:
Feature selection via computational intelligence techniques

dc.contributor.authors: Algin, Ramazan; Alkaya, Ali Fuat; Agaoglu, Mustafa
dc.date.accessioned: 2022-03-12T22:40:23Z
dc.date.available: 2022-03-12T22:40:23Z
dc.date.issued: 2020
dc.description.abstract: Feature selection (FS) has become an essential task in tackling high-dimensional and complex machine learning problems. FS is a process for reducing the size of a dataset by removing unnecessary and irrelevant features from it. This process improves the performance of classification algorithms and reduces evaluation time by enabling the use of small datasets with useful features during the classification process. FS aims to obtain a minimal feature subset in a problem domain while retaining the accuracy of the original data. In this study, four computational intelligence techniques, namely migrating birds optimization (MBO), simulated annealing (SA), differential evolution (DE) and particle swarm optimization (PSO), are implemented for the FS problem as search algorithms and compared on 17 well-known datasets taken from the UCI machine learning repository, where the dimension of the tackled datasets varies from 4 to 500. This is the first time that MBO is applied to the FS problem. In order to judge the quality of the subsets generated by the search algorithms, two different subset evaluation methods are implemented in this study: probabilistic consistency-based FS (PCFS) and correlation-based FS (CFS). Performance comparison of the algorithms is done using three well-known classifiers: k-nearest neighbor, naive Bayes and decision tree (C4.5). As a benchmark, the accuracy values found by the classifiers on the datasets with all features are used. Results of the experiments show that our MBO-based filter approach outperforms the other three approaches in terms of accuracy. In the experiments, it is also observed that, as a subset evaluator, CFS outperforms PCFS and, as a classifier, C4.5 gets better results than k-nearest neighbor and naive Bayes.
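To illustrate the filter approach the abstract describes, the sketch below pairs a subset search with a CFS-style merit score (mean feature-class correlation, penalized by feature-feature redundancy). It is a minimal, hypothetical example: the paper uses metaheuristic searches (MBO, SA, DE, PSO), whereas this sketch uses a simple greedy forward search, and the toy data and function names are invented for illustration, not taken from the authors' implementation.

```python
# Minimal sketch of filter-based feature selection with a CFS-style merit.
# NOT the authors' code: greedy forward search stands in for the paper's
# metaheuristics (MBO, SA, DE, PSO), and the data below is synthetic.
import math

def pearson(x, y):
    # Pearson correlation of two equal-length numeric sequences.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    if vx == 0 or vy == 0:
        return 0.0
    return cov / math.sqrt(vx * vy)

def cfs_merit(subset, columns, target):
    # CFS merit: k * r_cf / sqrt(k + k*(k-1) * r_ff), where r_cf is the
    # mean feature-class correlation and r_ff the mean feature-feature
    # correlation over the subset (higher merit = relevant, non-redundant).
    k = len(subset)
    if k == 0:
        return 0.0
    r_cf = sum(abs(pearson(columns[i], target)) for i in subset) / k
    if k == 1:
        return r_cf
    pairs = [(i, j) for i in subset for j in subset if i < j]
    r_ff = sum(abs(pearson(columns[i], columns[j])) for i, j in pairs) / len(pairs)
    return k * r_cf / math.sqrt(k + k * (k - 1) * r_ff)

def forward_select(columns, target):
    # Greedily add the feature that most improves the merit; stop when
    # no addition helps. A metaheuristic would explore subsets instead.
    selected, best = [], 0.0
    while True:
        candidates = [(cfs_merit(selected + [f], columns, target), f)
                      for f in range(len(columns)) if f not in selected]
        if not candidates:
            break
        merit, f = max(candidates)
        if merit <= best:
            break
        selected, best = selected + [f], merit
    return selected, best

# Toy data: feature 0 predicts the target, feature 1 is a noisy duplicate
# of it (redundant), feature 2 is noise. CFS should keep only feature 0.
cols = [[1, 2, 3, 4, 5, 6], [1, 2, 3, 5, 4, 6], [2, 1, 2, 1, 2, 1]]
y = [1, 2, 3, 4, 5, 6]
print(forward_select(cols, y))  # the redundant and noisy features are dropped
```

The redundancy penalty in the denominator is what distinguishes CFS from ranking features by class correlation alone: the noisy duplicate of feature 0 is individually predictive, yet adding it lowers the merit.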
dc.identifier.doi: 10.3233/JIFS-189090
dc.identifier.eissn: 1875-8967
dc.identifier.issn: 1064-1246
dc.identifier.uri: https://hdl.handle.net/11424/235952
dc.identifier.wos: WOS:000595520600020
dc.language.iso: eng
dc.publisher: IOS PRESS
dc.relation.ispartof: JOURNAL OF INTELLIGENT & FUZZY SYSTEMS
dc.rights: info:eu-repo/semantics/closedAccess
dc.subject: Feature selection
dc.subject: computational intelligence
dc.subject: dimensionality reduction
dc.subject: meta-heuristics
dc.subject: classification algorithms
dc.subject: subset evaluators
dc.subject: PARTICLE SWARM OPTIMIZATION
dc.subject: FEATURE SUBSET-SELECTION
dc.subject: DIFFERENTIAL EVOLUTION
dc.subject: ROUGH SETS
dc.subject: SEARCH
dc.subject: ALGORITHMS
dc.title: Feature selection via computational intelligence techniques
dc.type: article
dspace.entity.type: Publication
oaire.citation.endPage: 6216
oaire.citation.issue: 5
oaire.citation.startPage: 6205
oaire.citation.title: JOURNAL OF INTELLIGENT & FUZZY SYSTEMS
oaire.citation.volume: 39
