Boosting Effect of Classifier Based on Simple Granules of Knowledge
Abstract. The idea of classification based on simple granules of knowledge (the CSG classifier) is inspired by the granular structures proposed by Polkowski. The simple granular classifier has turned out to be highly effective in the context of real data classification; among other properties, it is resistant to data damage and can absorb missing values. In this work we present a continuation of our series of experiments with boosting of rough set classifiers. We examine several classifier stabilization methods in the context of the CSG classifier: Bootstrap Ensemble (simple bagging), boosting based on Arcing, and Ada-Boost with Monte Carlo split. We have performed experiments on selected data sets from the UCI Repository. For smaller radii we obtain large, imprecise classification granules; the best result was obtained for the first radius after 0.5, where the size of the classification granules was about 4.5 percent of the size of the original decision system. For higher radii, starting from 0.642857, the classification granules are very small and the classification quality deteriorates.
*Extended version of the paper "Ensemble of Classifiers Based on Simple Granules of Knowledge" presented at the 23rd International Conference on Information and Software Technologies (ICIST 2017), held on 12-14 October 2017 in Druskininkai, Lithuania.
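To illustrate the ensemble idea behind the stabilization methods named in the abstract, the sketch below implements simple bagging (bootstrap sampling plus majority voting) in plain Python. The CSG classifier itself is not reproduced here; a nearest-centroid classifier stands in as a hypothetical base learner, so the function names and the toy data are assumptions for illustration only, not the authors' implementation.

```python
import random
from collections import Counter

def nearest_centroid_fit(rows):
    # rows: list of (feature_vector, label); stand-in base learner,
    # NOT the CSG classifier -- it just computes one centroid per class.
    sums, counts = {}, {}
    for x, y in rows:
        counts[y] = counts.get(y, 0) + 1
        sums[y] = [a + b for a, b in zip(sums.get(y, [0.0] * len(x)), x)]
    return {y: [s / counts[y] for s in sums[y]] for y in sums}

def nearest_centroid_predict(model, x):
    # Predict the class whose centroid is closest in squared Euclidean distance.
    return min(model, key=lambda y: sum((a - b) ** 2 for a, b in zip(model[y], x)))

def bagging_fit(rows, n_estimators=15, seed=0):
    # Simple bagging: train each base learner on a bootstrap sample
    # (drawn with replacement, same size as the original training set).
    rng = random.Random(seed)
    models = []
    for _ in range(n_estimators):
        sample = [rows[rng.randrange(len(rows))] for _ in range(len(rows))]
        models.append(nearest_centroid_fit(sample))
    return models

def bagging_predict(models, x):
    # Aggregate the committee by majority voting over the base learners.
    votes = Counter(nearest_centroid_predict(m, x) for m in models)
    return votes.most_common(1)[0][0]
```

Arcing and Ada-Boost differ from this scheme mainly in that the sampling (or example weighting) is adapted between rounds to emphasize previously misclassified objects, rather than drawn uniformly as above.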