Oblique Decision Tree Algorithm with Minority Condensation for Class Imbalanced Problem
In recent years, a significant issue in classification has been handling datasets with an imbalanced number of instances in each class. Classifier modification is one of the well-known techniques for dealing with this issue. In this paper, an effective classification model based on an oblique decision tree is enhanced to work with imbalanced datasets; it is called the oblique minority condensed decision tree (OMCT). Initially, it selects the best axis-parallel hyperplane based on a decision tree algorithm using the minority entropy of instances within the minority inner fence. Then it perturbs this hyperplane along each axis to improve its minority entropy. Finally, it perturbs the hyperplane stochastically to escape local solutions. Experimental results on 18 real-world datasets from the UCI repository show that OMCT significantly outperforms six state-of-the-art decision tree algorithms (CART, C4.5, OC1, AE, DCSM and ME) in terms of precision, recall and F1 score. Moreover, the decision trees produced by OMCT are significantly smaller than those of the other algorithms.
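The search described in the abstract (pick a split hyperplane, nudge it coefficient by coefficient, then jolt it randomly to escape local optima) can be sketched as follows. This is a minimal illustration only: the impurity function here is plain class entropy standing in for the paper's minority entropy, and the function names, step sizes, and number of random restarts are all assumptions, not the authors' implementation.

```python
import numpy as np

def split_entropy(X, y, w, b):
    """Impurity of the binary split induced by the hyperplane w.x + b >= 0.

    NOTE: ordinary weighted class entropy is used here as a stand-in for
    the minority entropy defined in the paper.
    """
    side = X @ w + b >= 0
    total = 0.0
    for mask in (side, ~side):
        n = mask.sum()
        if n == 0:
            continue
        p = np.bincount(y[mask], minlength=2) / n
        p = p[p > 0]
        total += (n / len(y)) * -(p * np.log2(p)).sum()
    return total

def perturb_oblique(X, y, w, b, rng, step=0.1, n_random=20):
    """Improve a hyperplane (w, b) in two phases, mirroring the abstract:
    1) deterministic: perturb one coefficient (axis) at a time;
    2) stochastic: random jolts to escape a local solution.
    """
    best = split_entropy(X, y, w, b)
    for j in range(len(w)):                      # phase 1: axis perturbation
        for delta in (-step, step):
            w2 = w.copy()
            w2[j] += delta
            e = split_entropy(X, y, w2, b)
            if e < best:
                w, best = w2, e
    for _ in range(n_random):                    # phase 2: stochastic escape
        w2 = w + rng.normal(scale=step / 2, size=len(w))
        b2 = b + rng.normal(scale=step / 2)
        e = split_entropy(X, y, w2, b2)
        if e < best:
            w, b, best = w2, b2, e
    return w, b, best
```

In a full tree builder this perturbation step would run at every internal node, starting from the best axis-parallel split, with recursion stopping once the impurity of a node falls below a threshold.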