Machine Learning (ML) techniques are evolving from basic methods to optimized ones, considerably improving the performance of prediction models. The proposed work first explores fundamental ML classification methods to classify banking customers, based on their credit information, into five categories: Outstanding, Excellent, Good, Satisfactory, and Bad. The aim is to identify profitable customer categories and improve business outcomes by targeting resources accordingly. The basic classification algorithms used in the proposed work are the K-Nearest Neighbour (K-NN), Support Vector Machine (SVM), Decision Tree (DT), and Random Forest (RF) classifiers. The performance of these classifiers is evaluated using standard metrics, and the comparative analysis shows that the metrics need to be improved. To address this, hyperparameter optimization with GridSearchCV (HGSCV), known for its exhaustive search capability, is adopted; however, applying HGSCV improves the accuracy scores only slightly. The analysis then moves to an advanced meta-heuristic optimization technique, Particle Swarm Optimization (PSO), in which the GlobalBestPSO method is implemented to tune the classifiers. The optimized classifiers, GlobalBestPSO-SVM (gbestPSO-SVM), GlobalBestPSO-KNN (gbestPSO-KNN), GlobalBestPSO-DT (gbestPSO-DT), and GlobalBestPSO-RF (gbestPSO-RF), are evaluated over the chosen set of parameters. The comparison of test results demonstrates outstanding performance for the optimized method, with accuracy exceeding 0.95. The proposed hybrid model, integrating GlobalBestPSO with the basic classifiers, outperforms both the traditional classifiers and the HGSCV-tuned models.
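As a minimal sketch of the HGSCV step described above, the following shows an exhaustive grid search over an SVM's hyperparameters with scikit-learn's GridSearchCV. The synthetic five-class dataset and the particular parameter grid are illustrative assumptions, not the paper's actual data or search space.

```python
# Illustrative sketch of exhaustive hyperparameter search (HGSCV).
# The dataset and grid below are assumptions for demonstration only.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# Synthetic stand-in for the five-category credit dataset.
X, y = make_classification(n_samples=300, n_features=10, n_informative=6,
                           n_classes=5, random_state=0)

# Every combination in the grid is evaluated with 3-fold cross-validation.
param_grid = {"C": [0.1, 1, 10], "kernel": ["rbf", "linear"]}
search = GridSearchCV(SVC(), param_grid, cv=3, scoring="accuracy")
search.fit(X, y)

print(search.best_params_)  # best combination found by the exhaustive search
```

The exhaustiveness noted in the text is also the method's cost: the number of fits grows multiplicatively with each added hyperparameter value.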
The analysis concludes that the classifiers optimized with GlobalBestPSO offer superior performance to the others across all metrics.
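The global-best PSO update underlying the gbestPSO-tuned classifiers can be sketched from scratch as below. This is a minimal NumPy illustration of the velocity and position updates on a toy objective (the sphere function), not the paper's implementation; the swarm size, coefficients, and bounds are assumptions chosen for demonstration.

```python
# Minimal from-scratch sketch of global-best PSO on a toy objective.
# All constants here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def sphere(x):
    # Toy cost function: sum of squares per particle (minimum at the origin).
    return (x ** 2).sum(axis=1)

n_particles, dims, iters = 20, 2, 100
w, c1, c2 = 0.9, 0.5, 0.3          # inertia, cognitive, social coefficients

pos = rng.uniform(-5, 5, (n_particles, dims))
vel = np.zeros_like(pos)
pbest, pbest_cost = pos.copy(), sphere(pos)   # per-particle bests
gbest = pbest[pbest_cost.argmin()].copy()     # swarm-wide (global) best
gbest_cost = pbest_cost.min()

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dims))
    # Velocity update: inertia + pull toward personal best + pull toward global best.
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    cost = sphere(pos)
    improved = cost < pbest_cost
    pbest[improved], pbest_cost[improved] = pos[improved], cost[improved]
    if pbest_cost.min() < gbest_cost:
        gbest_cost = pbest_cost.min()
        gbest = pbest[pbest_cost.argmin()].copy()
```

To tune a classifier as in gbestPSO-SVM, the toy cost function would be replaced by, for example, one minus the cross-validated accuracy evaluated at each particle's hyperparameter position.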