Enhanced grey wolf optimisation algorithm for feature selection in anomaly detection



The copyright © of this thesis belongs to its rightful author and/or other copyright owner. Copies can be accessed and downloaded for non-commercial or learning purposes without any charge or permission. The thesis cannot be reproduced or quoted as a whole without permission from its rightful owner. No alteration or change in format is allowed without permission from its rightful owner.








Permission to Use

In presenting this thesis in fulfilment of the requirements for a postgraduate degree from Universiti Utara Malaysia, I agree that the University Library may make it freely available for inspection. I further agree that permission for the copying of this thesis in any manner, in whole or in part, for a scholarly purpose may be granted by my supervisor(s) or, in their absence, by the Dean of Awang Had Salleh Graduate School of Arts and Sciences. It is understood that any copying or publication or use of this thesis or parts thereof for financial gain is not allowed without my written permission. It is also understood that due recognition shall be given to me and to Universiti Utara Malaysia for any scholarly use which may be made of any material from my thesis.

Requests for permission to copy or to make other uses of materials in this thesis, in whole or in part, should be addressed to:

Dean of Awang Had Salleh Graduate School of Arts and Sciences
UUM College of Arts and Sciences
Universiti Utara Malaysia
06010 UUM Sintok




Anomaly detection is concerned with identifying items that do not conform to an expected pattern or to other items present in a dataset. The performance of the different mechanisms used to perform anomaly detection depends heavily on the set of features used. Therefore, not all features in a dataset can be used in the classification process, since some features can lead to poor classifier performance.

Feature selection (FS) is a good mechanism that reduces the dimensionality of high-dimensional datasets by removing irrelevant features. The Modified Binary Grey Wolf Optimiser (MBGWO) is a modern metaheuristic algorithm that has been successfully applied to FS for anomaly detection. However, the MBGWO has several issues in finding good quality solutions. Therefore, this study proposes an enhanced binary grey wolf optimiser (EBGWO) algorithm for FS in anomaly detection to overcome these issues. The first modification enhances the initial population of the MBGWO using a heuristic-based Ant Colony Optimisation algorithm. The second modification develops a new position update mechanism using the Bat Algorithm movement. The third modification improves the controlled parameter of the MBGWO algorithm using indicators from the search process to refine the solution. The EBGWO algorithm was evaluated on the NSL-KDD dataset and six (6) benchmark datasets from the University of California Irvine (UCI) repository against ten (10) benchmark metaheuristic algorithms. The experimental results of the EBGWO algorithm on the NSL-KDD dataset, in terms of the number of selected features and classification accuracy, are better than those of the other benchmark optimisation algorithms.

In addition, experiments on the six (6) UCI datasets showed that the EBGWO algorithm is superior to the benchmark algorithms in terms of classification accuracy and second best for the number of selected features. The proposed EBGWO algorithm can be used for FS in anomaly detection tasks involving datasets of any size from various application domains.

Kata kunci: Metaheuristik, Grey wolf optimiser, Pemilihan ciri, Pengelasan, Pengesanan Anomali




Anomaly detection deals with the identification of items that do not conform to an expected pattern or to other items present in a dataset. The performance of the different mechanisms used to perform anomaly detection depends heavily on the group of features used. Thus, not all features in the dataset can be used in the classification process, since some features may lead to low classifier performance. Feature selection (FS) is a good mechanism that minimises the dimension of high-dimensional datasets by deleting the irrelevant features. The Modified Binary Grey Wolf Optimiser (MBGWO) is a modern metaheuristic algorithm that has successfully been used for FS for anomaly detection. However, the MBGWO has several issues in finding a good quality solution. Thus, this study proposes an enhanced binary grey wolf optimiser (EBGWO) algorithm for FS in anomaly detection to overcome these issues. The first modification enhances the initial population of the MBGWO using a heuristic-based Ant Colony Optimisation algorithm. The second modification develops a new position update mechanism using the Bat Algorithm movement. The third modification improves the controlled parameter of the MBGWO algorithm using indicators from the search process to refine the solution. The EBGWO algorithm was evaluated on NSL-KDD and six (6) benchmark datasets from the University of California Irvine (UCI) repository against ten (10) benchmark metaheuristic algorithms. Experimental results of the EBGWO algorithm on the NSL-KDD dataset, in terms of the number of selected features and classification accuracy, are superior to those of the other benchmark optimisation algorithms. Moreover, experiments on the six (6) UCI datasets showed that the EBGWO algorithm is superior to the benchmark algorithms in terms of classification accuracy and second best for the number of selected features. The proposed EBGWO algorithm can be used for FS in anomaly detection tasks that involve any dataset size from various application domains.

Keywords: Metaheuristic, Grey wolf optimiser, Feature selection, Classification, Anomaly detection
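As a rough illustration of the wrapper feature-selection search described in the abstract, the following is a minimal sketch of a binary grey wolf optimiser. It is not the thesis's EBGWO implementation: the sigmoid transfer slope, population size, iteration count, and the toy fitness function are all illustrative assumptions.

```python
import math
import random

def binary_gwo(fitness, dim, wolves=8, iters=30, seed=1):
    """Minimal binary grey wolf optimiser: each wolf is a 0/1 feature mask.
    Positions follow the three best wolves (alpha, beta, delta) and are
    re-binarised with a sigmoid transfer function."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(dim)] for _ in range(wolves)]
    for t in range(iters):
        ranked = sorted(pop, key=fitness, reverse=True)
        alpha, beta, delta = ranked[0], ranked[1], ranked[2]
        a = 2.0 * (1 - t / iters)  # control parameter, decreases from 2 to 0
        nxt = []
        for wolf in pop:
            new = []
            for d in range(dim):
                x = 0.0
                for leader in (alpha, beta, delta):
                    A = 2 * a * rng.random() - a      # exploration/exploitation weight
                    C = 2 * rng.random()
                    D = abs(C * leader[d] - wolf[d])  # distance to this leader
                    x += leader[d] - A * D
                x /= 3.0  # average of the three leader-guided moves
                prob = 1.0 / (1.0 + math.exp(-10.0 * (x - 0.5)))  # sigmoid transfer
                new.append(1 if rng.random() < prob else 0)
            nxt.append(new)
        pop = nxt
    return max(pop, key=fitness)

# Toy wrapper fitness: reward two "relevant" features, penalise subset size.
# In the thesis, this role is played by classifier accuracy (e.g., KNN or SVM).
RELEVANT = {0, 2}
def toy_fitness(mask):
    return sum(mask[i] for i in RELEVANT) - 0.1 * sum(mask)

best = binary_gwo(toy_fitness, dim=6)
```

The three modifications summarised above would slot into this skeleton at the population initialisation, the per-dimension position update, and the schedule of the control parameter `a`, respectively.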




“In the name of Allah, the Most Gracious, the Most Merciful”

All praises and thanks to the Almighty, Allah (SWT), who helped me to finish this study by giving me the opportunity, determination and strength to do my research.

There are several people who made this thesis possible and to whom I wish to express my gratitude. Firstly, I would like to particularly thank my supervisor, Prof. Dr. Ku Ruhana Ku Mahamud, for her care, continuous support, encouragement, patience, generosity, and advice during my study at Universiti Utara Malaysia (UUM).

Secondly, to my parents, Fouad Almazini and Kadhimiyah, my wonderful brothers, Aymen and Hassan, my lovely sister, Ayat, and my relatives in Iraq. It would not have been possible for me to complete the study and this dissertation without the support and encouragement from you all.

I am forever indebted to you all. Your unconditional and selfless support has been the foundation of my life.

Thirdly, a part of this project would not exist without the help of my friends, who have given me encouragement and strength. Thank you, Dr. Rafid Sagban from the University of Babylon, and Dr. Hayder Naser and Dr. Ayad Mohammed from the Shatt Al-Arab University College. I would also like to thank my great friends Salah Mortada, Harith, Ammar, Jawad, and Hassan for standing beside me and giving me support throughout all periods of my study.

Finally, my gratitude is extended to my colleagues and my teachers in the Ministry of Higher Education, especially at Shatt Al-Arab University College and Universiti Utara Malaysia, for the innumerable discussions, suggestions, criticisms, and help during my study that created a stimulating and enjoyable atmosphere to work in.

Hussein Fouad Abbas Almazini - 2022



Table of Contents

Permission to Use ... i

Abstrak ... ii

Abstract ... iii

Acknowledgement ... iv

Table of Contents ... v

List of Tables ... viii

List of Figures ... xi

List of Abbreviations ... xiii


1.1 Problem Statement ... 8

1.2 Research Questions ... 10

1.3 Research Objectives ... 11

1.4 Significance of the Study ... 12

1.5 Scope of the Study ... 12

1.6 Thesis Organisation ... 13


2.1 Anomaly Detection ... 15

2.1.1 Anomaly Detection by Machine Learning ... 19

2.1.2 Feature Selection for Anomaly Detection ... 25

2.2 Feature Selection ... 25

2.2.1 Wrapper Feature Selection ... 28

2.2.2 Filter Feature Selection ... 29

2.2.3 Embedded Feature Selection ... 30

2.2.4 Discussion ... 30

2.3 Bio-inspired Optimisation Algorithms for Feature Selection ... 32

2.4 Grey Wolf Optimiser... 39

2.5 Initial Population ... 43

2.6 Position Update Mechanism ... 49

2.7 Controlled Parameter ... 57



2.8 Summary ... 66


3.1 Research Framework ... 67

3.1.1 Initial Population ... 69

3.1.2 Position Update Mechanism ... 72

3.1.3 Controlled Parameter ... 75

3.2 Performance Evaluation ... 78

3.2.1 Evaluation Dataset ... 79

3.2.2 Comparison Phases ... 83

3.2.3 Evaluation Metric ... 86

3.3 Summary ... 87


4.1 Introduction ... 89

4.2 ACO Heuristic-based Initial Population Mechanism ... 90

4.3 Position Update Mechanism ... 95

4.4 Calculation method of Controlled Parameter ... 102

4.5 Enhanced Binary Grey Wolf Optimisation Algorithm ... 105

4.6 Summary ... 109


5.1 Introduction ... 110

5.2 Experimental Design ... 110

5.3 Results and Analysis of Heuristic Initialisation ... 114

5.3.1 Experiment on the Full NSL-KDD Dataset ... 115

5.3.2 Experiment on the Subset NSL-KDD Dataset ... 117

5.4 Results and Analysis of Position Update Mechanism for MBGWO ... 127

5.4.1 Experiment on the Full NSL-KDD Dataset ... 128

5.4.2 Experiment on the Subset NSL-KDD Dataset ... 131

5.5 Results and Analysis of Controlled Parameter... 140

5.5.1 Experiment on the Full NSL-KDD Dataset ... 141



5.5.2 Experiment on the Subset NSL-KDD Dataset ... 143

5.6 Results and Analysis of the EBGWO Algorithm ... 156

5.6.1 Performance of EBGWO on the Full NSL-KDD Dataset ... 157

5.6.2 Performance of EBGWO on Subset NSL-KDD Dataset ... 159

5.6.3 Performance of EBGWO on Benchmark Datasets ... 174

5.7 Summary ... 179


6.2 Research Contribution ... 181

6.2.1 Knowledge Contribution ... 181

6.2.2 Practical Contribution ... 183

6.3 Limitation and Future Work ... 183




List of Tables

Table 2.1 Advantages and disadvantages of Filter, Wrapper, and Embedded approaches .... 31

Table 2.2 Swarm intelligence algorithms for FS ... 38

Table 2.3 Schematic description of literature on initial population initialisation ... 48

Table 2.4 Schematic description of literature on position update mechanism ... 56

Table 2.5 Schematic description of literature on parameter controlling ... 62

Table 2.6 GWO in different applications domain ... 63

Table 3.1 Distribution of attack types in the NSL-KDD ... 80

Table 3.2 The features of NSL-KDD dataset ... 81

Table 3.3 Main dataset details to be applied in the experiment ... 83

Table 3.4 Confusion matrices ... 87

Table 5.1 Heuristic initialisation parameters ... 114

Table 5.2 Average classification accuracy using full NSL-KDD dataset ... 115

Table 5.3 Average number of selected features using full NSL-KDD dataset ... 116

Table 5.4 Average classification accuracy using SVM classifier on NSL-KDD subset ... 117

Table 5.5 Average number of selected features using SVM classifier on NSL-KDD subset ... 118

Table 5.6 Performance rank on NSL-KDD subset with SVM classifier ... 119

Table 5.7 Average classification accuracy using KNN classifier on NSL-KDD subset ... 120

Table 5.8 Average number of selected features using KNN classifier on NSL-KDD subset ... 121

Table 5.9 Performance rank on NSL-KDD subset with KNN classifier... 122

Table 5.10 Average classification accuracy using DT classifier on NSL-KDD subset ... 124

Table 5.11 Average number of selected features utilising DT classifier on NSL-KDD subset ... 124

Table 5.12 Performance rank on NSL-KDD subset with DT classifier ... 125

Table 5.13 Best performance of heuristic MBGWO for the initial population mechanism ... 127

Table 5.14 Experimental parameters ... 128

Table 5.15 Average classification accuracy for full NSL-KDD dataset ... 129

Table 5.16 Average number of selected features selected using full NSL-KDD dataset .... 129

Table 5.17 Average classification accuracy using SVM classifier on NSL-KDD subset .... 131

Table 5.18 Number of selected features using SVM classifier on NSL-KDD subset ... 132

Table 5.19 Performance rank on NSL-KDD subset with SVM classifier... 132



Table 5.20 Average classification accuracy using KNN classifier on NSL-KDD subset .... 134

Table 5.21 Average number of selected features using KNN classifier on NSL-KDD subset ... 134

Table 5.22 Performance rank on NSL-KDD subset with KNN classifier... 135

Table 5.23 Average classification accuracy using DT classifier on NSL-KDD subset ... 137

Table 5.24 Average number of selected features using DT classifier on NSL-KDD subset ... 137

Table 5.25 Performance rank on NSL-KDD subset with DT classifier ... 138

Table 5.26 Best performance of RRMBGWO ... 140

Table 5.27 Experimental parameters... 141

Table 5.28 Average classification accuracy using full NSL-KDD dataset ... 142

Table 5.29 Average number of selected features selected using full NSL-KDD dataset .... 142

Table 5.30 Average classification accuracy using SVM classifier on NSL-KDD subset .... 143

Table 5.31 Average number of selected features using SVM classifier on NSL-KDD subset ... 144

Table 5.32 Performance rank on NSL-KDD subset with SVM classifier... 145

Table 5.33 Average classification accuracy using KNN classifier on NSL-KDD subset .... 146

Table 5.34 Average number of selected features using KNN classifier on NSL-KDD subset ... 147

Table 5.35 Performance rank on NSL-KDD subset with KNN classifier... 148

Table 5.36 Average classification accuracy using DT classifier on NSL-KDD subset ... 149

Table 5.37 Average number of selected features using DT classifier on NSL-KDD subset ... 150

Table 5.38 Performance rank on NSL-KDD subset with DT classifier ... 151

Table 5.39 Best performance of CMBGWO ... 156

Table 5.40 Results for full NSL-KDD dataset ... 158

Table 5.41 Average classification accuracy using SVM classifier on NSL-KDD subset .... 160

Table 5.42 Average number of selected features using SVM classifier on NSL-KDD subset ... 161

Table 5.43 Performance rank on NSL-KDD subset with SVM classifier... 162

Table 5.44 Average classification accuracy using KNN classifier for NSL-KDD subset ... 165

Table 5.45 Average number of selected features using KNN classifier for NSL-KDD subset ... 166

Table 5.46 Performance rank on NSL-KDD subset using KNN classifier ... 167

Table 5.47 Average classification accuracy using DT classifier for NSL-KDD subset ... 169

Table 5.48 Average number of selected features using DT classifier for NSL-KDD subset ... 170



Table 5.49 Performance rank on NSL-KDD subset with KNN classifier ... 171

Table 5.50 Best performance of EBGWO ... 173

Table 5.51 Test results of proposed EBGWO and benchmark algorithms based on average classification accuracy on different dataset sizes using KNN classifier ... 175

Table 5.52 Test results of proposed EBGWO and other algorithms based on average number of features selected on different dataset sizes using KNN classifier ... 176

Table 5.53 Performance rank on different datasets with KNN classifier ... 177



List of Figures

Figure 1.1. Studies on FS using the GWO ... 7

Figure 2.1 Anomaly detection nomenclature (Hayes & Capretz, 2014) ... 17

Figure 2.2. Anomaly types (Hayes & Capretz, 2014) ... 18

Figure 2.3 Anomaly detection types of definition ... 19

Figure 2.4. Knowledge discovery process (Shen, 2005) ... 25

Figure 2.5. General flowchart of FS ... 27

Figure 2.6. General approaches for feature selection ... 28

Figure 2.7. FS Methods (a) Filter; (b) Wrapper; and, (c) Embedded ... 31

Figure 2.8. Grey wolf social structure ... 41

Figure 2.9. Representation of the solution in FS (Mirjalili et al., 2019) ... 42

Figure 3.1. Research framework ... 68

Figure 3.2. Flowchart of MBGWO with proposed modifications ... 69

Figure 3.3. ACO for heuristics-based initial population ... 70

Figure 3.4. New position update mechanism ... 73

Figure 3.5. Proposed method for controlled parameter value ... 76

Figure 3.6 Attack types definition... 80

Figure 3.7. NSL-KDD Dataset Class Distribution ... 80

Figure 4.1. Algorithm of the heuristic initial population mechanism ... 90

Figure 4.2. Random initial population procedure for the MBGWO ... 91

Figure 4.3. ACO heuristic initialisation ... 93

Figure 4.4. Algorithm of the proposed position update mechanism ... 96

Figure 4.5. Position update mechanism of GWO ... 98

Figure 4.6. Proposed position update mechanism ... 99

Figure 4.7. Reposition operation for the MBGWO ... 100

Figure 4.8. Reinitialisation operation for the MBGWO ... 101

Figure 4.9. Parameter control ... 103

Figure 4.10. Controlling the exploration and exploitation parameter of the MBGWO ... 105

Figure 4.11. Flowcharts of the proposed EBGWO and MBGWO algorithms ... 106

Figure 4.12. EBGWO algorithm for feature selection ... 108

Figure 5.1. Experimental design framework ... 111



Figure 5.2. Population initialisation for the heuristic initialisation and random initialisation ... 116

Figure 5.3. Performance rank plot using SVM classifier on NSL-KDD subset ... 120

Figure 5.4. Performance rank for KNN classifier on NSL-KDD subset... 123

Figure 5.5. Performance rank plot using DT classifier on NSL-KDD subset ... 126

Figure 5.6. Solution diversity of the MBGWO and RRMBGWO on NSL-KDD dataset ... 130

Figure 5.7. Performance rank plot using SVM classifier on NSL-KDD subset ... 133

Figure 5.8. Performance rank plot using KNN classifier on NSL-KDD subset ... 136

Figure 5.9. Performance rank plot using DT classifier on NSL-KDD subset ... 139

Figure 5.10. Performance rank plot using SVM classifier on NSL-KDD subset ... 146

Figure 5.11. Performance rank plot using KNN classifier on NSL-KDD subset ... 149

Figure 5.12. Performance rank plot using DT classifier on NSL-KDD subset ... 152

Figure 5.13. Convergence and diversity of MBGWO and CMBGWO for Normal class. ... 153

Figure 5.14. Convergence and diversity of MBGWO and CMBGWO for DoS class. ... 154

Figure 5.15. Convergence and diversity of MBGWO and CMBGWO for Probe class. ... 154

Figure 5.16. Convergence and diversity of MBGWO and CMBGWO for U2R class. ... 155

Figure 5.17. Convergence and diversity of MBGWO and CMBGWO for R2L class. ... 155

Figure 5.18. Performance rank plot using SVM classifier on NSL-KDD subset ... 163

Figure 5.19. Performance rank plot using KNN classifier on NSL-KDD subset ... 168

Figure 5.20. Performance rank for DT classifier on NSL-KDD subset ... 172

Figure 5.21. Performance rank plot using KNN classifier on different datasets ... 178



List of Abbreviations

ABC Artificial Bee Colony

ACO Ant Colony Optimisation

bGWO Binary Grey Wolf Optimizer

BPSO Binary Particle Swarm Optimisation

BA Bat Algorithm

EPD Evolutionary Population Dynamics

EBGWO Enhanced Binary Grey Wolf Optimizer

FS Feature Selection

F-Score Fisher Score

GWO Grey Wolf Optimizer

GA Genetic Algorithm

HGGWA Hybrid Genetic Grey Wolf Algorithm

HMOGWO Hybrid Multi-Objective Grey Wolf Optimizer

IGWO Improved Grey Wolf Optimizer

KNN K-Nearest Neighbour

MBGWO Modified Binary Grey Wolf Optimizer

MGWO Modified Grey Wolf Optimisation

NEH Nawaz Enscore Ham

PSO Particle Swarm Optimisation

SVM Support Vector Machines

SA Simulated Annealing

WOA Whale Optimisation Algorithm

BDA Binary Dragonfly Algorithm

BHHO Binary Harris Hawk Optimization




Advancements in technology have resulted in the generation of a considerable amount of data, and with it an increased possibility of these data being tampered with. These data, which can be financial, environmental, social or medical, differ in importance and structure. It is, therefore, necessary to consider how to protect the data and the devices used to store them.

According to Alexander et al. (2014) and Namasudra et al. (2020), data security is an essential constituent of computer science concerned with securing information systems from scammers/attackers who intend to compromise the privacy, integrity, availability, and system services of the data. Such information systems can be protected through logical and physical controls. Physical controls, such as iris and fingerprint scanners, are applied to prevent scammers/attackers from accessing the electronic devices. The logical aspect consists of software defences that can be applied to safeguard the information system and data. Logical controls include anomaly detection, prevention systems, passwords, and access control.

Anomaly detection entails the identification of situations that come from a different distribution or class than the majority. Examples of this kind of task are fraud detection (e.g., identifying fraudulent transactions among the majority of valid credit card transactions) and intrusion detection.
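A trivial statistical baseline makes this definition concrete. The sketch below flags values that lie far from the bulk of the data using a z-score; this is only an illustration of the idea (the thesis itself performs anomaly detection with machine-learning classifiers over selected features), and the transaction amounts and threshold are made up.

```python
def zscore_anomalies(values, threshold=2.0):
    """Return indices of values lying more than `threshold` population
    standard deviations from the mean."""
    n = len(values)
    mean = sum(values) / n
    std = (sum((v - mean) ** 2 for v in values) / n) ** 0.5 or 1.0  # avoid /0
    return [i for i, v in enumerate(values) if abs(v - mean) / std > threshold]

# Toy credit-card amounts: the 55.0 transaction is the odd one out.
print(zscore_anomalies([10.1, 9.8, 10.0, 10.2, 9.9, 55.0, 10.05]))  # -> [5]
```

The threshold is set below the common value of 3 because, with a single extreme point among n samples, the population z-score cannot exceed sqrt(n - 1) (about 2.45 for n = 7).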




Abadeh, M. S., Habibi, J., & Lucas, C. (2007). Intrusion detection using a fuzzy genetics-based learning algorithm. Journal of Network and Computer Applications, 30(1), 414–428. https://doi.org/10.1016/j.jnca.2005.05.002

Agarwal, S. (2014). Data mining concepts and techniques. In Proceedings - 2013 International Conference on Machine Intelligence Research and Advancement, ICMIRA 2013 (Third). Morgan Kaufmann.


Agrawal, V., & Chandra, S. (2015). Feature selection using Artificial Bee Colony algorithm for medical image classification. Proc. of International Conf. on Contemporary Computing, 171–176. https://doi.org/10.1109/IC3.2015.7346674

Ahmed, S., Ait, T., Benyettou, A., & Ouali, M. (2017). Kernel-based learning and feature selection analysis for cancer diagnosis. Applied Soft Computing Journal, 51, 39–48. https://doi.org/10.1016/j.asoc.2016.12.010

Al-behadili, H. N. K., Ku-Mahamud, K. R., & Sagban, R. (2020a). Hybrid Ant Colony Optimization and Genetic Algorithm for Rule Induction. Journal of Computer Science, 16(7), 1019–1028.


Al-behadili, H. N. K., Ku-Mahamud, & Sagban, R. (2020b). Adaptive Parameter Control Strategy for Ant-Miner Classification Algorithm. Indonesian Journal of Electrical Engineering and Informatics (IJEEI), 8(1), 149–162.


Al-tashi, Q., Abdulkadir, S. J., Rais, H., & Mirjalili, S. (2019). Binary Optimization Using Hybrid Grey Wolf Optimization for Feature Selection. IEEE Access, 7, 1–9. https://doi.org/10.1109/ACCESS.2019.2906757

Alelyani, S. (2013). On Feature Selection Stability: A Data Perspective (Issue May). Arizona State University. https://search.proquest.com/docview/1350624904?pq-origsite=gscholar

Alexander, D., Finch, A., & Sutton, D. (2014). Information Security Management Principles (2nd Eds.). BCS Learning & Development Limited.

Alghuried, A. (2017). A Model for Anomalies Detection in Internet of Things (IoT) Using Inverse Weight Clustering and Decision Tree [Dublin Institute of Technology]. https://doi.org/10.21427/D7WK7S



Almasoudy, F. H., Al-yaseen, W. L., & Idrees, A. K. (2020). Differential Evolution Wrapper Feature Selection for Intrusion Detection System. Procedia Computer Science, 167(2019), 1230–1239. https://doi.org/10.1016/j.procs.2020.03.438

Alpaydin, E. (2020). Introduction to machine learning. The MIT Press.

Alzubi, Q. M., Anbar, M., Alqattan, Z. N. M., & Azmi, M. (2020). Intrusion detection system based on a modified binary grey wolf optimisation. Neural Computing and Applications, 1–13. https://doi.org/10.1007/s00521-019-04103-1

Amiri, F., Rezaei Yousefi, M., Lucas, C., Shakery, A., & Yazdani, N. (2011). Mutual information-based feature selection for intrusion detection systems. Journal of Network and Computer Applications, 34(4), 1184–1199.


André, J., & Sargo, G. (2013). Binary Fish School Search applied to Feature Selection. Tecnico Lisboa.

Andrew, N. (2000). CS229 Lecture notes Margins : Intuition. Intelligent Systems and Their Applications IEEE, pt.1(x), 1–25.


Arora, S., Singh, H., Sharma, M., Sharma, S., & Anand, P. (2019). A New Hybrid Algorithm Based on Grey Wolf Optimization and Crow Search Algorithm for Unconstrained Function Optimization and Feature Selection. IEEE Access, 7, 26343–26361. https://doi.org/10.1109/ACCESS.2019.2897325

Bache, K., & Lichman, M. (2013). UCI Machine Learning Repository. University of California, Irvine, School of Information and Computer Sciences.

Bartz-Beielstein, T., Branke, J., Mehnen, J., & Mersmann, O. (2014). Evolutionary Algorithms. Wiley Interdisciplinary Reviews, 4(3), 178–195.


Baştanlar, Y., & Ozuysal, M. (2014). Introduction to Machine Learning. Methods in Molecular Biology (Clifton, N.J.), 1107, 105–128. https://doi.org/10.1007/978-1-62703-748-8_7



Beg, A. H., & Islam, M. Z. (2016). Advantages and limitations of genetic algorithms for clustering records. Proceedings of the 2016 IEEE 11th Conference on Industrial Electronics and Applications, ICIEA 2016, 2478–2483.


Bengio, Y., Lodi, A., & Prouvost, A. (2020). Machine learning for combinatorial optimization: A methodological tour d'horizon. European Journal of Operational Research, xxxx, 1–17. https://doi.org/10.1016/j.ejor.2020.07.063

Bento, D., Pinho, D., Pereira, A. I., & Lima, R. (2013). Genetic algorithm and particle swarm optimization combined with Powell method. AIP Conference Proceedings, 1558, 578–581. https://doi.org/10.1063/1.4825557

Bergman, L., & Hoshen, Y. (2020). Classification-based anomaly detection for general data. ArXiv Preprint ArXiv:2005.02359.

Bhattacharyya, D. K., & Kalita, J. K. (2013). Network anomaly detection: A machine learning perspective. In Network Anomaly Detection: A Machine Learning Perspective (Issue November). Crc Press.

Biermann, E., Cloete, E., & Venter, L. M. (2001). A comparison of intrusion detection systems. Computers and Security, 20(8), 676–683.


Blum, C., & Li, X. (2008). Swarm Intelligence in Optimization. Swarm Intelligence, 43–85. https://doi.org/10.1007/978-3-540-74089-6_2

Boulle, M. (2004). Khiops: A Statistical Discretization Method of Continuous Attributes. Machine Learning, 55(1), 53–69.


Bouzary, H., Chen, F. F., & Shahin, M. (2019). Optimal composition of tasks in cloud manufacturing platform: A novel hybrid GWO-GA approach. Procedia Manufacturing, 34, 961–968. https://doi.org/10.1016/j.promfg.2019.06.103

Brent, R. P. (2013). Algorithms for minimization without derivatives. Courier Corporation.


Buczak, A. L., & Guven, E. (2016). A Survey of Data Mining and Machine Learning Methods for Cyber Security Intrusion Detection. IEEE Communications Surveys and Tutorials, 18(2), 1153–1176.




Cannady, J., & Harrell, J. (1996). A Comparative Analysis of Current Intrusion Detection Technologies. Pattern Recognition, 96, 212–218.


Carbonell, J. G., Michalski, R. S., & Mitchell, T. M. (1983). Machine Learning: A Historical and Methodological Analysis. AI Magazine, 4(3), 69.


Casas, P., Mazel, J., & Owezarski, P. (2012). Unsupervised Network Intrusion Detection Systems: Detecting the Unknown without Knowledge. Computer Communications, 35(7), 772–783.


Castillo, O., Neyoy, H., Soria, J., García, M., & Valdez, F. (2013). Dynamic fuzzy logic parameter tuning for ACO and its application in the fuzzy logic control of an autonomous mobile robot. International Journal of Advanced Robotic Systems, 10(May 2014). https://doi.org/10.5772/54833

Chandra, M., Agrawal, A., Kishor, A., & Niyogi, R. (2016). Web Service Selection with Global Constraints using Modified Gray Wolf Optimizer. Proc. of International Conf. on Advances in Computing, Communications and Informatics, 1989–1994.

Chang, P. S. (1997). Microencapsulation and Oxidative Stability of Docosahexaenoic Acid. ACS Symposium Series, 674, 264–273.


Chang, R. S., Chang, J. S., & Lin, P. S. (2009). An ant algorithm for balanced job scheduling in grids. Future Generation Computer Systems, 25(1), 20–27.


Chengyi, T. (2014). Gravitational Search Algorithm Based on Simulated Annealing. Journal of Convergence Information Technology (JCIT), 9(March), 231–237.

Chuang, L. Y., Yang, C. H., & Li, J. C. (2011). Chaotic maps based on binary particle swarm optimization for feature selection. Applied Soft Computing, 11(1), 239–248. https://doi.org/10.1016/j.asoc.2009.11.014

Crepinsek, M., Liu, S. H., & Mernik, M. (2013). Exploration and exploitation in evolutionary algorithms: A survey. ACM Computing Surveys, 45(3), 1–33.




Dash, M., & Liu, H. (2003). Consistency-based search in feature selection. Artificial Intelligence, 151(1–2), 155–176. https://doi.org/10.1016/S0004-


Demsar, J. (2006). Statistical Comparisons of Classifiers over Multiple Data Sets. Journal of Machine Learning Research, 7, 1–30.

Deng, G., Xu, Z., & Gu, X. (2012). A discrete artificial bee colony algorithm for minimizing the total flow time in the blocking flow shop scheduling. Chinese Journal of Chemical Engineering, 20(6), 1067–1073.


Diego, S. (1984). Classification Algorithms and Regression Trees. Classification and Regression Trees, 246–280.

Djellali, H., & Guessoum, S. (2017). Fast Correlation based Filter combined with Genetic Algorithm and Particle Swarm on Feature Selection.

Donaldson, R. W. (1967). Approximate Formulas for the Information Transmitted by a Discrete Communication Channel. IEEE Transactions on Information Theory, 13(1), 118–119. https://doi.org/10.1109/TIT.1967.1053945

Dorigo, M., Birattari, M., & Stützle, T. (2006). Ant Colony Optimization: Artificial Ants as a Computational Intelligence Technique. IEEE Computational Intelligence Magazine, 1(4), 28–39.


Dorigo, M., Gambardella, L. M., Birattari, M., Martinoli, A., Poli, R., & Stützle, T. (2006). Ant Colony Optimization and Swarm Intelligence: 5th International Workshop, ANTS 2006, Brussels, Belgium, September 4-7, 2006, Proceedings (Vol. 4150). Springer.

Dowlatshahi, M. B., Derhami, V., & Nezamabadi-pour, H. (2017). Ensemble of Filter-Based Rankers to Guide an Epsilon-Greedy Swarm Optimizer for High- Dimensional Feature Subset Selection. Information.


Dreyer, S. (2013). Evolutionary Feature Selection. (Master’s Thesis, Institutt for Datateknikk Og Informasjonsvitenskap)., November, 78.

Du, K. L., & Swamy, M. N. S. (2006). Neural Networks in a Soft Computing Framework. Springer Science & Business Media.



Dua, D., & Karra, T. (2017). UCI Machine Learning Repository. Irvine, CA: University of California, School of Information and Computer Science.

Duda, R. O., Hart, P. E., & Stork, D. G. (2012). Pattern classification. John Wiley & Sons.


Dudani, K., & Chudasama, A. R. (2016). Partial discharge detection in transformer using adaptive grey wolf optimizer based acoustic emission technique. Cogent Engineering, 10(1). https://doi.org/10.1080/23311916.2016.1256083

Eid, H. F., Salama, M. A., Hassanien, A. E., & Kim, T. H. (2011). Bi-layer behavioral-based feature selection approach for network intrusion classification. Proc. of International Conf. on Privacy, Security and Trust on Security Technology, 259, 195–203. https://doi.org/10.1007/978-3-642-27189-2_21

Ektefa, M., Memar, S., Sidi, F., & Affendey, L. S. (2010). Intrusion detection using data mining techniques. Proceedings - 2010 International Conference on Information Retrieval and Knowledge Management: Exploring the Invisible World, CAMP'10, 200–203. https://doi.org/10.1109/INFRKM.2010.5466919

El-Hasnony, I. M., Barakat, S. I., Elhoseny, M., & Mostafa, R. R. (2020). Improved Feature Selection Model for Big Data Analytics. IEEE Access, 8, 66989–67004.


El-Kenawy, E. S. M., Eid, M. M., Saber, M., & Ibrahim, A. (2020). MbGWO-SFS: Modified Binary Grey Wolf Optimizer Based on Stochastic Fractal Search for Feature Selection. IEEE Access, 8, 107635–107649.


Emary, E. (2015). Binary Gray Wolf Optimization Approaches for Feature Selection. Neurocomputing, 172(8), 371–381.


Emary, E., Zawbaa, H. M., & Grosan, C. (2017). Experienced Gray Wolf Optimization Through Reinforcement Learning and Neural Networks. IEEE Transactions on Neural Networks and Learning Systems, 29(3), 681–694.


Emary, E., Zawbaa, H. M., & Hassanien, A. E. (2016). Binary ant lion approaches for feature selection. Neurocomputing, 213, 54–65.




Enache, A. C., Sgarciu, V., & Petrescu-Nita, A. (2015). Intelligent feature selection method rooted in Binary Bat Algorithm for intrusion detection. SACI 2015 - 10th Jubilee IEEE International Symposium on Applied Computational Intelligence and Informatics, Proceedings, 517–521.


Engen, V., Vincent, J., & Phalp, K. (2011). Exploring discrepancies in findings obtained with the KDD Cup '99 data set. Intelligent Data Analysis, 15(2), 251–276. https://doi.org/10.3233/IDA-2010-0466

Falaghi, H., & Haghifam, M. R. (2007). ACO based algorithm for distributed generation sources allocation and sizing in distribution systems. 2007 IEEE Lausanne POWERTECH, Proceedings, 555–560.


Fan, J., & Li, R. (2006). Statistical Challenges with High Dimensionality: Feature Selection in Knowledge Discovery. 1–27. https://doi.org/10.4171/022-3/31

Fan, W., Bouguila, N., & Ziou, D. (2011). Unsupervised anomaly intrusion detection via localized Bayesian feature selection. Proc. of International Conf. on Data Mining, 1032–1037. https://doi.org/10.1109/ICDM.2011.152

Faris, H., Aljarah, I., Al-betar, M. A., & Mirjalili, S. (2017). Grey wolf optimizer: A review of recent variants and applications. Neural Computing and Applications. https://doi.org/10.1007/s00521-017-3272-5

Forsati, R., Moayedikia, A., Jensen, R., Shamsfard, M., & Meybodi, M. R. (2014). Enriched ant colony optimization and its application in feature selection. Neurocomputing, 142, 354–371.

Fowler, B. (2000). A sociological analysis of the satanic verses affair. Theory, Culture and Society, 17(1), 39–61. https://doi.org/10.1177/02632760022050997

Gao, W. F., Liu, S. Y., & Jiang, F. (2011). An improved artificial bee colony algorithm for directing orbits of chaotic systems. Applied Mathematics and Computation, 218(7), 3868–3879. https://doi.org/10.1016/j.amc.2011.09.034

García-Teodoro, P., Díaz-Verdejo, J., Maciá-Fernández, G., & Vázquez, E. (2009). Anomaly-based network intrusion detection: Techniques, systems and challenges. Computers and Security, 28(1–2), 18–28.




Ghosh, A., & Nath, B. (2004). Multi-objective rule mining using genetic algorithms. Information Sciences, 163(1–3), 123–133.


Glover, F., & Laguna, M. (1997). Tabu search principles. Springer, 70–150.

Gou, J., Ma, H., Ou, W., Zeng, S., Rao, Y., & Yang, H. (2019). A generalized mean distance-based k-nearest neighbor classifier. Expert Systems with Applications, 115, 356–372. https://doi.org/10.1016/j.eswa.2018.08.021

Grosan, C., & Abraham, A. (2011). Machine Learning. In Intelligent Systems Reference Library (Vol. 17). https://doi.org/10.1007/978-3-642-21004-4_10

Gu, Q., Li, X., & Jiang, S. (2019). Hybrid Genetic Grey Wolf Algorithm for Large-Scale Global Optimization. Complexity, 1–18.

Gu, Q., Li, Z., & Han, J. (2011). Generalized fisher score for feature selection. Proceedings of the 27th Conference on Uncertainty in Artificial Intelligence, UAI 2011, 266–273.

Guangdong, H., Ping, L., & Qun, W. (2007). A Hybrid Metaheuristic ACO-GA with an Application in Sports Competition Scheduling. Eighth ACIS International Conference on Software Engineering, Artificial Intelligence, Networking, and Parallel/Distributed Computing, 1(3), 288–293.


Gupta, P., Jain, S., & Jain, A. (2014). A Review of Fast Clustering-Based Feature Subset Selection Algorithm. International Journal of Scientific & Technology Research, 3(11), 86–91. http://www.ijstr.org/final-print/nov2014/A-Review-Of-Fast-Clustering-based-Feature-Subset-Selection-Algorithm.pdf

Gurav, A., Nair, V., Gupta, U., & Valadi, J. (2015). Glowworm swarm based informative attribute selection using support vector machines for simultaneous feature selection and classification. In International Conference on Swarm, Evolutionary, and Memetic Computing, 3, 27–37. https://doi.org/10.1007/978-3-319-20294-5

Guyon, I., & Elisseeff, A. (2003). An Introduction to Variable and Feature Selection. Journal of Machine Learning Research (JMLR), 3(3), 1157–1182.


Hall, M. A., & Holmes, G. (2003). Benchmarking Attribute Selection Techniques for Discrete Class Data Mining. IEEE Transactions on Knowledge and Data Engineering, 15(6), 1437–1447. https://doi.org/10.1109/TKDE.2003.1245283

Hartmanis, J. (1982). Computers and intractability: A guide to the theory of NP-completeness (Michael R. Garey and David S. Johnson). SIAM Review, 24(1), 90.

Hasani. (2014). Hybrid Feature Selection Algorithm for Intrusion Detection System. Journal of Computer Science, 10(6), 1015–1025.


Hassanat, A. B., Prasath, V. B. S., Abbadi, M. A., Abu-qdari, S. A., & Faris, H. (2018). An Improved Genetic Algorithm with a New Initialization Mechanism Based on Regression Techniques. Information, 9(7), 167.


Hassanien, A. E., & Emary, E. (2018). Swarm intelligence: principles, advances, and applications. CRC Press.

Hastie, T., Tibshirani, R., & Friedman, J. (2009). The elements of statistical learning: Data mining, inference, and prediction. Springer Science & Business Media.

Hayes, M., & Capretz, M. A. (2014). Contextual Anomaly Detection Framework for Big Sensor Data. Journal of Big Data, 2(1), 1–22.

He, Q., Hu, X., Ren, H., & Zhang, H. (2015). A novel artificial fish swarm algorithm for solving large-scale reliability-redundancy application problem. ISA Transactions, 59, 105–113. https://doi.org/10.1016/j.isatra.2015.09.015

Hegazy, A. E., Makhlouf, M. A., & El-tawel, G. S. (2018). Improved salp swarm algorithm for feature selection. Journal of King Saud University - Computer and Information Sciences. https://doi.org/10.1016/j.jksuci.2018.06.003

Honest, N. (2020). A survey on Feature Selection Techniques. Computers & Electrical Engineering.

Hossain, M. R., Than, A. M., & Ali, A. B. M. S. (2013). The Effectiveness of Feature Selection Method in Solar Power Prediction. Journal of Renewable Energy, 2013(2), 1–9. https://doi.org/10.1155/2013/952613

Hu, B., Dai, Y., Su, Y., Moore, P., Zhang, X., Mao, C., & Chen, J. (2016). Feature Selection for Optimized High-dimensional Biomedical Data using an Improved Shuffled Frog Leaping Algorithm. 1–10.


Hu, P., Chen, S., Huang, H., Zhang, G., & Liu, L. (2019). Improved Alpha-Guided Grey Wolf Optimizer. IEEE Access, 7, 5421–5437.


Huang, Y., & Zhao, L. (2018). Review on Landslide Susceptibility Mapping Using Support Vector Machines. CATENA, 165, 520–529.


Hussain, F., Hussain, R., Hassan, S. A., & Hossain, E. (2020). Machine Learning in IoT Security: Current Solutions and Future Challenges. 1–38.


Imanguliyev, A. (2013). Enhancements for the Bees Algorithm. Cardiff University.

Jayabarathi, T., Raghunathan, T., Adarsh, B. R., & Nagaratnam, P. (2016). Economic dispatch using hybrid grey wolf optimizer. Energy, 111, 630–641.


Jiang, L., & Wu, J. (2013). Hybrid PSO and GA for neural network evolutionary in monthly rainfall forecasting. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 7802 LNAI(PART 1), 79–88. https://doi.org/10.1007/978-3-642-36546-1_9

Kabir, M. M., Shahjahan, M., & Murase, K. (2012). A new hybrid ant colony optimization algorithm for feature selection. Expert Systems with Applications, 39(3), 3747–3763.

Kamboj, V. K. (2015). A novel hybrid PSO–GWO approach for unit commitment problem. Neural Computing and Applications.


Kamikawa, Y., & Kato, T. (2006). Development of liquid-crystalline folate derivatives: Effects of intermolecular hydrogen bonds at oligopeptide moieties. Polymer Preprints, Japan, 55(2), 2659–2660.



Kamis, N. N., Embong, A. H., & Ahmad, S. (2019). Optimizing PD-type Fuzzy Logic Controller for Position Control of Spherical Robot. 2019 7th International Conference on Mechatronics Engineering, ICOM 2019, 1–6.


Karabulut, E. M., Özel, S. A., & İbrikçi, T. (2012). A comparative study on the effect of feature selection on classification accuracy. Procedia Technology, 1, 323–327. https://doi.org/10.1016/j.protcy.2012.02.068

Kashef, S., & Nezamabadi-pour, H. (2014). An Advanced ACO Algorithm for Feature subset Selection. Neurocomputing.


Kaveh, A., & Nasrollahi, A. (2014). Charged system search and particle swarm optimization hybridized for optimal design of engineering structures. Scientia Iranica A, 21(2), 295–305.

Kayacik, H. G., Zincir-Heywood, A. N., & Heywood, M. I. (2005). Selecting features for intrusion detection: A feature relevance analysis on KDD 99 intrusion detection datasets. Proc. of International Conf. on Privacy, Security and Trust, 1723–1730.

Kemmerer, R. A., & Vigna, G. (2002). Intrusion detection: A brief history and overview. Computer, 35(SUPPL.), 27–30.


Khuat, T. T., & Le, M. H. (2018). A Novel Hybrid ABC-PSO Algorithm for Effort Estimation of Software Projects Using Agile Methodologies. Journal of Intelligent Systems, 27(3), 489–506. https://doi.org/10.1515/jisys-2016-0294

Kohavi, R., & John, G. H. (1997). Wrappers for feature subset selection. Artificial Intelligence, 97, 273–324. https://doi.org/10.1016/S0004-3702(97)00043-X

Kohli, M., & Arora, S. (2017). Chaotic grey wolf optimization algorithm for constrained optimization problems. Journal of Computational Design and Engineering. https://doi.org/10.1016/j.jcde.2017.02.005

Komaki, G. M., & Kayvanfar, V. (2015). Grey Wolf Optimizer Algorithm for the Two-stage Assembly Flowshop Scheduling Problem with Release Time. Journal of Computational Science. https://doi.org/10.1016/j.jocs.2015.03.011

Kudo, M., & Sklansky, J. (2000). Comparison of algorithms that select features for pattern classifiers. Pattern Recognition, 33(1), 25–41.


Kumar, V. (2005). Parallel and distributed computing for cybersecurity. IEEE Distributed Systems Online, 6(10), 1–9. https://doi.org/10.1109/MDSO.2005.53

Kumar, V., & Minz, S. (2014). Feature Selection: A Literature Review. SmartCR, 4(3). https://doi.org/10.6029/smartcr.2014.03.007

Kunhare, N., Tiwari, R., & Dhar, J. (2020). Particle swarm optimization and feature selection for intrusion detection system. Sadhana, 45(109), 1–14.


Lazarevic, A., Ertoz, L., Kumar, V., Ozgur, A., & Srivastava, J. (2003). A Comparative Study of Anomaly Detection Schemes in Network Intrusion Detection. In Proceedings of the 2003 SIAM International Conference on Data Mining, 25–36. https://doi.org/10.1137/1.9781611972733.3

Lee, S., Soak, S., Oh, S., Pedrycz, W., & Jeon, M. (2008). Modified binary particle swarm optimization. Progress in Natural Science, 18(9), 1161–1166.


Li, H., Qi, F., & Wang, S. (2005). A comparison of model selection methods for multi-class support vector machines. Lecture Notes in Computer Science, 3483(IV), 1140–1148. https://doi.org/10.1007/11424925_119

Li, M., Du, W., & Nian, F. (2014). An adaptive particle swarm optimization algorithm based on directed weighted complex network. Mathematical Problems in Engineering, 2014. https://doi.org/10.1155/2014/434972

Li, Qian, Liu, S. Y., & Yang, X. S. (2020). Influence of initialization on the performance of metaheuristic optimizers. Applied Soft Computing Journal, 91, 1–39. https://doi.org/10.1016/j.asoc.2020.106193

Li, Qiang, Chen, H., Huang, H., Zhao, X., Cai, Z., Tong, C., Liu, W., & Tian, X. (2017). An Enhanced Grey Wolf Optimization Based Machine for Medical Diagnosis. Computational and Mathematical Methods in Medicine, 2017.

Li, Y., Xia, J., Zhang, S., Yan, J., Ai, X., & Dai, K. (2012). An efficient intrusion detection system based on support vector machines and gradually feature removal method. Expert Systems with Applications, 39(1), 424–430.




Liang, Z., Sun, J., Lin, Q., Du, Z., Chen, J., & Ming, Z. (2016). A novel multiple rule sets data classification algorithm based on ant colony algorithm. Applied Soft Computing Journal, 38, 1000–1011.


Lin, S., Ying, K., Chen, S., & Lee, Z. (2008). Particle swarm optimization for parameter determination and feature selection of support vector machines. Expert Systems with Applications, 35, 1817–1824.


Liu, H., & Motoda, H. (2012). Feature Selection for Knowledge Discovery and Data Mining. Springer Science & Business Media. https://doi.org/10.1007/978-1-4615-5689-3

Liu, H., & Yu, L. (2005). Toward integrating feature selection algorithms for classification and clustering. IEEE Transactions on Knowledge and Data Engineering, 17(4), 491–502. https://doi.org/10.1109/TKDE.2005.66

Liu, Y., Lu, H., Cheng, S., & Shi, Y. (2019). An Adaptive Online Parameter Control Algorithm for Particle Swarm Optimization Based on Reinforcement Learning. 2019 IEEE Congress on Evolutionary Computation, CEC 2019 - Proceedings, 815–822. https://doi.org/10.1109/CEC.2019.8790035

Long, W., & Xu, S. (2016). A Novel Grey Wolf Optimizer for Global Optimization Problems. 2016 IEEE Advanced Information Management, Communicates, Electronic and Automation Control Conference (IMCEC), 1266–1270.

Long, W. (2016). Grey Wolf Optimizer based on Nonlinear Adjustment Control Parameter. In International Conference on Sensors, Mechatronics and Automation, 136(Icsma). https://doi.org/10.2991/icsma-16.2016.111

Long, W., Jiao, J., Liang, X., & Tang, M. (2018). Inspired grey wolf optimizer for solving large-scale function optimization problems. Applied Mathematical Modelling. https://doi.org/10.1016/j.apm.2018.03.005

Long, W., Liang, X., Cai, S., Jiao, J., & Zhang, W. (2016). A modified augmented Lagrangian with improved grey wolf optimization to constrained optimization problems. Neural Computing and Applications. https://doi.org/10.1007/s00521-016-2357-x

López-Ibáñez, M., Stützle, T., & Dorigo, M. (2018). Ant Colony Optimization: A Component-Wise Overview. Handbook of Heuristics.

Lu, C., Gao, L., Li, X., & Xiao, S. (2017). A hybrid multi-objective grey wolf optimizer for dynamic scheduling in a real-world welding industry. Engineering Applications of Artificial Intelligence, 57, 61–79.


Lu, C., Xiao, S., Li, X., & Gao, L. (2016). An effective multi-objective discrete grey wolf optimizer for a real-world scheduling problem in welding production. Advances in Engineering Software, 99, 161–176. https://doi.org/10.1016/j.advengsoft.2016.06.004

Luo, K., & Zhao, Q. (2019). A binary grey wolf optimizer for the multidimensional knapsack problem. Applied Soft Computing Journal, 83, 105645.


Ma, H., & Simon, D. (2017). Evolutionary computation with biogeography-based optimization. Wiley Online Library.

Mafarja, M., Aljarah, I., Heidari, A. A., Hammouri, A. I., Faris, H., Al-Zoubi, A. M., & Mirjalili, S. (2018). Evolutionary Population Dynamics and Grasshopper Optimization approaches for feature selection problems. Knowledge-Based Systems, 145, 25–45. https://doi.org/10.1016/j.knosys.2017.12.037

Mahmuddin, M., & Yusof, Y. (2009). A Hybrid Simplex Search and Bio-Inspired Algorithm for Faster Convergence. 2009 International Conference on Machine Learning and Computing, 3, 203–207. http://www.ipcsit.net/vol3/037X242.pdf

Malik, M. R. S., & Ali, L. (2014). Weighted distance Grey wolf optimizer for global optimization problems. In 2015 IEEE International Conference on Computational Intelligence and Computing Research (ICCIC), 1–6.

Mandli, I., & Mahesh, P. (2014). Selection of Most Relevant Features from High Dimensional Data using IG-GA Hybrid Approach. International Journal of Computer Science and Mobile Computing, 3(2), 827–830.

Matta, Z., & Lee, W. (2014). The digital universe of opportunities: rich data and the increasing value of the internet of things. 2(3), 1–9.




Mirjalili, S., Faris, H., & Aljarah, I. (2019). Evolutionary Machine Learning Techniques. Springer Singapore.

Mirjalili, S., Mirjalili, S. M., & Lewis, A. (2014). Grey Wolf Optimizer. Advances in Engineering Software, 69, 46–61.


Mirjalili, S., Mirjalili, S. M., & Yang, X. S. (2014). Binary bat algorithm. Neural Computing and Applications, 25(3–4), 663–681.


Mittal, N., & Singh, U. (2015). Distance-Based Residual Energy-Efficient Stable Election Protocol for WSNs. Arabian Journal for Science and Engineering, 40(6), 1637–1646. https://doi.org/10.1007/s13369-015-1641-x

Mittal, N., Singh, U., & Sohi, B. S. (2016). Modified Grey Wolf Optimizer for Global Engineering Optimization. Applied Computational Intelligence and Soft Computing, 2016, 1–16. https://doi.org/10.1155/2016/7950348

Mohamed, A. B., Idris, N. B., & Shanmugum, B. (2012). A brief introduction to intrusion detection system. Communications in Computer and Information Science, 330 CCIS, 263–271. https://doi.org/10.1007/978-3-642-35197-6_29

Mohammad Reisi-Nafchi, G. M. (2018). A New Competitive Binary Grey Wolf Optimizer to Solve the Feature Selection Problem in EMG.


Molinaro, A. M., Simon, R., & Pfeiffer, R. M. (2005). Prediction error estimation: A comparison of resampling methods. Bioinformatics, 21(15), 3301–3307.


Moradi, P., & Rostami, M. (2015). Integration of graph clustering with ant colony optimization for feature selection. Knowledge-Based Systems, 84, 144–161.


Motoda, H., & Liu, H. (2002). Feature selection, extraction and construction. In Communication of IICM (Vol. 5, pp. 67–72). Taiwan.

Muangkote, N., Sunat, K., & Chiewchanwattana, S. (2014). An Improved Grey Wolf Optimizer for Training q -Gaussian Radial Basis Functional-link Nets. In 2014 International Computer Science and Engineering Conference (ICSEC), 209–




Nakamura, R. Y. M., Pereira, L. A. M., Costa, K. A., Rodrigues, D., Papa, J. P., & Yang, X. S. (2012). BBA: A binary bat algorithm for feature selection. Brazilian Symposium of Computer Graphic and Image Processing, 291–297.


Namasudra, S., Devi, D., Kadry, S., Sundarasekar, R., & Shanthini, A. (2020). Towards DNA based data security in the cloud computing environment. Computer Communications, 151, 539–547.


Nilsson, N. J. (1996). Artificial intelligence: A modern approach. In Artificial Intelligence (Vol. 82, Issues 1–2). https://doi.org/10.1016/0004-3702(96)00007- 0

Noto, K., Brodley, C., & Slonim, D. (2012). FRaC: A feature-modeling approach for semi-supervised and unsupervised anomaly detection. Data Mining and Knowledge Discovery, 25(1), 109–133. https://doi.org/10.1007/s10618-011-0234-x

Onut, I. V., & Ghorbani, A. A. (2007). A feature classification scheme for network intrusion detection. International Journal of Network Security, 5(1), 1–15.


Ostrowski, D. A. (2014). Feature selection for twitter classification. Proc. of International Conf. on Semantic Computing, 97, 267–272.


Otero, F. E. B., Freitas, A. A., & Johnson, C. G. (2013). A new sequential covering strategy for inducing classification rules with ant colony algorithms. IEEE Transactions on Evolutionary Computation, 17(1), 64–76.


Packianather, M. S., & Kapoor, B. (2015). A wrapper-based feature selection approach using Bees Algorithm for a wood defect classification system. 10th System of Systems Engineering Conference, SoSE 2015, 498–503.


Pham, D. T., Mahmuddin, M., & Otri, S. (2007). Application of the Bees Algorithm to the selection of features for manufacturing data. Manufacturing Engineering.

Prabha, K., & Sudha, S. (2016). A Survey on IPS Methods and Techniques. International Journal of Computer Science Issues, 13(2), 38–43.


Prasad, M. V. S., Kumar, S., & Maneesha. (2014). Feature Selection Using an Effective Dimensionality Reduction Technique. International Journal of Computer Science and Mobile Computing, 3(5), 480–485.

Quatrini, E., Costantino, F., Gravio, G. Di, & Patriarca, R. (2020). Machine learning for anomaly detection and process phase classification to improve safety and maintenance activities. Journal of Manufacturing Systems, 56(November 2019), 117–132. https://doi.org/10.1016/j.jmsy.2020.05.013

Raudys, S. J., & Jain, A. K. (1991). Small sample size effects in statistical pattern recognition: recommendations for practitioners. IEEE Transactions on Pattern Analysis and Machine Intelligence, 13(3), 252–264.


Reiss, T., Cohen, N., Bergman, L., & Hoshen, Y. (2020). PANDA: Adapting Pretrained Features for Anomaly Detection. arXiv preprint arXiv:2010.05903.

Rodríguez, L., Castillo, O., & Soria, J. (2016). Grey Wolf Optimizer with dynamic adaptation of parameters using fuzzy logic. In 2016 IEEE Congress on Evolutionary Computation (CEC), 3116–3123.

Rodríguez, L., Castillo, O., Soria, J., Melin, P., Valdez, F., Gonzalez, C. I., Martinez, G. E., & Soto, J. (2017). A Fuzzy Hierarchical Operator in the Grey Wolf Optimizer Algorithm. Applied Soft Computing Journal.


Sagban, R. (2016). Reactive Approach for Automating Exploration and Exploitation in Ant Colony Optimization. Universiti Utara Malaysia, Malaysia.

Sahoo, A., & Chandra, S. (2017). Multi-objective Grey Wolf Optimizer for improved cervix lesion classification. Applied Soft Computing Journal, 52, 64–80. https://doi.org/10.1016/j.asoc.2016.12.022

Sakthivel, S., Pandiyan, S. A., Marikani, S., & Selvi, S. K. (2013). Application of Big Bang Big Crunch Algorithm for Optimal Power Flow Problems. The International Journal Of Engineering And Science (IJES), 2(4), 41–47.

Salama, K., & Freitas, A. (2012). ABC-Miner: An ant-based Bayesian classification algorithm. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 7461 LNCS, 13–24. https://doi.org/10.1007/978-3-642-32650-9_2

Saremi, S., & Zahra, S. (2014). Evolutionary population dynamics and grey wolf optimizer. Neural Computing and Applications, 30.


Saroj, & Jyoti. (2014). Multi-objective genetic algorithm approach to feature subset optimization. Souvenir of the 2014 IEEE International Advance Computing Conference, IACC 2014, 544–548.


Schiezaro, M., & Pedrini, H. (2013). Data feature selection based on Artificial Bee Colony algorithm. EURASIP Journal on Image and Video Processing, 1–8.

Sharawi, M., Zawbaa, H. M., & Emary, E. (2017). Feature Selection Approach Based on Whale Optimization Algorithm. In 2017 Ninth International Conference on Advanced Computational Intelligence (ICACI).

Shen, Q. (2005). Combining rough and fuzzy sets for feature selection. Proceedings of the 2005 UK Workshop on Computational Intelligence, UKCI 2005, 12–13.

Shi, Y., Tian, Y., Kou, G., Peng, Y., & Li, J. (2011). Network intrusion detection. Advanced Information and Knowledge Processing, 237–241.


Snaselova, P., & Zboril, F. (2015). Genetic algorithm using theory of chaos. Procedia Computer Science, 51(1), 316–325.


Song, Q., Ni, J., & Wang, G. (2011). A Fast Clustering-Based Feature Subset Selection Algorithm for High Dimensional Data. IEEE Transactions on Knowledge and Data Engineering, 1–14. https://doi.org/10.1109/TKDE.2011.181

Song, J. (2016). Feature Selection for Intrusion Detection System (Doctoral dissertation, Aberystwyth University).

Srivastava, S., Joshi, N., & Gaur, M. (2014). A Review Paper on Feature Selection Methodologies and Their Applications. International Journal of Computer Science and Network Security (IJCSNS), 14(5), 78.



Sudana, M., Nalluri, R., Saisujana, Reddy, H., & Swaminathan, V. (2017). An Efficient Feature Selection using Artificial Fish Swarm Optimization and SVM Classifier. Proc. of International Conf. on Networks & Advances in Computational Technologies, 7, 407–411.

Sun, Y., Todorovic, S., & Goodison, S. (2010). Local-learning-based feature selection for high-dimensional data analysis. IEEE Transactions on Pattern Analysis and Machine Intelligence, 32(9), 1–18.

Sundaram, A. (2006). An Introduction to Intrusion Detection. Computer Security, 1(1), 10.

Syed, M. A., & Syed, R. (2019). Weighted Salp Swarm Algorithm and its applications towards optimal sensor deployment. Journal of King Saud University - Computer and Information Sciences.


Tajbakhsh, A., Rahmati, M., & Mirzaei, A. (2009). Intrusion detection using fuzzy association rules. Applied Soft Computing Journal, 9(2), 462–469.


Talbi, E. G. (2009). Metaheuristics: From design to implementation. John Wiley & Sons.


Tasgetiren, M. F., Pan, Q. K., Kizilay, D., & Gao, K. (2016). A Variable Block Insertion Heuristic for the Blocking Flowshop Scheduling Problem with Total Flowtime Criterion. Algorithms, 9(4). https://doi.org/10.3390/a9040071

Tavallaee, M., Bagheri, E., Lu, W., & Ghorbani, A. A. (2009). A Detailed Analysis of the KDD CUP 99 Data Set. Proc. of International Conf. on Computational Intelligence for Security and Defense Applications, 1–6.

Teng, Z. jun, Lv, J. ling, & Guo, L. wen. (2018). An improved hybrid grey wolf optimization algorithm. Soft Computing. https://doi.org/10.1007/s00500-018- 3310-y

Teng, Z. jun, Lv, J. ling, & Guo, L. wen. (2019). An improved hybrid grey wolf optimization algorithm. Soft Computing, 23(15), 6617–6631.


Tian, D. (2015). Particle swarm optimization with chaotic maps and Gaussian mutation for function optimization. International Journal of Grid and Distributed Computing, 8(4), 123–134.


Tiwari, R. (2010). Correlation-based Attribute Selection using Genetic Algorithm. International Journal of Computer Applications, 4(8), 28–34.


Too, J., Abdullah, A. R., & Saad, N. M. (2019). A new quadratic binary harris hawk optimization for feature selection. Electronics (Switzerland), 8(10), 1–27.


Too, J., Abdullah, A. R., Saad, N. M., Ali, N. M., & Tee, W. (2018). A New Competitive Binary Grey Wolf Optimizer to Solve the Feature Selection Problem in EMG. https://doi.org/10.3390/computers7040058

Varma, S. C., Murthy, K. S. L., & Srichandan, K. (2013). Gaussian Particle swarm optimization for combined economic emission dispatch. 2013 International Conference on Energy Efficient Technologies for Sustainability, ICEETS 2013, 1, 1336–1340. https://doi.org/10.1109/ICEETS.2013.6533581

Vigna, G., & Kruegel, C. (2006). Host-based Intrusion Detection.

Wang, X., Yang, J., Teng, X., Xia, W., & Jensen, R. (2007). Feature selection based on rough sets and particle swarm optimization. Pattern Recognition Letters, 28(4), 459–471. https://doi.org/10.1016/j.patrec.2006.09.003

Wei, Y., Ni, N., Liu, D., Chen, H., Wang, M., Li, Q., Cui, X., & Ye, H. (2017). An improved grey wolf optimization strategy enhanced SVM and its application in predicting the second major. Mathematical Problems in Engineering, 2017.

Weston, J., & Watkins, C. (1999). Support Vector Machines for Multi-Class Pattern Recognition. Proceedings of the 7th European Symposium on Artificial Neural Networks (ESANN-99), April, 219–224.

Wu, T., Liu, Y., Tang, W., Li, X., & Yu, Y. (2017). Constraint genetic algorithm and its application in sintering proportioning. IOP Conference Series: Materials Science and Engineering, 231(1). https://doi.org/10.1088/1757-


Xia, X. (2012). Particle Swarm Optimization Method Based on Chaotic Local Search and Roulette Wheel Mechanism. Physics Procedia, 24, 269–275.




