BotDetectorFW: an optimized botnet detection framework based on five features-distance measures supported by comparisons of four machine learning classifiers using CICIDS2017 dataset

A botnet is one of many attacks that can execute malicious tasks and that develops continuously. This work therefore introduces a comparison framework, called BotDetectorFW, with classification and complexity improvements for detecting botnet attacks using the CICIDS2017 dataset, a free online dataset consisting of several attacks with high-dimensional features. Feature selection is a significant step toward obtaining the fewest features by eliminating irrelevant ones, and consequently reducing detection time. BotDetectorFW implements this process in two steps: data clustering and five distance-measure formulas (cosine, Dice, Driver & Kroeber, overlap, and Pearson correlation), implemented in C#, followed by selecting the best N features as input to four classifier algorithms evaluated in the machine-learning toolkit WEKA: MultilayerPerceptron, JRip, IBk, and random forest. In BotDetectorFW, the thoughtful and diligent cleaning of the dataset in the preprocessing stage, together with normalization, binary clustering of its features, feature selection adapted to suitable feature-distance techniques, and testing of the selected classification algorithms, all contributed to satisfying high performance metrics with fewer features (as few as 8), outperforming other methods in the literature that use the same dataset and require 10 features or more. Furthermore, the results and performance evaluation of BotDetectorFW show a competitive impact in terms of classification accuracy (ACC), precision (Pr), recall (Rc), and F-measure (F1).
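The feature-selection idea described above — score each binarized feature column against the class label with a distance or similarity measure and keep the best N — can be sketched in a few lines. This is a minimal Python illustration of three of the five measures named in the abstract (cosine, Dice, and Pearson correlation), not the paper's C# implementation; the function names and toy data are invented for illustration.

```python
import math

def cosine(u, v):
    # Cosine similarity between two numeric vectors.
    num = sum(a * b for a, b in zip(u, v))
    den = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return num / den if den else 0.0

def dice(u, v):
    # Dice coefficient, intended for binary (0/1) vectors.
    num = 2 * sum(a * b for a, b in zip(u, v))
    den = sum(u) + sum(v)
    return num / den if den else 0.0

def pearson(u, v):
    # Pearson correlation coefficient.
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    cov = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    su = math.sqrt(sum((a - mu) ** 2 for a in u))
    sv = math.sqrt(sum((b - mv) ** 2 for b in v))
    return cov / (su * sv) if su and sv else 0.0

def top_n_features(features, label, n, measure=cosine):
    # Score each binarized feature column against the class label
    # and keep the N highest-scoring features.
    scores = {name: measure(col, label) for name, col in features.items()}
    return sorted(scores, key=scores.get, reverse=True)[:n]
```

The selected feature names would then be used to project the dataset before handing it to the classifiers.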

Scopus Crossref
Publication Date
Fri Apr 14 2023
Journal Name
Journal Of Big Data
A survey on deep learning tools dealing with data scarcity: definitions, challenges, solutions, tips, and applications
Data scarcity is a major challenge when training deep learning (DL) models. DL demands a large amount of data to achieve exceptional performance; unfortunately, many applications have too little data to train DL frameworks. Manual labeling is usually needed to provide labeled data, typically involving human annotators with an extensive background of knowledge. This annotation process is costly, time-consuming, and error-prone. Usually, every DL framework is fed a significant amount of labeled data to automatically learn representations; ultimately, more data generally yields a better DL model, and performance is also application-dependent. This issue is the main barrier for …
Scopus (628)
Crossref (628)
Scopus Clarivate Crossref
Publication Date
Fri Nov 01 2024
Journal Name
Process Safety And Environmental Protection
Optimized ensemble deep random vector functional link with nature inspired algorithm and boruta feature selection: Multi-site intelligent model for air quality index forecasting

Scopus (8)
Crossref (9)
Scopus Clarivate Crossref
Publication Date
Wed Mar 10 2021
Journal Name
Baghdad Science Journal
Image data compression by using multiwavelete for color image

Many images require large storage space. Even with the continued evolution of computer storage technology, there is a pressing need to reduce the storage space required for images by compressing them in a good way; the multiwavelet transform method …
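The compression idea behind wavelet-based methods — transform the signal, discard small detail coefficients, and reconstruct — can be sketched with a single-level Haar transform. This is only an illustration of the general wavelet-compression principle, not the paper's multiwavelet scheme; the threshold value and function names are invented.

```python
def haar_1d(signal):
    # One level of the Haar wavelet transform: pairwise averages + details.
    avg = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    det = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return avg, det

def compress(signal, threshold):
    # Zero out small detail coefficients -- the lossy step.
    avg, det = haar_1d(signal)
    det = [d if abs(d) >= threshold else 0.0 for d in det]
    return avg, det

def reconstruct(avg, det):
    # Invert the single-level Haar transform.
    out = []
    for a, d in zip(avg, det):
        out += [a + d, a - d]
    return out
```

With a zero threshold the reconstruction is exact; raising the threshold trades fidelity for fewer nonzero coefficients to store.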

Publication Date
Sun Oct 31 2021
Journal Name
Periodicals Of Engineering And Natural Sciences (pen)
Image segmentation by using thresholding technique in two stages

Image segmentation can be defined as the process of cutting or partitioning a digital image into many useful regions, called segments, whose pixels share certain attributes that differ from the pixels constituting other parts. The researcher followed two stages of image processing in this paper. First, the images were pre-processed before segmentation using the statistical confidence intervals for estimating unknown observations proposed by Acho & Buenestado in 2018. The second stage then performs the image segmentation using Bernsen's thresholding technique. The researcher drew the conclusion that, in the case of utilizing …

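Bernsen's technique, named in the abstract above, thresholds each pixel against the mid-range of its local neighbourhood, falling back to a fixed class in low-contrast regions. The following is a minimal Python sketch of the standard algorithm; the window size, contrast limit, and low-contrast class are illustrative defaults, not values from the paper.

```python
def bernsen_threshold(img, w=3, cmin=15, low_contrast_class=0):
    # img: 2-D list of grayscale values; w: odd window size;
    # cmin: minimum local contrast to trust the local threshold.
    h, wd = len(img), len(img[0])
    r = w // 2
    out = [[0] * wd for _ in range(h)]
    for y in range(h):
        for x in range(wd):
            win = [img[j][i]
                   for j in range(max(0, y - r), min(h, y + r + 1))
                   for i in range(max(0, x - r), min(wd, x + r + 1))]
            zmin, zmax = min(win), max(win)
            if zmax - zmin < cmin:
                # Low-contrast neighbourhood: assign a fixed class.
                out[y][x] = low_contrast_class
            else:
                t = (zmin + zmax) / 2  # local mid-range threshold
                out[y][x] = 1 if img[y][x] > t else 0
    return out
```

On a toy image with a bright right column, the bright pixels are labelled 1 and the flat dark region falls back to the low-contrast class.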
Crossref
Publication Date
Tue Dec 31 2019
Journal Name
Al-qadisiyah Journal For Engineering Sciences
Improving Gypseous Soil Properties by Using Non-Traditional Additives

Gypseous soils are common in several regions of the world, including Iraq, where more than 28.6% of the surface is covered with this type of soil. Soil with high gypsum content causes various problems for construction and strategic projects. As a result of water flow through the soil mass, the permeability and chemical composition of these soils vary with time due to the solubility and leaching of gypsum. In this study, soil with 36% gypsum content was taken from one location about 100 km southwest of Baghdad; the samples were taken from depths of 0.5–1 m below the natural ground surface and mixed with 3%, 6%, and 9% of copolymer and Novolac polymer to improve the engineering properties, including collapsibility, perm…

Crossref
Publication Date
Tue Oct 01 2019
Journal Name
Journal Of Engineering
Carwash Wastewater Treatment by Electrocoagulation Using Aluminum Foil Electrodes

Large quantities of contaminated carwash wastewater are produced every day at carwash facilities. It typically contains large quantities of chemicals from detergents, oil, grease, heavy metals, suspended solids, various hydrocarbons, and biological matter. A novel electrocoagulation treatment using foil electrodes was conducted to remove COD, turbidity, and total dissolved solids (TDS) from contaminated carwash wastewater and to decrease its electrical conductivity (EC). A thin layer of aluminum foil is used as the electrode in this treatment process. The effects of different voltages and treatment times were studied. The best result was found at a voltage of 30 volts and a treatment time of 90 minutes, where the removal efficiency of COD …

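The removal efficiency reported for COD, turbidity, and TDS in studies like the one above is conventionally the percentage drop from the initial to the final concentration. A one-line sketch, with purely illustrative numbers (not measurements from the paper):

```python
def removal_efficiency(c_initial, c_final):
    # Percentage removal of a pollutant (e.g. COD, turbidity, or TDS):
    # 100 * (C0 - C) / C0, where C0 is the influent concentration.
    return (c_initial - c_final) / c_initial * 100
```

For example, an influent COD of 200 mg/L reduced to 20 mg/L corresponds to 90% removal.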
Crossref (8)
Crossref
Publication Date
Sun Feb 03 2019
Journal Name
Iraqi Journal Of Physics
Secure communications by chaotic carrier signal using Lorenz model

In this paper, the generation of a chaotic carrier by the Lorenz model is theoretically studied. The encoding technique used is chaos masking of a sinusoidal signal (the message), and an optical chaotic communication system is evaluated for different receiver configurations. It is shown that chaotic carriers allow the successful encoding and decoding of messages, focusing on the effect of changing the initial conditions of the states of the dynamical system, i.e. changing the values (x, y, z, x1, y1, and z1).
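The chaos-masking scheme described above — a small-amplitude sinusoidal message added to a chaotic carrier generated by the Lorenz equations — can be sketched as follows. The Euler integrator, step size, masking scale, and the classical parameter values (σ = 10, ρ = 28, β = 8/3) are assumptions for illustration, not the paper's optical setup; in a real system the receiver recovers the carrier by chaos synchronization rather than by sharing it directly.

```python
import math

def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    # One Euler step of the Lorenz system.
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dx * dt, y + dy * dt, z + dz * dt

def chaos_mask(message, x0=1.0, y0=1.0, z0=1.0, scale=0.01):
    # Add a small-amplitude message to the chaotic x(t) carrier.
    x, y, z = x0, y0, z0
    carrier, masked = [], []
    for m in message:
        x, y, z = lorenz_step(x, y, z)
        carrier.append(x)
        masked.append(x + scale * m)
    return carrier, masked

# A sinusoidal message, as in the paper's chaos-masking experiment.
message = [math.sin(2 * math.pi * 5 * t / 1000) for t in range(1000)]
carrier, masked = chaos_mask(message)
```

A receiver that reproduces the same carrier (here, trivially, the transmitter's copy) recovers the message by subtracting the carrier and rescaling.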

Crossref (2)
Crossref
Publication Date
Wed Oct 06 2021
Journal Name
Periodicals Of Engineering And Natural Sciences (pen)
Image segmentation by using thresholding technique in two stages

Scopus (2)
Scopus Crossref
Publication Date
Sat Apr 23 2016
Journal Name
Iraqi Journal Of Agricultural Sciences
MEASURING COW FARMS EFFICIENCY BY USING THE QUALITY RESPONSE

Despite the expanding use of dummy variables as explanatory variables, their use as dependent variables is still limited, which may be due to the problems that arise when dummy variables serve as dependent variables. The study aimed to use qualitative response models to measure the efficiency of cow farms, based on a random sample of 19 farms from the Abi Gherak district. The study estimated the transcendental logarithmic production function using stochastic frontier analysis (SFA) to interpret the relation between the return achieved by the cow farms, as the dependent variable, and labor and capital as independent variables. The function indicates that increasing labor by 100% will …

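The transcendental logarithmic (translog) production function mentioned above has a standard two-input form: the log of output is a quadratic in the logs of labor and capital. A minimal sketch of that form and its labor elasticity follows; the coefficient names are generic, not estimates from the study.

```python
import math

def translog_ln_output(labor, capital, b0, bl, bk, bll, bkk, blk):
    # Translog production function:
    # ln Y = b0 + bl*lnL + bk*lnK
    #        + 0.5*bll*(lnL)^2 + 0.5*bkk*(lnK)^2 + blk*lnL*lnK
    ll, lk = math.log(labor), math.log(capital)
    return (b0 + bl * ll + bk * lk
            + 0.5 * bll * ll ** 2 + 0.5 * bkk * lk ** 2
            + blk * ll * lk)

def labor_elasticity(labor, capital, bl, bll, blk):
    # d lnY / d lnL: percentage change in output for a 1% change in labor,
    # which (unlike Cobb-Douglas) varies with the input levels.
    return bl + bll * math.log(labor) + blk * math.log(capital)
```

A statement like "increasing labor by 100% increases return by X%" corresponds to evaluating this elasticity at the sample's input levels.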
Crossref (1)
Crossref
Publication Date
Fri Oct 30 2020
Journal Name
Journal Of Economics And Administrative Sciences
Estimate The Survival Function By Using The Genetic Algorithm

Survival analysis is the analysis of data in the form of times from a time origin until the occurrence of an end event. In medical research, the time origin is the date of registration of the individual or patient in a study, such as a clinical trial comparing two or more types of medicine, where the endpoint is the death of the patient or the individual's withdrawal. The data resulting from this process are called survival times. If the end event is not death, the resulting data are called time-to-event data. That is, survival analysis comprises the statistical steps and procedures for analyzing data when the variable of interest is the time to an event. It could be d…

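The survival function that the study estimates (by a genetic algorithm) is, in the standard nonparametric setting, the Kaplan–Meier product-limit estimator. The sketch below shows that classical estimator for context, not the paper's genetic-algorithm approach; the toy data are invented.

```python
def kaplan_meier(times, events):
    # times: observed times; events: 1 = event (e.g. death), 0 = censored.
    # Returns (time, S(t)) pairs at each distinct event time.
    data = sorted(zip(times, events))
    n = len(data)
    s = 1.0
    curve = []
    i = 0
    while i < n:
        t = data[i][0]
        # d: events at time t; at_risk: subjects still under observation.
        d = sum(1 for tt, e in data if tt == t and e == 1)
        at_risk = sum(1 for tt, _ in data if tt >= t)
        if d > 0:
            s *= 1 - d / at_risk  # product-limit update
            curve.append((t, s))
        i += sum(1 for tt, _ in data if tt == t)  # skip ties
    return curve
```

Censored observations (event = 0) reduce the risk set without dropping the survival curve, which is what distinguishes survival analysis from ordinary proportions.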
Crossref (1)
Crossref