In this paper, the botnet detection problem is formulated as a feature selection problem, and a genetic algorithm (GA) is used to search the entire feature space for the most significant combination of features. A Decision Tree (DT) classifier serves as the objective function, guiding the proposed GA toward feature combinations that correctly classify activities as normal traffic or botnet attacks. Two datasets, UNSW-NB15 and the Canadian Institute for Cybersecurity Intrusion Detection System 2017 (CICIDS2017), are used for evaluation. The results reveal that the proposed DT-aware GA can effectively find the relevant features from the whole feature set, and thus achieves efficient botnet detection in terms of F-score, precision, detection rate, and number of selected features when compared with DT alone.
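The GA-based feature selection described above can be illustrated with a minimal, self-contained sketch. Here the classifier fitness is replaced by a toy stand-in (a hypothetical set of "relevant" features plus a size penalty); in the paper itself, fitness would be the Decision Tree's classification accuracy on the candidate feature subset. All names and parameters below are illustrative assumptions, not the authors' implementation.

```python
import random

random.seed(0)

N_FEATURES = 10
# Hypothetical ground-truth relevant features used only by the toy fitness.
RELEVANT = {1, 3, 7}

def fitness(mask):
    # Stand-in for DT accuracy: reward covering relevant features and
    # penalise subset size (the paper also minimises the feature count).
    chosen = {i for i, bit in enumerate(mask) if bit}
    return len(chosen & RELEVANT) - 0.1 * len(chosen)

def crossover(a, b):
    # Single-point crossover of two binary feature masks.
    cut = random.randrange(1, N_FEATURES)
    return a[:cut] + b[cut:]

def mutate(mask, rate=0.1):
    # Flip each bit with a small probability.
    return [1 - bit if random.random() < rate else bit for bit in mask]

def ga(pop_size=20, generations=40):
    pop = [[random.randint(0, 1) for _ in range(N_FEATURES)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]          # keep the best half
        pop = elite + [mutate(crossover(random.choice(elite),
                                        random.choice(elite)))
                       for _ in range(pop_size - len(elite))]
    return max(pop, key=fitness)

best = ga()
best_subset = {i for i, bit in enumerate(best) if bit}
```

Swapping the toy `fitness` for a cross-validated DT accuracy on the columns selected by `mask` yields the wrapper-style selection the abstract describes.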
In this paper, the construction of Hermite wavelet functions and their operational matrix of integration is presented. The Hermite wavelet method is applied to solve nth-order Volterra integro-differential equations (VIDEs) by expanding the unknown functions as series of Hermite wavelets with unknown coefficients. Finally, two examples are given.
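The expansion step can be sketched in the usual wavelet-method notation (the symbols below are the standard ones for such methods, not taken from this abstract): the unknown function is written as a truncated double series over the wavelet basis, and integration is replaced by multiplication with the operational matrix.

```latex
u(t) \approx \sum_{n=1}^{2^{k-1}} \sum_{m=0}^{M-1} c_{nm}\,\psi_{nm}(t)
      = C^{T}\,\Psi(t),
\qquad
\int_{0}^{t} \Psi(s)\,ds \approx P\,\Psi(t),
```

where \(C\) collects the unknown coefficients \(c_{nm}\), \(\Psi(t)\) stacks the Hermite wavelets \(\psi_{nm}(t)\), and \(P\) is the operational matrix of integration. Substituting these into the VIDE and collocating reduces the equation to an algebraic system in \(C\).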
Emergency vehicle (EV) services save lives around the world. The fast response EVs require means their travel time must be minimised, and preempting traffic signals can enable EVs to reach the desired location quickly. However, most current research tries to decrease EV delay while neglecting the negative impact of preemption on vehicles on the side roads. This paper proposes a dynamic preemption algorithm that controls the traffic signal by adjusting some cycles to balance two critical goals: minimal, non-stop delay for EVs and only a small additional delay for vehicles on the side roads. This method is applicable to preempt traffic lights for EVs through an Intelli
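The balancing idea described above can be sketched as a toy controller: when an EV comes within a threshold distance, the signal holds or switches to green on the EV's approach and records the time taken from the side road so it can be repaid in a later cycle. The function name, threshold, and compensation value are illustrative assumptions, not the paper's algorithm.

```python
def preempt(ev_distance_m, current_phase, threshold_m=300):
    """Return (phase_to_serve, side_road_compensation_s).

    Toy sketch: inside the threshold, serve the EV's approach
    ("main_green") and, if the side road loses its green, owe it a
    fixed compensation (10 s here, purely illustrative) for a later
    cycle. Outside the threshold, leave the signal plan untouched.
    """
    if ev_distance_m <= threshold_m:
        compensation = 0 if current_phase == "main_green" else 10
        return "main_green", compensation
    return current_phase, 0

phase, owed = preempt(ev_distance_m=150, current_phase="side_green")
```

A real dynamic algorithm would compute the compensation from the actual green time lost and current side-road queues rather than a constant.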
Data-Driven Requirements Engineering (DDRE) represents a vision for a shift from static, traditional methods of doing requirements engineering to dynamic, data-driven, user-centred methods. Given the data now available and the increasingly complex requirements of software systems, whose functions must adapt to changing needs to gain users' trust, an approach embedded in a continuous software engineering process is needed. This need drives new challenges in the requirements engineering discipline. The problem in this study was that discrepancies in the data hampered the elicitation process, so that the developed software ultimately contained discrepancies and could not meet the need
The purpose of this study was to determine the reality of motivational administrative methods used by academic decision-makers in the faculties of physical education and sports sciences in Baghdad, from the perspective of faculty members. Given the nature of the problem, the two researchers used the descriptive survey approach. They defined their research community as all faculty members in the faculties of physical education and sports sciences in Baghdad (Al-Mustansiriya University, Al-Jadriya University & Al-Waziriyah University), numbering 314 faculty members, and sampled 90% of it, so the research sam
The use of credit cards for online purchases has increased significantly in recent years, but so have fraudulent activities, which cost businesses and consumers billions of dollars annually. Detecting fraudulent transactions is crucial for protecting customers and maintaining the financial system's integrity. However, fraudulent transactions are far fewer than legitimate ones, producing a data imbalance that degrades classification performance and biases model evaluation. This paper addresses imbalanced data by proposing a new weighted oversampling method, wADASMO, to generate minority-class data (i.e., fraudulent transactions). The proposed method is based on th
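Since the abstract is truncated before wADASMO's details, the sketch below shows only the generic SMOTE-style interpolation that such weighted oversampling methods build on: each synthetic minority sample is placed on the line segment between a minority point and one of its k nearest minority neighbours. The function name and parameters are illustrative, not the paper's method.

```python
import random

random.seed(1)

def smote_like(minority, n_new, k=3):
    """Generate n_new synthetic minority samples by interpolating
    each randomly chosen seed point toward one of its k nearest
    minority-class neighbours."""
    synthetic = []
    for _ in range(n_new):
        p = random.choice(minority)
        # k nearest neighbours within the minority class, excluding p
        neighbours = sorted(
            (q for q in minority if q is not p),
            key=lambda q: sum((a - b) ** 2 for a, b in zip(p, q)),
        )[:k]
        q = random.choice(neighbours)
        gap = random.random()  # position along the p -> q segment
        synthetic.append(tuple(a + gap * (b - a) for a, b in zip(p, q)))
    return synthetic

minority = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
new_points = smote_like(minority, n_new=6)
```

A *weighted* variant (as the method's name suggests) would bias the choice of seed point `p`, e.g. toward harder-to-learn regions as ADASYN does, rather than sampling uniformly.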
Researchers employ behaviour-based malware detection models that rely on tracking and analysing API features to identify suspicious PE applications. Such behaviour models detect unknown malware more effectively than signature-based systems, because even simple polymorphic or metamorphic malware can easily defeat signature-based detection. The growing number of computer malware samples and their detection have concerned security researchers for a long time. The use of logic formulae to model malware behaviour is one of the most encouraging recent developments in malware research, providing an alternative to classic virus detection methods. To address the l
In this paper, a new brain tumour detection method is presented in which normal slices are separated from abnormal ones. Three main phases are involved: extraction of the cerebral tissue; detection of abnormal blocks, together with a fine-tuning mechanism; and detection of abnormal slices according to the detected abnormal blocks. Experimental tests assess and verify the progress made by the suggested method. In terms of qualitative assessment, the performance of the proposed method is satisfactory and may contribute to the development of reliable MRI brain tumour diagnosis and treatment.
Many carbonate reservoirs around the world show a tilted original oil-water contact (OOWC), which requires special consideration in selecting capillary pressure curves and an understanding of reservoir fluid distribution when initializing reservoir simulation models.
An analytical model for predicting the capillary pressure across the interface that separates two immiscible fluids was derived from reservoir pressure transient analysis. The model reflected the entire interaction between the reservoir-aquifer fluids and rock properties measured under downhole reservoir conditions.
This model retained the natural coupling of oil reservoirs with the aquifer zone and treated them as an explicit-region composite system.