With the proliferation of Internet access and data traffic, recent breaches have brought into sharp focus the need for Network Intrusion Detection Systems (NIDS) to protect networks from increasingly complex cyberattacks. To differentiate between normal network activity and possible attacks, Intrusion Detection Systems (IDS) often employ pattern recognition and data mining techniques. An IDS can automatically detect and classify network and host intrusions, attacks, and policy violations. Using Python's Scikit-Learn library, the results of this study show that Machine Learning (ML) techniques such as Decision Tree (DT), Naïve Bayes (NB), and K-Nearest Neighbor (KNN) can enhance the effectiveness of an IDS. Performance is measured by a variety of metrics, including accuracy, precision, recall, F1-score, and execution time. Applying feature selection approaches such as Analysis of Variance (ANOVA), Mutual Information (MI), and Chi-Square (Chi-2) reduced execution time, increased detection efficiency and accuracy, and boosted overall performance. All classifiers achieved their greatest performance, 99.99% accuracy with the shortest computation time of 0.0089 seconds, when using ANOVA with 10% of the features.
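The pipeline described above can be sketched with Scikit-Learn. Everything below is illustrative: the synthetic dataset stands in for the study's (unnamed) intrusion-detection data, and the default classifier settings are assumptions, not the study's configuration.

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

# Synthetic stand-in for the intrusion-detection dataset
X, y = make_classification(n_samples=2000, n_features=40,
                           n_informative=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

k = max(1, int(0.10 * X.shape[1]))   # keep 10% of the features, as in the study
results = {}
for name, clf in [("DT", DecisionTreeClassifier(random_state=0)),
                  ("NB", GaussianNB()),
                  ("KNN", KNeighborsClassifier())]:
    # ANOVA F-test feature selection feeding each classifier
    model = make_pipeline(SelectKBest(f_classif, k=k), clf)
    model.fit(X_tr, y_tr)
    results[name] = accuracy_score(y_te, model.predict(X_te))
    print(name, round(results[name], 4))
```

SelectKBest with f_classif implements the ANOVA F-test selection; swapping in mutual_info_classif or chi2 reproduces the other two selectors mentioned above.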
The main objective of the central bank is to achieve price stability and its inflation target. The bank has therefore sought to use modern tools and policies to reduce the negative effects of the accumulation of foreign reserves, represented by monetary sterilization, as in developed and developing countries alike, but with the different tools that are feasible and imposed by the local financial and monetary environment, such as the window for buying and selling foreign currency, open market operations, and the existing deposit and lending facilities. Any increase in the monetary base resulting from the accumulation of foreign reserves will directly affect price stability due to the consumer nature of the
Digital tampering identification, which detects picture modification, is a significant area of image-analysis research. Over the last five years, this area has advanced to exceptional precision using machine learning and deep learning-based strategies. Synthesis and reinforcement-based learning techniques must now evolve to keep pace with the research. However, before doing any experimentation, a scientist must first comprehend the current state of the art in that domain. Diverse paths, associated outcomes, and analysis lay the groundwork for successful experimentation and superior results. Before starting with experiments, universal image forensics approaches must be thoroughly researched. As a result, this review of variou
This paper deals with the modeling of a preventive maintenance strategy applied to a single-unit system subject to random failures. Under this policy, the system is subjected to imperfect periodic preventive maintenance, restoring it to a state 'as good as new' with probability p and leaving it in a state 'as bad as old' with probability q. Imperfect repairs are performed following failures occurring between consecutive preventive maintenance actions; i.e., the times between failures follow a decreasing quasi-renewal process with parameter a. Considering the average durations of the preventive and corrective maintenance actions a
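A minimal Monte Carlo sketch can make the ingredients of this policy concrete. The exponential initial lifetime and every numeric value below are assumptions for illustration; the abstract does not specify them.

```python
import random

def simulate(T_pm, p, a, mean_ttf, n_periods, rng):
    """Count failures over n_periods PM intervals of length T_pm.
    The n-th lifetime since the unit was last 'as good as new' is
    a**n times a draw from the initial lifetime distribution
    (exponential with mean mean_ttf, an illustrative assumption),
    so with a < 1 lifetimes shrink: a decreasing quasi-renewal
    process. Each PM renews the unit with probability p and leaves
    it 'as bad as old' with probability q = 1 - p."""
    failures, n_since_new = 0, 0
    for _ in range(n_periods):
        t = 0.0
        # the cap guards the rare sample path where the geometric
        # series of shrinking lifetimes converges below T_pm
        while t < T_pm and n_since_new < 500:
            t += (a ** n_since_new) * rng.expovariate(1.0 / mean_ttf)
            if t < T_pm:                 # failure inside the interval
                failures += 1
                n_since_new += 1         # imperfect repair: unit keeps aging
        if rng.random() < p:
            n_since_new = 0              # PM succeeded: 'as good as new'
    return failures

rng = random.Random(42)
print(simulate(T_pm=10.0, p=0.7, a=0.9, mean_ttf=8.0, n_periods=100, rng=rng))
```

Repeating the run over many seeds and comparing the failure count against the average PM and repair durations would give the cost trade-off the paper models analytically.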
Five Saccharomyces cerevisiae isolates were studied for their ability to produce chitinase. Quantitative screening showed that Saccharomyces cerevisiae S4 was the highest chitinase producer, with a specific activity of 1.9 unit/mg protein. The yeast was cultured in liquid and solid-state fermentation (SSF) media. Different plant substrates were used for SSF together with the chitin, while the liquid media contained chitin with different nitrogen sources. The favorable conditions for chitinase production were incubation at 30 ºC, pH 6, and 1% colloidal chitin.
In this study, we focused on random coefficient estimation in the general regression and Swamy models of panel data. This type of data offers a better chance of obtaining better methods and indicators. Entropy methods have been used to estimate the random coefficients of the general regression and Swamy panel-data models in two ways: the first is the maximum dual entropy and the second is the general maximum entropy; a comparison between them has been made using simulation to choose the optimal method.
The results have been compared using mean squared error and mean absolute percentage error for different cases in terms of correlation valu
Background: Prostatic adenocarcinoma is the most widely recognized malignancy in men and the second cause of cancer-related mortality in male patients after lung cancer.
Aim of the study: To assess the diagnostic value of diffusion weighted imaging (DWI) and its quantitative measurement, apparent diffusion coefficient (ADC), in the identification and localization of prostatic cancer compared with T2 weighted image sequence (T2WI).
Type of the study: A prospective analytic study.
Patients and methods: Forty-one male patients with suspected prostatic cancer were examined by pelvic MRI at the MRI department of the Oncology Teaching Hospital/Medical City in Baghdad.
CuO nanoparticles were synthesized in two different ways: first by a precipitation method using copper acetate monohydrate Cu(CH3COO)2·H2O, glacial acetic acid (CH3COOH), and sodium hydroxide (NaOH); and second by a sol-gel method using copper chloride (CuCl2), sodium hydroxide (NaOH), and ethanol (C2H6O). Scanning electron microscopy (SEM) showed that different CuO nanostructures (spherical and reef-like) can be formed using the precipitation and sol-gel processes, respectively, with a particle size of less than 2 µm. X-ray diffraction (XRD) showed that the pure synthesized powder has no inclusions that may have arisen during preparation. XRD result
The current study was designed to investigate the presence of aflatoxin M1 in 25 samples of pasteurized canned milk collected randomly from some Iraqi local markets using the ELISA technique. Aflatoxin M1 was present in 21 samples, at concentrations ranging from 0.25 to 50 ppb. UV radiation (365 nm wavelength) was used for detoxification of aflatoxin M1: the sample with the highest concentration (50 ppb) was treated in two different volumes (25 and 50 ml), for two different times (15 and 30 min), with distances of 30, 60, and 90 cm between the lamp and the milk layer. Results showed that the distance between the lamp and the milk layer was the most effective parameter in the reduction of aflatoxin M1, and whenever the distance increased, the
The acceptance sampling plans for the generalized exponential distribution, when the life-time experiment is truncated at a pre-determined time, are provided in this article. The two parameters (α, λ) (shape and scale parameters) are estimated by LSE and WLSE, and the best estimators for various sample sizes are used to find the ratio of the true mean life to the pre-determined time, and to find the smallest possible sample size required to ensure the producer's risk, with a pre-fixed probability (1 − P*). The results of the estimations and of the sampling plans are provided in tables.
Key words: Generalized Exponential Distribution, Acceptance Sampling Plan, Consumer's and Producer's Risks
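For a truncated life test of this kind, the smallest sample size can be found numerically. The sketch below assumes a standard single-sampling plan with acceptance number c and the generalized exponential CDF F(t) = (1 − e^(−λt))^α; the lot is accepted when at most c failures occur by the truncation time t0, and all numeric inputs are illustrative, not values from the article's tables.

```python
from math import comb, exp

def ge_cdf(t, alpha, lam):
    """CDF of the generalized exponential distribution: (1 - e^(-lam*t))**alpha."""
    return (1.0 - exp(-lam * t)) ** alpha

def min_sample_size(t0, alpha, lam, c, p_star, n_max=10_000):
    """Smallest n such that the chance of observing at most c failures
    among n items tested up to the truncation time t0 is at most
    1 - p_star, when each lifetime follows GE(alpha, lam)."""
    p = ge_cdf(t0, alpha, lam)            # probability an item fails by t0
    for n in range(c + 1, n_max + 1):
        accept_prob = sum(comb(n, i) * p**i * (1.0 - p)**(n - i)
                          for i in range(c + 1))
        if accept_prob <= 1.0 - p_star:
            return n
    return None

# Illustrative plan: truncation time 0.5, GE(2, 2), accept up to 2 failures
print(min_sample_size(t0=0.5, alpha=2.0, lam=2.0, c=2, p_star=0.95))  # -> 14
```

The acceptance probability is the binomial tail with failure probability p = F(t0); increasing P* or decreasing c raises the required sample size.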
The rise of edge-cloud continuum computing is a result of the growing significance of edge computing, which has become a complement to, or substitute for, traditional cloud services. The convergence of networking and computing presents a notable challenge due to their distinct historical development. Task scheduling is a major challenge in the context of edge-cloud continuum computing: selecting where tasks execute is crucial to meeting the quality-of-service (QoS) requirements of applications. An efficient scheduling strategy for distributing workloads among virtual machines in the edge-cloud continuum data center is mandatory to ensure the fulfilment of QoS requirements for both customer and service provider. E
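As a toy illustration of the scheduling problem described above, the sketch below assigns tasks to virtual machines greedily by earliest finish time (an LPT-style heuristic). Task lengths and VM speeds are invented; a real edge-cloud scheduler would also weigh network latency, energy, and other QoS constraints.

```python
def greedy_schedule(task_lengths, vm_speeds):
    """Place the longest task first, each on the VM that would finish
    it earliest (greedy earliest-finish-time / LPT heuristic).
    Returns the task -> VM assignment and the resulting makespan."""
    finish = [0.0] * len(vm_speeds)      # current finish time of each VM
    assignment = {}
    for task in sorted(range(len(task_lengths)),
                       key=lambda i: task_lengths[i], reverse=True):
        # VM minimising this task's completion time (length / speed)
        best = min(range(len(vm_speeds)),
                   key=lambda v: finish[v] + task_lengths[task] / vm_speeds[v])
        finish[best] += task_lengths[task] / vm_speeds[best]
        assignment[task] = best
    return assignment, max(finish)

# Five tasks, one slow edge VM (speed 1.0) and one fast cloud VM (speed 2.0)
assignment, makespan = greedy_schedule([4, 8, 2, 6, 5], [1.0, 2.0])
print(makespan)  # -> 8.5
```

The makespan (latest VM finish time) is the quantity such heuristics try to minimise; QoS-aware schedulers extend the per-task cost with latency and deadline terms.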