Anomaly detection remains a difficult task. To address this problem, we propose to strengthen the DBSCAN algorithm by converting all data into a concept frame graph (CFG) representation. As is well known, DBSCAN groups data points of the same kind into clusters, while points that fall outside any cluster are treated as noise or anomalies. In this way, DBSCAN can detect abnormal points that lie beyond a chosen distance threshold (extreme values). However, not all anomalies are of this kind, i.e., unusual points far from a specific group; there is also a type of data point that does not recur yet should be considered abnormal with respect to a known group. The analysis showed that the improved DBSCAN algorithm can detect this type of anomaly as well. Thus, our approach is effective in finding abnormalities.
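For illustration, the density-based step of this idea can be sketched as follows: standard DBSCAN is run and the points labeled as noise are flagged as candidate anomalies. The eps and min_samples values and the synthetic data below are illustrative assumptions; the CFG construction and the paper's additional detection rule for non-recurring points are not reproduced here.

# Minimal sketch: flag DBSCAN noise points as candidate anomalies
# (illustrative parameters; not the paper's improved algorithm).
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(0)
cluster = rng.normal(loc=0.0, scale=0.5, size=(100, 2))   # dense "normal" group
outliers = rng.uniform(low=4.0, high=6.0, size=(5, 2))    # far-away points
X = np.vstack([cluster, outliers])

labels = DBSCAN(eps=0.7, min_samples=5).fit_predict(X)
anomalies = X[labels == -1]                               # DBSCAN marks noise with label -1
print(f"{len(anomalies)} points flagged as anomalies")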
The need for quick airborne transportation is critical, especially in emergencies. Drones with suspended payloads can be used to accomplish quick airborne transportation. Due to the environment or the drone's motion, the slung load may oscillate and cause the drone to fall. The altitude and attitude controls are the backbone of the drone's stability, and they must be adequately designed. Because of their symmetrical and simple structure, quadrotor helicopters are one of the most popular drone classes. In this work, a genetic algorithm with a two-term weighted fitness function is used to tune a Proportional-Integral-Derivative (PID) controller for the altitude and attitude loops of a quadrotor drone with a slung payload.
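The tuning loop described here can be sketched in a few lines: candidate PID gains are evolved by a simple genetic algorithm and scored with a two-term weighted fitness (tracking error plus control effort). The plant model (a unit-mass vertical double integrator), the weights w1 and w2, and all GA settings below are illustrative assumptions, not the paper's actual quadrotor dynamics or fitness terms.

# Sketch: GA tuning of PID gains on a toy altitude model (assumptions throughout).
import numpy as np

def simulate(gains, dt=0.01, T=5.0, target=1.0):
    kp, ki, kd = gains
    z = vz = integral = prev_err = 0.0
    err_sum = effort_sum = 0.0
    for _ in range(int(T / dt)):
        err = target - z
        integral += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integral + kd * deriv      # PID thrust command
        prev_err = err
        vz += (u - 9.81) * dt                          # unit-mass vertical dynamics with gravity
        z += vz * dt
        err_sum += abs(err) * dt
        effort_sum += abs(u) * dt
    return err_sum, effort_sum

def fitness(gains, w1=1.0, w2=0.01):
    err, effort = simulate(gains)
    return w1 * err + w2 * effort                      # two weighted terms (lower is better)

rng = np.random.default_rng(1)
pop = rng.uniform(0.0, 20.0, size=(30, 3))             # random initial PID gain sets
for _ in range(40):                                    # generations
    scores = np.array([fitness(g) for g in pop])
    parents = pop[np.argsort(scores)[:10]]             # keep the 10 best candidates
    children = parents[rng.integers(0, 10, size=20)] + rng.normal(0.0, 0.5, size=(20, 3))
    pop = np.vstack([parents, np.clip(children, 0.0, 50.0)])

best = pop[np.argmin([fitness(g) for g in pop])]
print("best PID gains (kp, ki, kd):", best.round(2))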
With the wide development of computer applications and networks, information security has received great attention in our common fields of life. One of the most important issues is how to control and prevent unauthorized access to secure information; therefore, this paper presents a combination of two efficient encryption algorithms to satisfy the purpose of information security by adding a new level of encryption to the Rijndael-AES algorithm. This paper presents a proposed Rijndael encryption and decryption process combined with the NTRU algorithm. The Rijndael algorithm is widely accepted due to its strong encryption, complex processing, and resistance to brute-force attack. The proposed modifications are implemented by encrypting and decrypting the Rijndael S-Box using the NTRU algorithm, and these modifications enhance the degree of security.
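For reference, the standard Rijndael S-Box that the proposal modifies can be generated from the multiplicative inverse in GF(2^8) followed by the AES affine transform, as in the sketch below. The NTRU-based encryption of the table is not shown; ntru_transform is a hypothetical placeholder for that step, not the paper's implementation.

# Sketch: generate the standard Rijndael/AES S-Box. The NTRU-based
# transformation of this table proposed in the paper is NOT implemented;
# ntru_transform below is a hypothetical placeholder for that step.
def gf_mul(a, b):
    """Multiply two bytes in GF(2^8) using the AES polynomial x^8 + x^4 + x^3 + x + 1."""
    p = 0
    for _ in range(8):
        if b & 1:
            p ^= a
        carry = a & 0x80
        a = (a << 1) & 0xFF
        if carry:
            a ^= 0x1B
        b >>= 1
    return p

def gf_inverse(a):
    """Multiplicative inverse in GF(2^8): a^254 = a^-1, with 0 mapped to 0."""
    result, power = 1, a
    for _ in range(7):
        power = gf_mul(power, power)
        result = gf_mul(result, power)
    return result if a else 0

def rotl8(x, n):
    return ((x << n) | (x >> (8 - n))) & 0xFF

def sbox_entry(b):
    inv = gf_inverse(b)
    return inv ^ rotl8(inv, 1) ^ rotl8(inv, 2) ^ rotl8(inv, 3) ^ rotl8(inv, 4) ^ 0x63

SBOX = [sbox_entry(b) for b in range(256)]
assert SBOX[0x00] == 0x63 and SBOX[0x01] == 0x7C   # known AES S-Box values

def ntru_transform(table):
    # Hypothetical hook: the proposal encrypts/decrypts this table with NTRU.
    raise NotImplementedError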
This study aims to determine the prevalence of Entamoeba histolytica, Entamoeba dispar and Entamoeba moshkovskii using three diagnostic methods (microscopic examination, cultivation and PCR), which were compared to obtain an accurate diagnosis of Entamoeba spp. during amoebiasis. A total of 150 stool samples were examined: 100 from patients and 50 from healthy controls. The clinically diagnosed stool samples (n = 100) were collected from patients attending the consultant clinics of different hospitals in Basrah during the period from January 2018 to January 2019. The results showed that 60% of the collected samples were positive on direct microscopic examination. All samples were cultivated on different media; the Bra
Feature selection (FS) constitutes a series of processes used to decide which relevant features/attributes to include and which irrelevant features to exclude for predictive modeling. It is a crucial task that aids machine learning classifiers in reducing error rates, computation time, and overfitting, and in improving classification accuracy. It has demonstrated its efficacy in a myriad of domains, ranging from text classification (TC) and text mining to image recognition. While there are many traditional FS methods, recent research efforts have been devoted to applying metaheuristic algorithms as FS techniques for the TC task. However, there are few literature reviews concerning TC. Therefore, a comprehensive overview was systematically conducted.
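As a point of contrast with the metaheuristic approaches surveyed, a traditional filter-style FS step for TC can be as simple as the sketch below, which scores bag-of-words features with the chi-square statistic and keeps the top k. The tiny corpus, labels, and k = 5 are illustrative assumptions.

# Sketch: traditional filter-based feature selection (chi-square) for text
# classification; the corpus and k are purely illustrative.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.feature_selection import SelectKBest, chi2

docs = ["cheap pills offer", "meeting at noon", "win a free prize",
        "project deadline tomorrow", "free offer just for you", "lunch with the team"]
labels = [1, 0, 1, 0, 1, 0]                        # 1 = spam-like, 0 = normal

X = CountVectorizer().fit_transform(docs)          # bag-of-words features
selector = SelectKBest(chi2, k=5).fit(X, labels)   # keep the 5 most informative terms
X_reduced = selector.transform(X)
print("kept", X_reduced.shape[1], "of", X.shape[1], "features")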
In many scientific fields, Bayesian models are commonly used in recent research. This research presents a new Bayesian model for estimating parameters and forecasting using the Gibbs sampler algorithm. Posterior distributions are generated using the inverse gamma distribution and the multivariate normal distribution as prior distributions. The new method was used to investigate and summarize the posterior distribution. The theory and derivation of the posterior distribution are explained in detail in this paper. The proposed approach is applied to three simulated datasets with sample sizes of 100, 300, and 500. The procedure was also extended to a real dataset, the rock intensity dataset. The actual dataset was collected
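One standard instance of this setup is Bayesian linear regression with a multivariate normal prior on the coefficients and an inverse gamma prior on the error variance; a minimal Gibbs sampler for that model is sketched below. The simulated data, prior hyperparameters, and iteration counts are assumptions for illustration and do not reproduce the paper's exact model.

# Sketch: Gibbs sampler for Bayesian linear regression with a multivariate
# normal prior on beta and an inverse gamma prior on sigma^2
# (illustrative data and hyperparameters).
import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + rng.normal(scale=0.8, size=n)

# Priors: beta ~ N(m0, V0), sigma^2 ~ InvGamma(a0, b0)
m0, V0_inv = np.zeros(p), np.eye(p) / 100.0          # vague normal prior (precision form)
a0, b0 = 2.0, 1.0

beta, sigma2 = np.zeros(p), 1.0
draws = []
for it in range(5000):
    # 1) beta | sigma^2, y ~ multivariate normal (conjugate update)
    Vn = np.linalg.inv(V0_inv + X.T @ X / sigma2)
    mn = Vn @ (V0_inv @ m0 + X.T @ y / sigma2)
    beta = rng.multivariate_normal(mn, Vn)
    # 2) sigma^2 | beta, y ~ inverse gamma (drawn as 1 / Gamma)
    resid = y - X @ beta
    an = a0 + n / 2.0
    bn = b0 + 0.5 * resid @ resid
    sigma2 = 1.0 / rng.gamma(an, 1.0 / bn)
    if it >= 1000:                                   # discard burn-in
        draws.append(np.append(beta, sigma2))

draws = np.array(draws)
print("posterior means (beta, sigma^2):", draws.mean(axis=0).round(3))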
Cancer is in general not the result of an abnormality in a single gene but a consequence of changes in many genes; it is therefore of great importance to understand the roles of different oncogenic and tumor-suppressor pathways in tumorigenesis. In recent years, many computational models have been developed to study the genetic alterations of different pathways in the evolutionary process of cancer. However, most of these methods are knowledge-based enrichment analyses and are inflexible when analyzing user-defined pathways or gene sets. In this paper, we develop a nonparametric, data-driven approach to testing for dynamic changes of pathways over cancer progression. Our method is based on an expansion and refinement of the pathway bei
Let G = (V, E) be a non-trivial simple graph. A dominating set in a graph is a set of vertices such that every vertex not in the set is adjacent to at least one vertex in the set. A subset S ⊆ V is a minimum neighborhood dominating set if S is a dominating set and the intersection of the neighborhoods of all vertices in S, ⋂_{v∈S} N(v), is as small as possible among all dominating sets of G. The minimum cardinality of a minimum neighborhood dominating set of a graph is called the minimum neighborhood domination number. In other words, it is the
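Under this definition, a minimum neighborhood dominating set of a small graph can be found by exhaustive search, as in the sketch below: enumerate all dominating sets, keep those whose neighborhood intersection is smallest, and take the smallest cardinality among them. The example graph, its adjacency-list encoding, and the brute-force strategy are illustrative assumptions rather than the paper's method.

# Sketch: brute-force computation of the minimum neighborhood domination number
# of a small graph (illustrative; not the paper's construction).
from itertools import combinations

# Example graph as open neighborhoods N(v).
N = {0: {1, 2}, 1: {0, 2, 3}, 2: {0, 1, 4}, 3: {1}, 4: {2}}
V = set(N)

def is_dominating(S):
    return all(v in S or N[v] & S for v in V)

def neigh_intersection(S):
    return set.intersection(*(N[v] for v in S))

# Enumerate all dominating sets of this small graph.
dom_sets = [set(S) for r in range(1, len(V) + 1)
            for S in combinations(V, r) if is_dominating(set(S))]

# Keep those whose neighborhood intersection is as small as possible ...
min_inter = min(len(neigh_intersection(S)) for S in dom_sets)
mnd_sets = [S for S in dom_sets if len(neigh_intersection(S)) == min_inter]

# ... and the minimum neighborhood domination number is their smallest size.
mnd_number = min(len(S) for S in mnd_sets)
print("minimum neighborhood domination number:", mnd_number)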
Non-stationary series are always a problem for stationary analysis; as some theoretical work has explained, the properties of statistical regression analysis are lost when non-stationary series are used, and the estimated slope reflects a spurious relationship. A non-stationary series can be made stationary by adding a time variable to the multivariate analysis to remove the general trend, by adding seasonal dummy variables to remove the seasonal effect, by converting the data to exponential or logarithmic form, or by differencing the series d times, in which case it is said to be integrated of order d. The research comprises a theoretical side in several parts; the first part presents the research methodology.
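These standard transformations can be illustrated with a short sketch: a log transform, detrending with a time variable and seasonal dummies, and a single difference (d = 1) checked with an augmented Dickey-Fuller test. The synthetic monthly series and the choice of d below are illustrative assumptions.

# Sketch: handling a non-stationary series via log transform, trend/seasonal
# dummies, and differencing (synthetic data; d = 1 is illustrative).
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(0)
t = np.arange(120)                                        # 10 years of monthly data
y = np.exp(0.02 * t + 0.3 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.05, 120))
s = pd.Series(y)

log_y = np.log(s)                                         # exponential form -> linear trend
month = pd.Series(t % 12)
X = pd.get_dummies(month, prefix="m", drop_first=True).astype(float)
X["trend"] = t                                            # time variable removes the general trend
design = np.column_stack([np.ones(120), X.values])
coefs, *_ = np.linalg.lstsq(design, log_y.values, rcond=None)
detrended = log_y.values - design @ coefs                 # trend and seasonal effects removed

diff1 = log_y.diff().dropna()                             # differenced once: integrated of order 1
print("ADF p-value, log level      :", round(adfuller(log_y)[1], 4))
print("ADF p-value, detrended      :", round(adfuller(detrended)[1], 4))
print("ADF p-value, 1st difference :", round(adfuller(diff1)[1], 4))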