Recent research has shown that Deoxyribonucleic Acid (DNA) techniques, originally used to discover diseases in the human body, can be adapted for an intrusion-detection system (IDS) to detect attacks against computer systems and network traffic. Three main factors influence the accuracy of a DNA-sequence-based IDS: the DNA encoding method, the STR keys, and the classification method used to assess the correctness of the proposed method. The pioneering attempt to use a DNA sequence for intrusion detection relied on a normal signature sequence with an alignment threshold value; later work used DNA-encoding-based cryptography, but the resulting detection rate was very low. Since the network traffic consists of 41 attributes, we propose the smallest practical alphabet (for the same DNA length): a four-character DNA encoding that represents all 41 attributes, called DEM4all. Experiments were conducted using the standard KDDCup 99 and NSL-KDD datasets. The Teiresias algorithm is used to extract Short Tandem Repeats (STRs), including both the keys and their positions in the network traffic, while a brute-force algorithm is used as the classification process to determine whether the traffic is an attack or normal. Each DNA encoding method was run 30 times. The results show that the proposed method achieves better accuracy (a 15% improvement) compared with previous and state-of-the-art DNA algorithms. It can therefore be concluded that the proposed DEM4all DNA encoding method is well suited for IDS. More complex encodings that further shorten the DNA sequence may yield even higher detection accuracy.
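The abstract above does not specify how DEM4all maps the 41 attributes onto the four nucleotides, so the following is only a minimal sketch of the general idea: quantize each attribute value into one of four bins and emit one base per attribute. The bin scheme, attribute ranges, and function names are illustrative assumptions, not the paper's actual encoding.

```python
# Hypothetical sketch of a four-character DNA encoding for network-traffic
# records: each attribute is quantized into one of four bins and mapped to a
# nucleotide. Bin edges and ranges are illustrative, not DEM4all's.
BASES = "ACGT"

def encode_record(values, minima, maxima):
    """Encode one record as a DNA string, one base per attribute."""
    seq = []
    for v, lo, hi in zip(values, minima, maxima):
        span = (hi - lo) or 1.0
        bin_idx = min(3, int(4 * (v - lo) / span))  # quantize into bins 0..3
        seq.append(BASES[bin_idx])
    return "".join(seq)

# Toy record with 3 of the 41 attributes and assumed value ranges.
record = [0.0, 12.5, 99.0]
lo, hi = [0.0, 0.0, 0.0], [1.0, 50.0, 100.0]
print(encode_record(record, lo, hi))  # -> "ACT"
```

A real encoding would fix the per-attribute ranges from the KDDCup 99 / NSL-KDD training data so that identical traffic always yields identical sequences for STR extraction.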
DNA methylation is one of the main epigenetic mechanisms in cancer development and progression. Aberrant DNA methylation of CpG islands within promoter regions contributes to the dysregulation of various tumor suppressors and oncogenes; this leads to the appearance of malignant features, including rapid proliferation, metastasis, stemness, and drug resistance. The discovery of two important protein families, DNA methyltransferases (DNMTs) and Ten-eleven translocation (TET) dioxygenases, which are responsible for deregulated transcription of genes that play pivotal roles in tumorigenesis, led to a further understanding of DNA methylation-related pathways. However, how these enzymes target specific genes in different malignancies;
DeepFake is a concern for celebrities and for everyone, because it is simple to create. DeepFake images, especially high-quality ones, are difficult to detect by people, by local descriptors, and by current approaches. On the other hand, detecting manipulation in video is more tractable than in a single image, and many state-of-the-art systems address it; moreover, the detection of video manipulation depends entirely on detection in its individual frames. Many works have addressed DeepFake detection in images, but they involved complex mathematical calculations in their preprocessing steps and many limitations, including that the face must be frontal, the eyes must be open, and the mouth must be open with visible teeth, etc. Also, the accuracy of their counterfeit detectio
Diabetic retinopathy (DR) is a diabetes-caused disease associated with leakage of fluid from the blood vessels into the retina, leading to its damage. It is one of the most common diseases that can lead to weakened vision and even blindness. Exudates are a clear indication of diabetic retinopathy, which is the main cause of blindness in people with diabetes. Therefore, early detection of exudates is a crucial and essential step in the analysis of digital diabetic retinopathy systems to prevent blindness and vision loss. This paper presents an improved approach for detecting exudates in retina images using a supervised-unsupervised Minimum Distance (MD) segmentation method. The suggested system includes three stages; First, a
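The abstract is truncated before the method details, so the following is only a generic sketch of minimum-distance (MD) classification as it is commonly applied to segmentation: each pixel is assigned to the class whose mean feature vector is nearest in Euclidean distance. The class means and pixel features below are placeholder values, not the paper's.

```python
import numpy as np

# Minimal sketch of minimum-distance (MD) segmentation: assign each pixel to
# the class whose mean feature vector is closest in Euclidean distance.
# Class means here are illustrative placeholders, not from the paper.
def md_segment(pixels, class_means):
    """pixels: (N, d) feature array; class_means: (k, d). Returns (N,) labels."""
    dists = np.linalg.norm(pixels[:, None, :] - class_means[None, :, :], axis=2)
    return dists.argmin(axis=1)

pix = np.array([[0.9, 0.8], [0.1, 0.2]])      # one bright, one dark pixel
means = np.array([[0.05, 0.1], [0.95, 0.9]])  # class 0: background, class 1: exudate
print(md_segment(pix, means))                 # -> [1 0]
```

In a supervised-unsupervised scheme, the class means would typically be initialized from labeled samples and then refined on the unlabeled image.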
Evolutionary algorithms (EAs), as global search methods, have proved to be more robust than their local-heuristic counterparts for detecting protein complexes in protein-protein interaction (PPI) networks. Typically, the robustness of these EAs comes from their components and parameters. These components are solution representation, selection, crossover, and mutation. Unfortunately, almost all EA-based complex detection methods suggested in the literature were designed with only canonical or traditional components. Further, the topological structure of the protein network is the main information used in the design of almost all such components. The main contribution of this paper is to formulate a more robust E
... Show MoreThis paper present a simple and sensitive method for the determination of DL-Histidine using FIA-Chemiluminometric measurement resulted from oxidation of luminol molecule by hydrogen peroxide in alkaline medium in the presence of DL-Histidine. Using 70?l. sample linear plot with a coefficient of determination 95.79% for (5-60) mmol.L-1 while for a quadratic relation C.O.D = 96.44% for (5-80) mmol.L-1 and found that guadratic plot in more representative. Limit of detection was 31.93 ?g DL-Histidine (S/N = 3), repeatability of measurement was less that 5% (n=6). Positive and negative ion interferances was removed by using minicolume containing ion exchange resin located after injection valve position.
Many approaches of varying complexity already exist for edge detection in color images. Nevertheless, the question remains of how different the results are when computationally costly techniques are employed instead of simple ones. This paper presents a comparative study of two approaches to color edge detection aimed at reducing noise in images. The approaches are based on the Sobel operator and the Laplace operator. Furthermore, an efficient algorithm for implementing the two operators is presented. The operators have been applied to real images, and the results are presented in this paper. It is shown that the quality of the results increases when the second-derivative operator (the Laplace operator) is used, and noise is reduced in a good
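The two operators compared above are standard convolution kernels: Sobel approximates the first derivative (gradient), while the Laplace kernel approximates the second derivative and marks edges by zero crossings. The sketch below applies both to a toy grayscale step edge; extending to color would mean applying them per channel. The naive convolution routine is for illustration only, not the paper's efficient algorithm.

```python
import numpy as np

# First-derivative Sobel kernels and second-derivative Laplace kernel.
SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
SOBEL_Y = SOBEL_X.T
LAPLACE = np.array([[0, 1, 0], [1, -4, 1], [0, 1, 0]], float)

def convolve3(img, k):
    """Naive 'valid' 3x3 sliding-window correlation, for clarity not speed."""
    h, w = img.shape
    out = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            out[i, j] = np.sum(img[i:i + 3, j:j + 3] * k)
    return out

img = np.zeros((5, 5)); img[:, 2:] = 10.0        # vertical step edge
gx, gy = convolve3(img, SOBEL_X), convolve3(img, SOBEL_Y)
sobel_mag = np.hypot(gx, gy)                     # gradient magnitude
laplace = convolve3(img, LAPLACE)
print(sobel_mag[1])   # strong response straddling the edge
print(laplace[1])     # sign change (zero crossing) across the edge
```

The gradient magnitude peaks on both sides of the step, while the Laplace response changes sign there, which is why second-derivative edges are typically localized via zero crossings.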
This research deals with the processing and interpretation of the Bouguer anomaly gravity field, using two-dimensional filtering techniques to separate the residual gravity field from the Bouguer gravity map for a part of Najaf Ashraf province in Iraq. The residual anomaly was processed in order to reduce noise and give a more comprehensive view of subsurface linear structures. Results of the descriptive interpretation are presented as colored surfaces and contour maps in order to locate the directions and extents of linear features that may be interpreted as faults. A comparison among the gravity residual field, the 1st derivative, and the horizontal gradient was made along a profile across the study area in order to assign the exact location of a major fault. Furthermor
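The profile comparison described above rests on a standard idea: the horizontal gradient of a residual gravity profile peaks over an abrupt density contrast, so its maximum is a common estimator of a fault's location. The sketch below demonstrates this on a synthetic step-like anomaly; the profile shape and units are assumptions, not the study's data.

```python
import numpy as np

# Horizontal-gradient fault location on a synthetic residual gravity profile:
# the gradient of a step-like anomaly peaks over the density contrast (fault).
x = np.linspace(-10, 10, 201)           # distance along profile, km
residual = 5.0 * np.arctan(x / 2.0)     # step-like residual anomaly, mGal

grad = np.gradient(residual, x)         # 1st horizontal derivative
fault_x = x[np.argmax(np.abs(grad))]
print(f"max horizontal gradient at x = {fault_x:.2f} km")  # peak at x = 0
```

In practice the same maximum-gradient criterion is applied to the measured residual field along the profile, with the 1st vertical derivative used as a cross-check.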
Lost circulation, or loss of drilling fluid, is one of the most important problems in the oil and gas industry; it has been present since the beginning of this industry and causes many problems during the drilling process, which may lead to closing the well and stopping drilling. Drilling muds are relatively expensive, especially oil-based muds or muds containing special additives, so it is not economically beneficial to waste and lose them. The treatment of drilling fluid losses is also somewhat expensive, as a result of the time wasted as well as the high cost of the materials used in the treatment, such as heavy materials, cement, and others. The best way to deal with drilling fluid losses