Abstract: DNA sequencing is vital in the diagnosis and prognosis of diseases and in assessing the risk of genetic disorders, particularly for asymptomatic individuals with a genetic predisposition. Such diagnostic approaches are integral in guiding health and lifestyle decisions and in preparing families with the foreknowledge needed to anticipate potential genetic abnormalities. The present study explores implementing a define-by-run deep learning (DL) model, optimized using the Tree-structured Parzen Estimator (TPE) algorithm, to enhance the precision of genetic diagnostic tools. Unlike conventional models, the define-by-run model bolsters accuracy through dynamic adaptation to the data during learning and iterative optimization of critical hyperparameters, such as layer count, neuron count per layer, learning rate, and batch size. The model was trained and evaluated on a diverse dataset comprising DNA sequences from two distinct groups: patients diagnosed with breast cancer and a control group of healthy individuals. It showed remarkable performance, with accuracy, precision, recall, F1-score, and area under the curve reaching 0.871, 0.872, 0.871, 0.872, and 0.95, respectively, outperforming previous models. These findings underscore the significant potential of DL techniques in improving the accuracy of disease diagnosis and prognosis through DNA sequencing, indicating substantial advances in personalized medicine and genetic counseling. Collectively, the findings of this investigation suggest that DL holds transformative potential for the diagnosis and management of genetic disorders.
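The define-by-run idea named above can be sketched in a few lines: the hyperparameter search space is constructed while each trial runs, so later choices (neurons per layer) can depend on earlier ones (layer count). To keep the sketch dependency-free, plain random sampling stands in for TPE; in a framework such as Optuna, a TPE sampler would fill that role. The score function is a placeholder, not the paper's trained network.

```python
import random

random.seed(0)

def sample_trial():
    """Define-by-run: later hyperparameter choices depend on earlier ones,
    so the search space is built while the trial runs rather than being
    declared up front."""
    n_layers = random.randint(1, 5)
    units = [random.choice([16, 32, 64, 128]) for _ in range(n_layers)]
    lr = 10 ** random.uniform(-5, -1)
    batch = random.choice([16, 32, 64, 128])
    return {"n_layers": n_layers, "units": units, "lr": lr, "batch": batch}

def score(p):
    # Placeholder standing in for the validation accuracy of a trained network.
    return 1.0 / (1.0 + abs(p["lr"] - 1e-3)) - 0.01 * abs(p["n_layers"] - 3)

best = max((sample_trial() for _ in range(50)), key=score)
print(best["n_layers"], len(best["units"]))
```

Note that the length of `units` always matches `n_layers` by construction; that coupling between choices is exactly what a define-by-run interface allows and a fixed, declared-up-front search space does not.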
Incremental sheet metal forming is a modern sheet-forming technique in which a uniform sheet is locally deformed by the progressive action of a forming tool whose movement is governed by a CNC milling machine; in this way the tool locally deforms the sheet by pure stretching. In the single-point incremental forming (SPIF) process, research has concentrated on developing predictive models for estimating product quality. Surface quality in SPIF has been modeled using a simulated annealing algorithm (SAA). In developing this predictive model, spindle speed, feed rate, and step depth were considered as model parameters, and maximum peak height (Rz) and arithmetic mean surface roughness (Ra) were used as response parameters to assess th
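The simulated annealing loop behind such a model can be sketched as follows. The response surface `ra` is a hypothetical stand-in for a fitted roughness model over spindle speed, feed rate, and step depth (the coefficients and bounds are illustrative, not the study's); the annealing schedule itself is a standard geometric cooling with Metropolis acceptance.

```python
import math
import random

random.seed(1)

# Hypothetical response surface standing in for a fitted Ra model as a
# function of spindle speed (rpm), feed rate (mm/min) and step depth (mm).
def ra(speed, feed, step):
    return 0.5 + 1e-7 * (speed - 1000) ** 2 + 1e-6 * (feed - 500) ** 2 + 2.0 * step

bounds = [(200, 2000), (100, 1000), (0.1, 1.0)]
x = [lo for lo, hi in bounds]              # start at the lower corner
best_val = ra(*x)
temp = 1.0
for _ in range(5000):
    # Perturb each parameter by up to 5% of its range, clamped to bounds.
    cand = [min(hi, max(lo, xi + random.uniform(-0.05, 0.05) * (hi - lo)))
            for xi, (lo, hi) in zip(x, bounds)]
    delta = ra(*cand) - ra(*x)
    # Accept improvements always, uphill moves with probability exp(-delta/T).
    if delta < 0 or random.random() < math.exp(-delta / temp):
        x = cand
        best_val = min(best_val, ra(*x))
    temp *= 0.999                          # geometric cooling schedule
print(round(best_val, 3))
```

The uphill-acceptance term is what lets the search escape local minima of the roughness surface early on, while the cooling schedule makes it increasingly greedy toward the end.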
In the petroleum industry, multiphase flow dynamics within the tubing string have gained significant attention due to the associated challenges. Accurately predicting pressure drops and wellbore pressures is crucial for effective modeling of vertical lift performance (VLP). This study focuses on predicting multiphase flow behavior in four wells located in the Faihaa oil field in southern Iraq, utilizing PIPESIM software. The most appropriate multiphase correlation was selected by using production test data to construct a comprehensive survey data catalog; the results were then compared with the correlations available within the PIPESIM software. The outcomes reveal that the Hagedorn and Brown (H
The Jeribe reservoir in the Jambour Oil Field is a complex and heterogeneous carbonate reservoir characterized by a wide range of permeability variation. Because of the limited availability of core plugs in most wells, it is crucial to establish correlations from cored wells and apply them to uncored wells to predict permeability. In recent years, the Flow Zone Indicator (FZI) approach has gained significant applicability for predicting hydraulic flow units (HFUs) and identifying rock types within the reservoir units.
This paper aims to develop a permeability model based on the principles of the Flow Zone Indicator. Analysis of the core permeability versus core porosity plot and the Reservoir Quality Index (RQI) - Normalized por
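The FZI relations this approach rests on can be sketched numerically. The definitions below are the conventional ones (RQI in micrometres with permeability in mD and porosity as a fraction), not coefficients fitted in this paper, and the core-plug values are hypothetical:

```python
import math

def rqi(k_md, phi):
    """Reservoir Quality Index: 0.0314 * sqrt(k/phi), k in mD, phi a fraction."""
    return 0.0314 * math.sqrt(k_md / phi)

def fzi(k_md, phi):
    """Flow Zone Indicator: RQI divided by normalized porosity phi/(1-phi)."""
    return rqi(k_md, phi) / (phi / (1.0 - phi))

def perm_from_fzi(fzi_val, phi):
    """Invert the definitions to predict permeability (mD) in uncored wells."""
    return 1014.0 * fzi_val ** 2 * phi ** 3 / (1.0 - phi) ** 2

# Hypothetical core plug: 100 mD at 20% porosity.
f = fzi(100.0, 0.20)
print(round(f, 3), round(perm_from_fzi(f, 0.20), 1))
```

In practice, plugs with similar FZI are grouped into a hydraulic flow unit, a mean FZI is assigned per unit, and `perm_from_fzi` is then applied to log-derived porosity in uncored wells.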
OpenStreetMap (OSM), recognised for its current and readily accessible spatial database, frequently serves regions lacking precise data at the necessary granularity. Global collaboration among OSM contributors presents challenges to data quality and uniformity, exacerbated by the sheer volume of input and indistinct data annotation protocols. This study presents a methodological improvement in the spatial accuracy of OSM datasets centred over Baghdad, Iraq, utilising data derived from OSM services and satellite imagery. An analytical focus was placed on two geometric correction methods: a two-dimensional polynomial affine transformation and a two-dimensional polynomial conformal transformation. The former involves twelve coefficients for ad
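As a minimal illustration of the conformal (similarity) alternative named above: it has four parameters, so two control-point pairs determine it exactly. The study fits over many points by least squares, but the two-point solve below, with hypothetical coordinates, shows the structure; complex arithmetic packs the four parameters into a scale-rotation s and a translation t.

```python
# Four-parameter 2D conformal (similarity) transform
#   x' = a*x - b*y + tx,   y' = b*x + a*y + ty
# written with complex numbers as z' = s*z + t, where s = a + i*b.
def fit_conformal(p1, w1, p2, w2):
    z1, z2 = complex(*p1), complex(*p2)
    u1, u2 = complex(*w1), complex(*w2)
    s = (u1 - u2) / (z1 - z2)   # scale + rotation
    t = u1 - s * z1             # translation
    return s, t

def apply_transform(s, t, p):
    w = s * complex(*p) + t
    return (w.real, w.imag)

# Hypothetical control points: OSM coordinates mapped onto reference
# (satellite-derived) coordinates.
s, t = fit_conformal((0.0, 0.0), (1.0, 2.0), (1.0, 0.0), (2.0, 3.0))
print(apply_transform(s, t, (0.0, 1.0)))
```

The affine transformation relaxes the equal-scale, rigid-rotation constraint (and, in its polynomial form, adds higher-order terms), which is why it needs more coefficients and more control points to fit.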
The main objective of this research is to use methods of calculus to solve delay integral equations in which the delay is a function of time; the integral equation used in this research is of the Volterra type.
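A common form of such an equation, written here as a hypothetical illustration (the paper's exact equation is not shown), is a Volterra integral equation whose upper limit carries a time-dependent delay $\tau(t)$:

```latex
u(t) \;=\; f(t) \;+\; \int_{0}^{\,t-\tau(t)} K(t,s)\, u(s)\, \mathrm{d}s,
\qquad \tau(t) \ge 0,
```

where $f$ and the kernel $K$ are given and $u$ is the unknown; when $\tau(t) \equiv 0$ this reduces to the classical Volterra equation of the second kind.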
This research introduces a study, with an application, of Principal Component Regression on components obtained from some of the explanatory variables, in order to mitigate the multicollinearity problem among these variables and to gain more stability in their estimates than is yielded by Ordinary Least Squares. The cost paid, on the other hand, is a slight loss of power in the predictive regression function's ability to explain the essential variation. A numerical formula has been proposed and applied by the researchers as an optimal solution, and its efficiency was verified by a program written by the researchers themselves for this purpose, using several criteria: Cumulative Percentage Variance, Coefficient of Determination, Variance
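A minimal sketch of the Principal Component Regression idea, assuming two strongly collinear predictors so the leading eigenvector of the 2x2 covariance matrix has a closed form (the data, the number of retained components, and the paper's suggested formula are all hypothetical here):

```python
import math
import random

random.seed(2)

# Hypothetical collinear data: x2 is x1 plus small noise, y depends on both.
n = 200
x1 = [random.gauss(0, 1) for _ in range(n)]
x2 = [v + random.gauss(0, 0.05) for v in x1]
y = [2 * a + 2 * b + random.gauss(0, 0.1) for a, b in zip(x1, x2)]

def mean(v):
    return sum(v) / len(v)

m1, m2, my = mean(x1), mean(x2), mean(y)
c1 = [v - m1 for v in x1]
c2 = [v - m2 for v in x2]

# 2x2 covariance matrix and the angle of its leading eigenvector.
s11 = sum(a * a for a in c1) / n
s22 = sum(b * b for b in c2) / n
s12 = sum(a * b for a, b in zip(c1, c2)) / n
theta = 0.5 * math.atan2(2 * s12, s11 - s22)
v = (math.cos(theta), math.sin(theta))   # first principal component

# Scores on PC1, then ordinary 1-D least squares of y on the scores.
z = [v[0] * a + v[1] * b for a, b in zip(c1, c2)]
beta = sum(zi * yi for zi, yi in zip(z, y)) / sum(zi * zi for zi in z)
pred = [my + beta * zi for zi in z]
sse = sum((yi - pi) ** 2 for yi, pi in zip(y, pred))
sst = sum((yi - my) ** 2 for yi in y)
r2 = 1 - sse / sst
print(round(r2, 3))
```

Regressing on the single score vector `z` avoids the near-singular normal equations that OLS faces on `x1` and `x2` jointly, at the cost of discarding the (here almost empty) second component.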
In this study, dynamic encryption techniques are explored as an image cipher method to generate S-boxes similar to AES S-boxes with the help of a private key belonging to the user, enabling images to be encrypted and decrypted using these S-boxes. The study consists of two stages: the dynamic S-box generation method and the encryption-decryption method. S-boxes should have a non-linear structure; for this reason the Knuth-Durstenfeld shuffle algorithm (K/DSA), one of the pseudo-random techniques, is used to generate S-boxes dynamically. The biggest advantage of this approach is that the inverse S-box is produced together with the S-box. Compared to the methods in the literature, the need to store the S-box is eliminated. Also, the fabr
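The key-dependent generation step can be sketched as follows. This is an illustration of the idea, not the paper's exact construction: the key is hashed to seed a PRNG (the hash choice and key are assumptions), a Knuth-Durstenfeld shuffle produces the S-box, and the inverse S-box falls out of the same pass.

```python
import hashlib
import random

def make_sbox(key: bytes):
    """Key-dependent 8-bit S-box via the Knuth-Durstenfeld (Fisher-Yates)
    shuffle, seeded from a hash of the user's private key."""
    seed = int.from_bytes(hashlib.sha256(key).digest(), "big")
    rng = random.Random(seed)
    sbox = list(range(256))
    for i in range(255, 0, -1):      # Durstenfeld's in-place variant
        j = rng.randint(0, i)
        sbox[i], sbox[j] = sbox[j], sbox[i]
    inv = [0] * 256
    for i, v in enumerate(sbox):     # the inverse S-box comes for free
        inv[v] = i
    return sbox, inv

sbox, inv = make_sbox(b"user-private-key")
print(sorted(sbox) == list(range(256)))   # a true permutation of 0..255
```

Because the same key deterministically regenerates the same S-box and inverse, neither table needs to be stored or transmitted, which is the storage advantage the abstract describes.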
This paper aims to evaluate the reliability analysis of a steel beam, represented by the probability of failure and the reliability index. The Monte Carlo Simulation Method (MCSM) and the First Order Reliability Method (FORM) are used for this purpose. These methods need two samples for each behavior to be studied: the first sample for resistance (carrying capacity R) and the second for load effect (Q), which are the parameters of a limit state function. The Monte Carlo method has been adopted to generate these samples based on the randomness and uncertainties in the variables. The variables considered are beam cross-section dimensions, material properties, beam length, yield stress, and applied loads. Matlab software has be
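The MCSM half of this workflow can be sketched as follows, with hypothetical normal distributions for R and Q standing in for the study's beam variables; the reliability index is then recovered from the simulated probability of failure via the inverse normal CDF.

```python
import random
import statistics

random.seed(3)
N = 200_000

# Hypothetical limit state g = R - Q for a steel beam: resistance R (moment
# capacity) and load effect Q, both taken as normal variables for this
# sketch; the study derives them from dimensional and material variables.
R_mu, R_sigma = 300.0, 30.0   # kN*m
Q_mu, Q_sigma = 200.0, 25.0   # kN*m

failures = sum(
    random.gauss(R_mu, R_sigma) - random.gauss(Q_mu, Q_sigma) < 0
    for _ in range(N)
)
pf = failures / N                              # probability of failure
beta = -statistics.NormalDist().inv_cdf(pf)    # reliability index
print(round(pf, 4), round(beta, 2))
```

For two independent normals the analytic check is beta = (300 - 200) / sqrt(30^2 + 25^2), about 2.56, so the simulated value should land close to that; FORM would reach the same index directly from the means and standard deviations without sampling.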
In this paper, we study the scheduling of jobs on a single machine. Each of n jobs is to be processed without interruption and becomes available for processing at time zero. The objective is to find a processing order of the jobs that minimizes the sum of maximum earliness and maximum tardiness. Because this problem minimizes earliness and tardiness values, the model is equivalent to a just-in-time production system. Our lower bound depends on the decomposition of the problem into two subproblems. We present a novel heuristic approach to find a near-optimal solution; this approach depends on finding efficient solutions for two problems. The first problem is minimizing total completi
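The objective can be made concrete with a small baseline, not the paper's heuristic: evaluate Emax + Tmax for a given sequence with no inserted idle time, using earliest-due-date (EDD) order, which is known to minimize Tmax alone, as a hypothetical starting sequence.

```python
def emax_tmax(jobs):
    """jobs: list of (processing_time, due_date) in processing order;
    returns max earliness + max tardiness with no inserted idle time."""
    t = emax = tmax = 0
    for p, d in jobs:
        t += p                       # completion time C of this job
        emax = max(emax, d - t)      # earliness max(0, d - C)
        tmax = max(tmax, t - d)      # tardiness max(0, C - d)
    return emax + tmax

jobs = [(3, 6), (2, 4), (4, 12), (1, 5)]      # hypothetical instance
edd = sorted(jobs, key=lambda j: j[1])        # earliest due date first
print(emax_tmax(edd), emax_tmax(jobs))        # 2 8
```

On this instance EDD already does well against the arbitrary input order; a heuristic of the kind the abstract describes would refine such a starting sequence by trading a little tardiness for less earliness, and compare the result against the decomposition lower bound.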