Abstract: The utility of DNA sequencing in the diagnosis and prognosis of diseases is vital for assessing the risk of genetic disorders, particularly for asymptomatic individuals with a genetic predisposition. Such diagnostic approaches are integral to guiding health and lifestyle decisions and to preparing families with the foreknowledge needed to anticipate potential genetic abnormalities. The present study explores a define-by-run deep learning (DL) model optimized with the Tree-structured Parzen Estimator (TPE) algorithm to enhance the precision of genetic diagnostic tools. Unlike conventional models, the define-by-run model bolsters accuracy by adapting dynamically to the data during the learning process and iteratively optimizing critical hyperparameters, such as layer count, neuron count per layer, learning rate, and batch size. The model was trained and evaluated on a diverse dataset of DNA sequences from two distinct groups: patients diagnosed with breast cancer and a control group of healthy individuals. It showed remarkable performance, with accuracy, precision, recall, F1-score, and area under the curve reaching 0.871, 0.872, 0.871, 0.872, and 0.95, respectively, outperforming previous models. These findings underscore the significant potential of DL techniques to improve the accuracy of disease diagnosis and prognosis through DNA sequencing, indicating substantial advances in personalized medicine and genetic counseling. Collectively, the findings of this investigation suggest that DL has transformative potential for the diagnosis and management of genetic disorders.
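To illustrate the define-by-run optimization described above, the following is a minimal sketch using Optuna's TPE sampler. The synthetic dataset and scikit-learn's MLPClassifier are stand-ins (assumptions) for the paper's encoded DNA sequences and network; only the general scheme of tuning layer count, units per layer, learning rate, and batch size is shown.

```python
# Minimal sketch of define-by-run hyperparameter search with Optuna's TPE sampler.
# The synthetic data stand in for numerically encoded DNA sequences, and
# scikit-learn's MLPClassifier stands in for the paper's network.
import optuna
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

# Placeholder for the encoded DNA-sequence feature matrix and binary labels
X, y = make_classification(n_samples=1000, n_features=64, random_state=0)

def objective(trial):
    # The search space is built dynamically ("define-by-run"): the number of
    # layers sampled here determines how many per-layer unit counts are sampled.
    n_layers = trial.suggest_int("n_layers", 1, 4)
    units = [trial.suggest_int(f"units_l{i}", 16, 256, log=True) for i in range(n_layers)]
    lr = trial.suggest_float("learning_rate", 1e-4, 1e-2, log=True)
    batch_size = trial.suggest_categorical("batch_size", [16, 32, 64])

    clf = MLPClassifier(hidden_layer_sizes=tuple(units),
                        learning_rate_init=lr,
                        batch_size=batch_size,
                        max_iter=300,
                        random_state=0)
    # Mean cross-validated accuracy is the value the TPE sampler maximizes
    return cross_val_score(clf, X, y, cv=3, scoring="accuracy").mean()

study = optuna.create_study(direction="maximize",
                            sampler=optuna.samplers.TPESampler(seed=42))
study.optimize(objective, n_trials=50)
print(study.best_params, study.best_value)
```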
Image classification is the process of finding features common to images of the same class and using them to categorize and label new images. Its key obstacles are the abundance of images, the high complexity of the data, and the shortage of labeled data. The cornerstone of the approach is extracting convolutional features from deep learning models and training machine learning classifiers on them. This study proposes a new “hybrid learning” approach that combines deep learning with machine learning for image classification, based on convolutional feature extraction using the VGG-16 deep learning model and seven class…
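As an illustration of the hybrid pipeline sketched above, the snippet below extracts VGG-16 convolutional features and trains a classical classifier on them. The random image array is a placeholder, and the SVM is shown only as one example classifier (an assumption, since the abstract's list of classifiers is truncated).

```python
# Minimal sketch of the hybrid pipeline: frozen VGG-16 convolutional features
# fed to a classical machine-learning classifier. The image array and labels
# below are random placeholders standing in for a real labelled image set.
import numpy as np
from tensorflow.keras.applications.vgg16 import VGG16, preprocess_input
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def extract_vgg16_features(images):
    """images: float array of shape (n, 224, 224, 3) with values in [0, 255]."""
    base = VGG16(weights="imagenet", include_top=False, pooling="avg")
    return base.predict(preprocess_input(images), verbose=0)  # (n, 512) vectors

images = np.random.rand(40, 224, 224, 3).astype("float32") * 255.0  # placeholder
labels = np.random.randint(0, 2, size=40)                           # placeholder

features = extract_vgg16_features(images)
X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.25, random_state=0)

clf = SVC(kernel="rbf").fit(X_train, y_train)       # one example classifier
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```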
The aim of this research is to assess the validity of Detailed Micro-Modeling (DMM) as a numerical model for masonry analysis. To achieve this aim, a set of load-displacement curves was obtained from both numerical simulation and experimental testing of clay masonry prisms under vertical load. The finite element method was implemented in DMM for the analysis of the experimental clay masonry prism, using the finite element software ABAQUS with an implicit solver to model and analyze the prism subjected to a vertical load. The load-displacement relationship of the numerical model was found to be in good agreement with that drawn from the experimental results. Evidence shows that the load-displacement curve found from the finite element m…
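The validation step described above rests on comparing the numerical and experimental load-displacement curves. The following is an illustrative sketch (not the study's code) of how such a comparison might be made; the file names, column names, and RMSE metric are all assumptions.

```python
# Illustrative comparison of a numerical (ABAQUS) and an experimental
# load-displacement curve. Files, column names, and the RMSE agreement
# metric are assumptions; displacement columns are assumed increasing.
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

num = pd.read_csv("abaqus_prism.csv")      # columns: displacement_mm, load_kN (assumed)
exp = pd.read_csv("experiment_prism.csv")  # columns: displacement_mm, load_kN (assumed)

# Interpolate both curves onto a shared displacement grid before comparing
grid = np.linspace(0, min(num.displacement_mm.max(), exp.displacement_mm.max()), 200)
load_num = np.interp(grid, num.displacement_mm, num.load_kN)
load_exp = np.interp(grid, exp.displacement_mm, exp.load_kN)

rmse = np.sqrt(np.mean((load_num - load_exp) ** 2))
print(f"RMSE between curves: {rmse:.2f} kN")

plt.plot(exp.displacement_mm, exp.load_kN, "o-", label="experiment")
plt.plot(num.displacement_mm, num.load_kN, "s--", label="FE model (ABAQUS)")
plt.xlabel("Displacement (mm)")
plt.ylabel("Load (kN)")
plt.legend()
plt.show()
```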
Modern civilization increasingly relies on sustainable and eco-friendly data centers as the core hubs of intelligent computing. However, these data centers, while vital, are also highly vulnerable to hacking because they serve as convergence points for numerous network connection nodes. Recognizing and addressing this vulnerability, particularly within green data centers, is a pressing concern. This paper proposes a novel approach to mitigating this threat by leveraging swarm intelligence techniques to detect prospective and hidden compromised devices within the data center environment. The core objective is to ensure sustainable intelligent computing through a colony strategy. The research primarily focuses on the …
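The abstract does not detail the colony strategy, so the sketch below is purely illustrative and is not the paper's algorithm: a toy ant-colony-style walk over a device-connection graph in which pheromone accumulates on devices carrying an anomalous traffic signal, flagging them for inspection.

```python
# Purely illustrative ant-colony-style flagging loop (NOT the paper's method).
# Devices form a connection graph, each carries a placeholder anomaly score,
# and "ants" doing biased random walks deposit pheromone where they visit.
import numpy as np

rng = np.random.default_rng(0)
n_devices = 30
adjacency = rng.random((n_devices, n_devices)) < 0.15
adjacency = adjacency | adjacency.T                # undirected connections
np.fill_diagonal(adjacency, False)
anomaly = rng.random(n_devices)                    # placeholder anomaly signal
anomaly[[3, 17]] += 2.0                            # two artificially "compromised" devices

pheromone = np.ones(n_devices)
evaporation = 0.9

for _ in range(200):                               # 200 ant walks
    node = rng.integers(n_devices)
    for _ in range(10):                            # each walk is 10 hops
        pheromone[node] += anomaly[node]           # deposit scales with anomaly
        neighbours = np.flatnonzero(adjacency[node])
        if neighbours.size == 0:
            break
        # move preferentially towards anomalous, already-marked neighbours
        weights = pheromone[neighbours] * (1.0 + anomaly[neighbours])
        node = rng.choice(neighbours, p=weights / weights.sum())
    pheromone *= evaporation                       # evaporate after each walk

flagged = np.argsort(pheromone)[-3:][::-1]
print("devices flagged for inspection:", flagged)
```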
To maintain a sustained competitive position in the contemporary knowledge economy, organizations, as open social systems, must be able to learn and to adapt properly to rapid change so that organizational objectives are achieved efficiently and effectively. A multilevel approach is adopted, proposing that organizational learning suffers from a lack of attention to the strategic competitive performance of the organization. This remains implicit in almost all models of organizational learning, and there is little focus on how learning organizations achieve sustainable competitive advantage. A dynamic model that captures t…
The present study deals with the strategies used in Arabic translations of the most popular genres of children’s literature, namely fairy tales and fables, in an attempt to identify the best methods and strategies to be adopted in translating these genres so as to fulfill the ultimate purpose of enriching children’s knowledge while attracting their interest and arousing the joy sought in every piece of literature.
The study sets off from three dominating trends: the first calls for the adoption of the domestication strategy as the most appropriate and effective strategy in translating for children. In the same vein, the second opposes using the foreignization strategy, w…
Every so often, a confluence of novel technologies emerges that radically transforms every aspect of industry, the global economy, and, ultimately, the way we live. These sharp leaps of human ingenuity are known as industrial revolutions, and we are currently in the midst of the fourth such revolution, coined Industry 4.0 by the World Economic Forum. Building on their guideline set of technologies encompassing Industry 4.0, we present a full set of pillar technologies on which Industry 4.0 project portfolio management rests, as well as the foundation technologies that support these pillars. A complete model of an Industry 4.0 factory that relies on these pillar technologies is presented. The full set of pillars encompasses cyberph…
In a study of effective bioactive compounds, we synthesized the Co(II), Mn(II), Fe(II), Cu(II), Ni(II), and Zn(II) complexes of the Schiff base derived from trimethoprim and 2'-amino-4-chlorobenzophenone and characterized them by spectroscopic (NMR, IR, mass, UV–Vis), analytical, TGA, and magnetic studies. The solution electronic spectra suggest the stoichiometry of the synthesized complexes, and elemental analysis indicated square planar and octahedral geometries for the compounds. The prepared metal complexes showed enhanced efficacy against the screened bacteria (Escherichia coli and Staphylococcus aureus), with antibacterial activity against Staphylococcus aureus, Salmonella spp., E. coli, Vibrio spp., Pseud…
Mixed-ligand metal complexes were synthesized from oxalic acid with a Schiff base, the Schiff base being obtained from trimethoprim and acetylacetone. The synthesized complexes were of the type [M(L1)(L2)], where the metal, M, is Ni(II), Cu(II), Cr(III), or Zn(II), L1 corresponds to the trimethoprim-derived Schiff base ((Z)-4-((4-amino-5-(3,4,5-trimethoxybenzyl)pyrimidine-2-yl)imino)pentane-2-one) as the first ligand, and L2 represents the oxalate anion (C₂O₄²⁻) as the second ligand. Characterization of the prepared compounds was performed by elemental analysis, molar conductivity, magnetic measurements, 1H-NMR, 13C-NMR, FT-IR, and ultraviolet-visible (UV-Vis) spectral studies. The recorded infrared data are reinforced with density functional theory (DFT) calcul…
Fossil fuel consumption is increasing globally, as it represents the main source of energy around the world, and heavy oil resources exceed light ones; consequently, different techniques are used to reduce the viscosity and increase the mobility of heavy crude oil. This study focuses on experimental tests and modeling, with a back feed-forward artificial neural network (BFF-ANN), of the dilution technique to reduce the viscosity of heavy oil collected from the southern Iraq oil fields using organic solvents, with organic diluents at different weight percentages (5, 10, and 20 wt.%) of n-heptane, toluene, and a mixture of different ratio…
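As a rough illustration of such BFF-ANN modeling, the sketch below fits a small feed-forward network (trained by back-propagation) to predict viscosity from the diluent type and its weight fraction. The handful of data points are placeholder values, not the study's measurements, and the input columns are an assumption about how the experiments could be encoded.

```python
# Illustrative feed-forward ANN regression of heavy-oil viscosity from the
# diluent type and weight fraction. All values below are placeholders.
import numpy as np
from sklearn.compose import ColumnTransformer
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

# Columns: diluent type, diluent concentration (wt.%); viscosities in cP (placeholders)
X = np.array([["n-heptane", 5], ["n-heptane", 10], ["n-heptane", 20],
              ["toluene", 5], ["toluene", 10], ["toluene", 20]], dtype=object)
y = np.array([900.0, 520.0, 210.0, 950.0, 560.0, 240.0])

pre = ColumnTransformer([
    ("diluent", OneHotEncoder(), [0]),   # one-hot encode the diluent type
    ("wt_pct", StandardScaler(), [1]),   # scale the weight percentage
])
model = make_pipeline(pre, MLPRegressor(hidden_layer_sizes=(16, 16),
                                        max_iter=5000, random_state=0))
model.fit(X, y)

# Predict the viscosity for an intermediate dilution ratio
print(model.predict(np.array([["toluene", 15]], dtype=object)))
```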