Breast cancer is a heterogeneous disease characterized by molecular complexity. This research utilized three genetic expression profiles—gene expression, deoxyribonucleic acid (DNA) methylation, and micro ribonucleic acid (miRNA) expression—to deepen the understanding of breast cancer biology and contribute to the development of a reliable survival rate prediction model. During the preprocessing phase, principal component analysis (PCA) was applied to reduce the dimensionality of each dataset before computing consensus features across the three omics datasets. By integrating these datasets with the consensus features, the model's ability to uncover deep connections within the data was significantly improved. The proposed multimodal deep learning multigenetic features (MDL-MG) architecture incorporates a custom attention mechanism (CAM), bidirectional long short-term memory (BLSTM), and convolutional neural networks (CNNs). Additionally, the model was trained with a contrastive loss, extracting discriminative features using a Siamese network (SN) architecture with a Euclidean distance metric. To assess the effectiveness of this approach, various evaluation metrics were applied to The Cancer Genome Atlas breast dataset (TCGA-BREAST). The model achieved 100% accuracy and demonstrated improvements in recall (16.2%), area under the curve (AUC) (29.3%), and precision (10.4%) while reducing complexity. These results highlight the model's efficacy in accurately predicting cancer survival rates.
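The abstract pairs a Siamese network with a contrastive loss over Euclidean distances. A minimal sketch of that objective, assuming the standard Hadsell-style formulation (the paper's network weights and margin value are not given, so the margin below is illustrative):

```python
import numpy as np

def euclidean_distance(a, b):
    # Distance between the two Siamese branch embeddings.
    return np.sqrt(np.sum((a - b) ** 2))

def contrastive_loss(a, b, same_class, margin=1.0):
    # Hadsell-style contrastive loss: similar pairs are pulled
    # together, dissimilar pairs pushed apart up to the margin.
    d = euclidean_distance(a, b)
    if same_class:
        return 0.5 * d ** 2
    return 0.5 * max(0.0, margin - d) ** 2
```

Identical embeddings from the same class incur zero loss, while identical embeddings from different classes incur the full margin penalty, which is what drives the branches to learn distinguishing features.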
In this paper, the transfer function model in time series was estimated using different methods: a parametric approach, represented by the conditional likelihood function method, and two nonparametric approaches, local linear regression and the cubic smoothing spline method. This research aims to compare these capabilities for the nonlinear transfer function model using simulation, studying two models for the output variable and one model for the input variable, in addition t
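One of the two nonparametric estimators named above, local linear regression, can be sketched as a kernel-weighted least-squares fit at each evaluation point. This is a generic illustration, not the paper's estimator; the Gaussian kernel and bandwidth are assumptions:

```python
import numpy as np

def local_linear(x, y, x0, bandwidth=0.5):
    # Kernel-weighted local linear fit at x0: weight observations
    # by a Gaussian kernel, then solve a weighted least-squares
    # problem for intercept and slope around x0.
    w = np.exp(-0.5 * ((x - x0) / bandwidth) ** 2)
    X = np.column_stack([np.ones_like(x), x - x0])
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return beta[0]  # intercept = estimated regression value at x0
```

A useful sanity check on any local linear estimator is that it reproduces a straight line exactly, regardless of bandwidth.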
The denoising of a natural image corrupted by Gaussian noise is a classical problem in signal and image processing. Much work has been done on wavelet thresholding, but most of it has focused on statistical modeling of wavelet coefficients and the optimal choice of thresholds. This paper describes a new method for suppressing noise in an image by fusing stationary wavelet denoising with an adaptive Wiener filter. The Wiener filter is applied to the reconstructed image for the approximation coefficients only, while thresholding is applied to the detail coefficients of the transform; the final denoised image is then obtained by combining the two results. The proposed method was applied by usin
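The hybrid idea can be sketched in one dimension: split the signal into approximation and detail bands, Wiener-filter the approximation, soft-threshold the details, and recombine. This is a minimal sketch using a decimated 1-D Haar transform and a local-statistics Wiener filter standing in for the paper's stationary 2-D wavelet and adaptive Wiener filter:

```python
import numpy as np

def haar_dwt(x):
    # One-level decimated Haar transform (input length must be even).
    x = x.reshape(-1, 2)
    a = (x[:, 0] + x[:, 1]) / np.sqrt(2)  # approximation band
    d = (x[:, 0] - x[:, 1]) / np.sqrt(2)  # detail band
    return a, d

def haar_idwt(a, d):
    # Exact inverse of haar_dwt.
    out = np.empty(2 * a.size)
    out[0::2] = (a + d) / np.sqrt(2)
    out[1::2] = (a - d) / np.sqrt(2)
    return out

def soft_threshold(d, t):
    # Shrink detail coefficients toward zero.
    return np.sign(d) * np.maximum(np.abs(d) - t, 0.0)

def local_wiener(a, win=3, noise_var=0.0):
    # Local-statistics Wiener filter on the approximation band.
    kernel = np.ones(win) / win
    mean = np.convolve(a, kernel, mode='same')
    mean_sq = np.convolve(a ** 2, kernel, mode='same')
    var = np.maximum(mean_sq - mean ** 2, 0.0)
    gain = np.where(var > 1e-12,
                    np.maximum(var - noise_var, 0.0) / np.maximum(var, 1e-12),
                    0.0)
    return mean + gain * (a - mean)

def denoise(x, t=0.1, noise_var=0.0):
    # Fuse: Wiener on approximation, thresholding on details.
    a, d = haar_dwt(x)
    return haar_idwt(local_wiener(a, noise_var=noise_var), soft_threshold(d, t))
```

With the threshold and noise variance set to zero the pipeline reduces to an identity transform, which confirms the decomposition and fusion are lossless.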
A novel artificial neural network (ANN) model was constructed to calibrate a multivariate model for the simultaneous quantitative analysis of a quaternary mixture composed of carbamazepine, carvedilol, diazepam, and furosemide. Eighty-four mixing formulas were prepared and analyzed spectrophotometrically. Each analyte was formulated in six samples at different concentrations; thus, twenty-four samples for the four analytes were tested. A neural network with 10 hidden neurons was able to fit the data with 100% accuracy. The suggested model can be applied to the quantitative chemical analysis of the proposed quaternary mixture.
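The calibration task above (spectra in, concentrations out) can be illustrated with the classical least-squares baseline that ANN calibration improves on for nonlinear data. The spectra and concentrations below are synthetic stand-ins, not the paper's data, and the linear Beer-Lambert mixing is an assumption:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical pure-component spectra: 4 analytes x 20 wavelengths.
S = rng.uniform(0.1, 1.0, size=(20, 4))
# Known concentrations of synthetic calibration mixtures.
C_train = rng.uniform(0.0, 10.0, size=(30, 4))
A_train = C_train @ S.T  # Beer-Lambert: mixture absorbance is additive

# Calibration: regress concentrations on spectra via least squares.
B, *_ = np.linalg.lstsq(A_train, C_train, rcond=None)

# Predict an unseen mixture from its spectrum.
c_true = np.array([2.0, 5.0, 1.0, 3.0])
c_pred = (c_true @ S.T) @ B
```

For perfectly linear synthetic data the least-squares calibration recovers the concentrations exactly; the paper's ANN targets the case where spectral interferences make the mapping nonlinear.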
The present study investigated drought in Iraq using rainfall data obtained from 39 meteorological stations over the past 30 years (1980-2010). The drought coefficient was calculated on the basis of the standardized precipitation index (SPI), and the characteristics of drought magnitude, duration, and intensity were then analyzed. The correlation and regression between drought magnitude and duration were obtained according to the SPI. The results show that drought magnitude values were greater in the northeast region of Iraq.
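The magnitude and duration quantities above can be sketched from an SPI series. A minimal sketch, with two simplifications: the full SPI fits a gamma distribution and maps to a standard normal, whereas the z-score below is a common approximation, and the event threshold of -1.0 is an assumption:

```python
import numpy as np

def spi_zscore(rain):
    # Simplified SPI: standardize rainfall totals.
    rain = np.asarray(rain, dtype=float)
    return (rain - rain.mean()) / rain.std()

def drought_events(spi, threshold=-1.0):
    # Contiguous runs below the threshold; each event is reported
    # as (duration, magnitude) with magnitude = -sum(SPI) over the run.
    events, dur, mag = [], 0, 0.0
    for v in spi:
        if v < threshold:
            dur += 1
            mag -= v
        elif dur:
            events.append((dur, mag))
            dur, mag = 0, 0.0
    if dur:
        events.append((dur, mag))
    return events
```

Pairing each event's duration with its magnitude gives exactly the sample pairs used for the magnitude-duration correlation and regression described above.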
Steganography is a technique of concealing secret data within other ordinary files of the same or different types. Hiding data has become essential to digital information security. This work aims to design a stego method that can effectively hide a message inside the images of a video file. In this work, a video steganography model is proposed by training a model to hide a video (or images) within another video using convolutional neural networks (CNNs). By using a CNN in this approach, two main goals of any steganographic method can be achieved. The first is increased security (resistance to being observed or broken by a steganalysis program), which was achieved in this work because the weights and architecture are randomized. Thus,
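The cover/secret framing can be illustrated with the classical least-significant-bit (LSB) baseline that CNN-based embedding is designed to outperform. This is explicitly not the paper's CNN method, just the simplest embed/extract pair for a single cover frame:

```python
import numpy as np

def embed_lsb(cover, secret_bits):
    # Classical LSB baseline (not the paper's CNN): overwrite the
    # least-significant bit of the first len(secret_bits) pixels.
    flat = cover.flatten()  # flatten() copies, so cover is untouched
    n = len(secret_bits)
    flat[:n] = (flat[:n] & 0xFE) | secret_bits
    return flat.reshape(cover.shape)

def extract_lsb(stego, n_bits):
    # Recover the hidden bit stream from the LSB plane.
    return stego.flatten()[:n_bits] & 1
```

Each stego pixel differs from the cover by at most one intensity level, which is why LSB embedding is visually imperceptible yet trivially detectable by steganalysis, the weakness the randomized CNN weights above are meant to address.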
The effect of the initial pressure on the laminar flame speed of methane-air mixtures has been investigated experimentally over a wide range of equivalence ratios. In this work, a measurement system was designed to measure the laminar flame speed using the constant-volume method with a thermocouple technique. The laminar burning velocity was measured using the density ratio method. Comparison of the present results with previous ones shows good agreement, indicating that the measurements and calculations employed in the present work are successful and precise.
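In its simplest form, the density ratio method converts the observed spatial flame speed to the laminar burning velocity through the burned-to-unburned gas density ratio. The values below are illustrative only, not measurements from the paper:

```python
def burning_velocity(flame_speed, rho_burned, rho_unburned):
    # Density ratio method (simplest form): S_u = S_s * (rho_b / rho_u),
    # where S_s is the observed spatial flame propagation speed.
    return flame_speed * rho_burned / rho_unburned

# Illustrative values: S_s in m/s, densities in kg/m^3.
S_u = burning_velocity(flame_speed=2.8, rho_burned=0.15, rho_unburned=1.12)
```

Because the burned gas is much less dense than the unburned mixture, the burning velocity is a small fraction of the visually observed flame speed, typically an order of magnitude smaller.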
Elemental capture spectroscopy (ECS) is an important tool in the petroleum industry for determining the composition and properties of rock formations in a reservoir. Knowledge of the types and abundance of different minerals in the reservoir is crucial for accurate petrophysical interpretation, reservoir engineering practices, and stratigraphic correlation. ECS measures the elemental content of the rock, which directly impacts several physical properties that are essential for reservoir characterization, such as porosity, fluid saturation, permeability, and matrix density. The ability to accurately determine these properties leads to better reservoir mapping, improved production, and more effective resource management. Accurately determi
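One of the properties named above, matrix density, follows from ECS-derived mineral fractions by a standard volume-weighted mixing rule. A minimal sketch; the mineral fractions are hypothetical and the grain densities are approximate textbook values, not data from the paper:

```python
def matrix_density(fractions, densities):
    # Volume-weighted matrix density from mineral volume fractions:
    # rho_ma = sum(V_i * rho_i) / sum(V_i).
    total = sum(fractions)
    return sum(f * d for f, d in zip(fractions, densities)) / total

# Hypothetical rock: quartz, calcite, illite with approximate
# textbook grain densities of 2.65, 2.71, 2.79 g/cc.
rho_ma = matrix_density([0.6, 0.3, 0.1], [2.65, 2.71, 2.79])
```

The same weighted-sum pattern underlies the porosity and saturation calculations mentioned above, which is why accurate mineral abundances propagate directly into better petrophysical interpretation.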
Cloud storage provides scalable and low-cost resources, featuring economies of scale based on a cross-user architecture. As the amount of outsourced data grows explosively, data deduplication, a technique that eliminates data redundancy, becomes essential. The most important cloud service is data storage. To protect the privacy of data owners, data are stored in the cloud in encrypted form. However, encrypted data introduce new challenges for cloud data deduplication, which is crucial for data storage. Traditional deduplication schemes cannot work on encrypted data, and existing solutions for encrypted data deduplication suffer from security weaknesses. This paper proposes a scheme combining compressive sensing and video deduplication to maximize
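The core difficulty named above, that conventional encryption maps identical plaintexts to different ciphertexts and so defeats deduplication, is usually addressed with convergent encryption: deriving the key from the content itself. This is the standard building block, not the paper's compressive-sensing scheme, and the XOR keystream below is a toy cipher for illustration only, not secure for real use:

```python
import hashlib

def convergent_key(data: bytes) -> bytes:
    # Key derived from the content, so identical plaintexts produce
    # identical ciphertexts and remain deduplicable.
    return hashlib.sha256(data).digest()

def toy_encrypt(data: bytes, key: bytes) -> bytes:
    # Toy XOR keystream built from SHA-256 in counter mode.
    # Illustration only: NOT a secure cipher.
    out = bytearray()
    for i in range(0, len(data), 32):
        block = hashlib.sha256(key + i.to_bytes(8, 'big')).digest()
        out.extend(b ^ k for b, k in zip(data[i:i + 32], block))
    return bytes(out)

store = {}

def upload(data: bytes):
    # Server-side dedup: identical content yields the same
    # ciphertext fingerprint, so only one copy is stored.
    key = convergent_key(data)
    ct = toy_encrypt(data, key)
    fp = hashlib.sha256(ct).hexdigest()
    is_duplicate = fp in store
    store[fp] = ct
    return fp, is_duplicate
```

Uploading the same block twice stores it once; the known weakness of this building block (offline brute-force against predictable content) is one reason schemes like the one proposed above add further machinery.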
In this work, we first construct Hermite wavelets on the interval [0,1) together with their product; the 2^k M×2^k M operational matrix of integration is derived and used for solving nonlinear variational problems by reducing them to a system of algebraic equations with the aid of a direct method. Finally, some examples are given to illustrate the efficiency and performance of the presented method.
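The operational-matrix idea above can be illustrated with a simpler basis: for block-pulse functions (used here in place of the Hermite wavelets of the paper), integrating the basis vector is approximated by multiplying by a fixed matrix P, so integration of any expansion reduces to matrix algebra:

```python
import numpy as np

def block_pulse_P(m):
    # Operational matrix of integration for m block-pulse functions
    # on [0,1): integral of basis function j is ~h/2 on its own
    # block and ~h on every later block, so P is upper triangular.
    h = 1.0 / m
    return h * (0.5 * np.eye(m) + np.triu(np.ones((m, m)), k=1))

# Check: integrating f(t) = 1 (all coefficients one) should give
# coefficients approximating g(t) = t at the block midpoints.
m = 8
P = block_pulse_P(m)
coeffs = np.ones(m)
approx = coeffs @ P
midpoints = (np.arange(m) + 0.5) / m
```

The paper's contribution is the analogous (and more accurate) 2^k M×2^k M matrix for the Hermite wavelet basis, which turns the variational problem into the algebraic system mentioned above.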