Breast cancer is a heterogeneous disease characterized by molecular complexity. This research utilized three omics profiles, namely gene expression, deoxyribonucleic acid (DNA) methylation, and micro ribonucleic acid (miRNA) expression, to deepen the understanding of breast cancer biology and to develop a reliable survival rate prediction model. During the preprocessing phase, principal component analysis (PCA) was applied to reduce the dimensionality of each dataset before computing consensus features across the three omics datasets. Integrating the datasets through these consensus features significantly improved the model's ability to uncover deep connections within the data. The proposed multimodal deep learning multigenetic features (MDL-MG) architecture incorporates a custom attention mechanism (CAM), bidirectional long short-term memory (BLSTM), and convolutional neural networks (CNNs). In addition, the model was trained with a contrastive loss, using a Siamese network (SN) architecture with a Euclidean distance metric to extract discriminative features. To assess the effectiveness of this approach, various evaluation metrics were applied to The Cancer Genome Atlas (TCGA-BREAST) dataset. The model achieved 100% accuracy and demonstrated improvements in recall (16.2%), area under the curve (AUC) (29.3%), and precision (10.4%) while reducing complexity. These results highlight the model's efficacy in accurately predicting cancer survival rates.
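As a minimal sketch of the pairwise objective described above, the snippet below implements a contrastive loss over the Euclidean distance between twin embeddings. The small linear encoder merely stands in for the full CAM/BLSTM/CNN stack, and the margin value and tensor shapes are illustrative assumptions, not settings from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ContrastiveLoss(nn.Module):
    """Contrastive loss over the Euclidean distance between twin embeddings.

    label = 1 for pairs from the same class, 0 otherwise; the margin is a
    hypothetical hyperparameter (the abstract does not report one).
    """
    def __init__(self, margin: float = 1.0):
        super().__init__()
        self.margin = margin

    def forward(self, emb_a, emb_b, label):
        dist = F.pairwise_distance(emb_a, emb_b)               # Euclidean distance
        pos = label * dist.pow(2)                              # pull similar pairs together
        neg = (1 - label) * F.relu(self.margin - dist).pow(2)  # push dissimilar pairs apart
        return (pos + neg).mean()

# Stand-in encoder: in the paper this would be the CAM + BLSTM + CNN stack.
encoder = nn.Sequential(nn.Linear(64, 32), nn.ReLU(), nn.Linear(32, 16))

x_a, x_b = torch.randn(8, 64), torch.randn(8, 64)  # a batch of feature pairs
y = torch.randint(0, 2, (8,)).float()              # 1 = same survival class
loss = ContrastiveLoss()(encoder(x_a), encoder(x_b), y)
loss.backward()
```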
Among metaheuristic algorithms, population-based algorithms are explorative search methods that are superior to local search algorithms in exploring the search space for globally optimal solutions. However, the primary downside of such algorithms is their low exploitative capability, which prevents a finer search of the neighborhood of current solutions for more optimal ones. The firefly algorithm (FA) is a population-based algorithm that has been widely used in clustering problems. However, FA suffers from premature convergence when no neighborhood search strategy is employed to improve the quality of clustering solutions in the neighborhood region while still exploring the global regions of the search space. On the …
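The canonical firefly move illustrates both properties discussed above: every firefly is attracted toward brighter ones with an attractiveness that decays with distance, plus a small random perturbation, so the swarm explores broadly but, without an added neighborhood search strategy, can collapse prematurely onto one region. A minimal sketch with conventional default hyperparameters (not values from the text):

```python
import numpy as np

def firefly_step(pos, brightness, beta0=1.0, gamma=1.0, alpha=0.2, rng=None):
    """One synchronous iteration of the firefly algorithm (after Yang, 2008).

    pos        : (n, d) array of candidate solutions (e.g. cluster centroids).
    brightness : (n,) fitness values, higher = better.
    beta0, gamma, alpha are conventional defaults, not taken from the text.
    """
    rng = rng or np.random.default_rng()
    new_pos = pos.copy()
    n, d = pos.shape
    for i in range(n):
        for j in range(n):
            if brightness[j] > brightness[i]:        # i is attracted to brighter j
                r2 = np.sum((pos[i] - pos[j]) ** 2)  # squared Euclidean distance
                beta = beta0 * np.exp(-gamma * r2)   # attractiveness decays with distance
                step = beta * (pos[j] - pos[i]) + alpha * (rng.random(d) - 0.5)
                new_pos[i] = new_pos[i] + step       # attraction plus random walk
    return new_pos
```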
Due to the continuing demand for larger bandwidth, optical transport is becoming common in the access network. Using optical fiber technologies, the communications infrastructure becomes powerful, providing very high speeds for transferring large volumes of data. Existing telecommunications infrastructure currently makes wide use of the Passive Optical Network (PON), which applies Wavelength Division Multiplexing (WDM) and is expected to play an important role in the future Internet, supporting a large diversity of services and next-generation networks. This paper presents a design of a WDM-PON network, along with the simulation and analysis of transmission parameters in the Optisystem 7.0 environment for bidirectional traffic. The simulation shows the behavior of optical …
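Transmission-parameter analysis of this kind typically starts from a power budget per wavelength. The sketch below is a back-of-the-envelope illustration only: every value (launch power, losses, receiver sensitivity) is a hypothetical assumption, not a parameter from the Optisystem 7.0 simulation.

```python
# Hypothetical power-budget check for one downstream WDM-PON channel.
launch_power_dbm = 3.0        # transmitter launch power per wavelength
fiber_loss_db_km = 0.2        # attenuation at 1550 nm
fiber_length_km = 20.0        # typical access-network reach
mux_demux_loss_db = 3.5       # AWG insertion loss at each end of the link
splice_connector_db = 1.0     # lumped splice/connector losses
receiver_sensitivity_dbm = -24.0

total_loss_db = (fiber_loss_db_km * fiber_length_km
                 + 2 * mux_demux_loss_db
                 + splice_connector_db)
received_power_dbm = launch_power_dbm - total_loss_db
margin_db = received_power_dbm - receiver_sensitivity_dbm

print(f"received power: {received_power_dbm:.1f} dBm, margin: {margin_db:.1f} dB")
# received power: -9.0 dBm, margin: 15.0 dB -> the link closes with headroom
```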
The complexity of multimedia content is increasing significantly in the current world, leading to an urgent demand for highly effective systems that satisfy human needs. To this day, the handwritten signature is considered an important means of proving identity in banks and businesses, so many works have tried to develop methods for signature recognition. This paper introduces an efficient technique for offline signature recognition that extracts local features using Haar wavelet subbands and their energies. Three different sets of features are obtained by partitioning the signature image into non-overlapping blocks, where different block sizes are used. The CEDAR signature database is used as a dataset for …
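A minimal sketch of this kind of feature extraction, using PyWavelets; the block size and the energy normalization are assumptions rather than the paper's exact settings.

```python
import numpy as np
import pywt

def block_haar_energy(signature_img, block_size=32):
    """Local features for offline signature recognition: split the image into
    non-overlapping blocks, take a 1-level Haar decomposition of each block,
    and keep the energy of every subband (LL, LH, HL, HH)."""
    h, w = signature_img.shape
    feats = []
    for r in range(0, h - block_size + 1, block_size):
        for c in range(0, w - block_size + 1, block_size):
            block = signature_img[r:r + block_size, c:c + block_size]
            ll, (lh, hl, hh) = pywt.dwt2(block, "haar")
            # subband energy = mean of squared coefficients
            feats.extend(float(np.mean(s ** 2)) for s in (ll, lh, hl, hh))
    return np.asarray(feats)
```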
In recent years, an approach has been adopted to distinguish one author or writer from another by analyzing his writings or essays. This is done by analyzing the syllables in an author's writings. A syllable is composed of two letters; therefore, the words of the writing are fragmented into syllables, and the most frequent syllables are extracted to become a trait of that author. The research work analyzes syllable frequencies in two cases: first, when there is a space between the words, and second, when these spaces are ignored. The results are obtained from a program that scans the syllables in the text file; performance is best in the first case, since the frequency of the selected syllables is higher than that of the same syllables in …
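A minimal sketch of the two counting modes described above, treating a "syllable" as a character bigram; the sample text and the cut-off k are illustrative assumptions.

```python
from collections import Counter

def top_syllables(text, k=10, ignore_spaces=False):
    """Two-letter 'syllable' (character bigram) frequencies of a text.

    ignore_spaces=False : bigrams are taken inside words only (case 1).
    ignore_spaces=True  : spaces are removed first, so bigrams may also
                          span word boundaries (case 2).
    """
    if ignore_spaces:
        stream = "".join(text.split())
        grams = [stream[i:i + 2] for i in range(len(stream) - 1)]
    else:
        grams = [w[i:i + 2] for w in text.split() for i in range(len(w) - 1)]
    return Counter(grams).most_common(k)

print(top_syllables("the author writes the way the author thinks", k=3))
```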
In this paper, a method of steganography in audio is introduced for hiding secret data in an audio media file (WAV). Hiding in audio is a challenging discipline, since the Human Auditory System is extremely sensitive. The proposed method embeds a secret text message in the frequency domain of the audio file and consists of two stages: an embedding phase and an extraction phase. In the embedding phase, the audio file is transformed from the time domain to the frequency domain using a 1-level linear wavelet decomposition, and only the high-frequency subband is used for hiding the secret message. The text message is encrypted using the Data Encryption Standard (DES) algorithm. Finally, the Least Significant Bit (LSB) algorithm is used to hide the secret message.
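The embedding phase might look like the following sketch, built on PyWavelets and PyCryptodome. The Haar wavelet, ECB mode, zero-padding, and integer quantization of the detail coefficients are all simplifying assumptions, not the paper's exact choices.

```python
import numpy as np
import pywt
from Crypto.Cipher import DES  # PyCryptodome

def embed_message(samples, message: bytes, key: bytes):
    """Embedding-phase sketch: DES-encrypt the text, take a 1-level DWT of
    the audio, LSB-embed the ciphertext bits in the high-frequency (detail)
    coefficients, and reconstruct the stego audio."""
    pad = (-len(message)) % 8                              # DES uses 8-byte blocks
    cipher = DES.new(key, DES.MODE_ECB).encrypt(message + b"\0" * pad)
    bits = np.unpackbits(np.frombuffer(cipher, dtype=np.uint8))

    approx, detail = pywt.dwt(samples.astype(float), "haar")  # frequency domain
    coeffs = np.round(detail).astype(np.int64)             # quantize so an LSB exists
    assert len(bits) <= len(coeffs), "message too long for this cover audio"
    coeffs[:len(bits)] = (coeffs[:len(bits)] & ~1) | bits  # overwrite LSBs with bits
    return pywt.idwt(approx, coeffs.astype(float), "haar") # back to time domain
```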
Used automobile oils were subjected to filtration to remove solid material and to dehydration, using vacuum distillation under moderate pressure, to remove water, gasoline, and light components; the dehydrated waste oil was then subjected to extraction using liquid solvents. Two solvents, namely n-butanol and n-hexane, were used to extract base oil from used automobile oil, so that the expensive base oil can be reused.
The base oil recovered using the n-butanol solvent gives an 88.67% reduction in carbon residue, a 75.93% reduction in ash content, 93.73% oil recovery, 95% solvent recovery, and a viscosity index of 100.62, at a 5:1 solvent-to-used-oil ratio and an extraction temperature of 40 °C, while using the n-hexane solvent gives (6…
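For reference, the percentage metrics above follow the usual definitions; a minimal worked example with hypothetical masses (the paper reports only the final percentages):

```python
# Illustrative only: the input values below are hypothetical, chosen so the
# carbon-residue case reproduces the ~88.67% figure reported for n-butanol.
cr_before = 2.47   # wt% carbon residue in the dehydrated waste oil
cr_after = 0.28    # wt% carbon residue in the recovered base oil
reduction = (cr_before - cr_after) / cr_before * 100
print(f"carbon-residue reduction: {reduction:.2f}%")          # -> 88.66%

oil_fed, oil_recovered = 100.0, 93.73   # grams in / grams of base oil out
print(f"oil recovery: {oil_recovered / oil_fed * 100:.2f}%")  # -> 93.73%
```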