Breast cancer is a heterogeneous disease characterized by molecular complexity. This research utilized three genetic expression profiles, namely gene expression, deoxyribonucleic acid (DNA) methylation, and micro ribonucleic acid (miRNA) expression, to deepen the understanding of breast cancer biology and to develop a reliable survival rate prediction model. During preprocessing, principal component analysis (PCA) was applied to reduce the dimensionality of each dataset before computing consensus features across the three omics datasets. Integrating these datasets with the consensus features significantly improved the model's ability to uncover deep connections within the data. The proposed multimodal deep learning multigenetic features (MDL-MG) architecture incorporates a custom attention mechanism (CAM), bidirectional long short-term memory (BLSTM), and convolutional neural networks (CNNs). Additionally, the model was optimized with a contrastive loss, extracting discriminative features through a Siamese network (SN) architecture with a Euclidean distance metric. To assess the effectiveness of this approach, various evaluation metrics were applied to The Cancer Genome Atlas (TCGA-BREAST) dataset. The model achieved 100% accuracy and demonstrated improvements in recall (16.2%), area under the curve (AUC) (29.3%), and precision (10.4%) while reducing complexity. These results highlight the model's efficacy in accurately predicting cancer survival rates.
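To make the pairing objective concrete, the following is a minimal sketch of a contrastive loss over Euclidean distances between embedding pairs, the kind of objective used when training a Siamese network. The embedding size, margin, and batch shape are illustrative assumptions, not the exact MDL-MG configuration.

```python
# Minimal sketch: contrastive loss with a Euclidean distance metric, as used
# when training a Siamese network. Layer sizes, margin, and batch shape are
# illustrative assumptions, not the authors' exact MDL-MG settings.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ContrastiveLoss(nn.Module):
    def __init__(self, margin: float = 1.0):
        super().__init__()
        self.margin = margin

    def forward(self, emb_a, emb_b, label):
        # label = 1 for pairs from the same class, 0 for pairs from different classes
        dist = F.pairwise_distance(emb_a, emb_b)                      # Euclidean distance
        loss_same = label * dist.pow(2)                               # pull similar pairs together
        loss_diff = (1 - label) * F.relu(self.margin - dist).pow(2)   # push dissimilar pairs apart
        return (loss_same + loss_diff).mean()

# Toy usage with random 64-dimensional embeddings
emb_a, emb_b = torch.randn(8, 64), torch.randn(8, 64)
label = torch.randint(0, 2, (8,)).float()
print(ContrastiveLoss()(emb_a, emb_b, label))
```

Pairs labelled as similar are drawn together in the embedding space, while dissimilar pairs are pushed apart beyond the margin, which is what makes the learned features discriminative.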
In cognitive radio networks there are two important probabilities. The first, the probability of detection, is important to primary users because it indicates their level of protection from secondary users; the second, the probability of false alarm, is important to secondary users because it determines their use of unoccupied channels. Cooperative sensing can improve both probabilities. A new approach for determining optimal values of these probabilities is proposed for the case of multiple secondary users: an optimal threshold value is first found for each individual detection curve, and the thresholds are then optimized jointly. To get the aggregated throughput over transmission
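As a rough illustration of how a sensing threshold trades the probability of detection against the probability of false alarm, the sketch below evaluates both probabilities for a simple energy detector under the usual Gaussian approximation. The noise power, signal power, and sample count are assumed values for illustration, not figures from the study.

```python
# Hedged sketch: false-alarm and detection probabilities of an energy detector
# under the Gaussian (central limit) approximation. Power levels and the number
# of sensing samples below are illustrative assumptions only.
import numpy as np
from scipy.stats import norm

def pfa(threshold, noise_power, n_samples):
    # P(test statistic > threshold | no primary user present)
    return norm.sf((threshold - n_samples * noise_power) /
                   (noise_power * np.sqrt(2 * n_samples)))

def pd(threshold, noise_power, signal_power, n_samples):
    # P(test statistic > threshold | primary user present)
    total = noise_power + signal_power
    return norm.sf((threshold - n_samples * total) /
                   (total * np.sqrt(2 * n_samples)))

# Sweep the threshold to trace one detection curve (Pd versus Pfa)
for t in np.linspace(950, 1150, 5):
    print(f"threshold={t:7.1f}  Pfa={pfa(t, 1.0, 1000):.3f}  Pd={pd(t, 1.0, 0.1, 1000):.3f}")
```

Sweeping the threshold traces out one detection curve; repeating this for each secondary user and then selecting the thresholds jointly is the optimization the abstract describes.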
Smart water flooding (low salinity water flooding) has mainly been investigated in sandstone reservoirs. The main reasons for using low salinity water flooding are to improve oil recovery and to support the reservoir pressure.
In this study, two sandstone core plugs with different permeabilities, taken from southern Iraq, were used to examine the effect of injecting water with different ion concentrations on oil recovery. The water types used were formation water, seawater, modified low salinity water, and deionized water.
The effects of water salinity, the flow rate of injected water, and the permeability of the core plugs were studied in order to determine the best conditions of low salinity
Many tools and techniques have recently been adopted to develop construction materials that are less harmful and friendlier to the environment. New products can be obtained by recycling waste materials; thus, this study aims to use recycled glass bottles as a sustainable material.
Our challenge is to use nano glass powder, added to or replacing part of the cement by weight, to produce concrete with enhanced strength.
A nano recycled glass p
The planning, design, and construction of excavations and foundations in soft to very soft clay soils are always difficult. These are problematic soils that cause trouble for structures built on them because of their low shear strength, high water content, and high compressibility. This work investigates the geotechnical behavior of soft clay treated with tyre ash material burnt in air. The investigation includes the following tests: physical tests, chemical tests, a consolidation test, compaction tests, a shear test, the California Bearing Ratio (CBR) test, and model tests. These tests were performed on samples prepared from soft clay soil, with tyre ash added at four percentages (2, 4, 6, and 8%). The results of the tests were: the soil samples which
This investigation was carried out to study the treatment and recycling of wastewater in the cotton textile industry for an effluent containing three dyes: direct blue, sulphur black, and vat yellow. The reuse of such an effluent can only be made possible by an appropriate treatment method such as chemical coagulation. Ferrous and ferric sulphate, with and without calcium hydroxide, were employed in this study as the chemical coagulants.
The results showed that the percentage removal of direct blue ranged between 91.4 and 94%, for sulphur black it ranged between 98.7 and 99.5%, while for vat yellow it was between 97 and 99%.
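For reference, removal percentages of this kind are computed from influent and effluent dye concentrations. The short sketch below shows the arithmetic; the concentration values are hypothetical, and only the reported removal ranges come from the study.

```python
# Illustrative removal-efficiency calculation from influent/effluent dye
# concentrations; the 50 mg/L feed and 3.5 mg/L effluent are made-up numbers.
def removal_percent(c_initial: float, c_final: float) -> float:
    return 100.0 * (c_initial - c_final) / c_initial

print(removal_percent(50.0, 3.5))   # -> 93.0, i.e. within the 91.4-94% range for direct blue
```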
Investigating human mobility patterns is a highly interesting field in the 21st century, attracting vast attention from scientists across disciplines such as physics, economics, sociology, computing, and engineering, building on the concept that human mobility patterns are related to communication patterns. Hence, the need for a rich repository of data has emerged, and the most powerful solution is the use of GSM network data, which provides millions of Call Detail Records collected from urban regions. However, the available data still have shortcomings, because they give spatio-temporal information only at the moments of mobile communication activity. In th
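As a small illustration of how such records yield spatio-temporal points, the sketch below groups Call Detail Records per user into a time-ordered trace. The field names and sample rows are assumptions, since real CDR schemas differ between operators.

```python
# Hedged sketch: turning Call Detail Records into per-user spatio-temporal
# traces. The schema (user id, timestamp, cell latitude/longitude) and the
# sample rows are assumptions for illustration.
from collections import defaultdict
from datetime import datetime

cdr_rows = [
    ("u1", "2024-05-01 08:12:00", 33.31, 44.36),
    ("u1", "2024-05-01 17:45:00", 33.34, 44.40),
    ("u2", "2024-05-01 09:03:00", 33.28, 44.33),
]

trajectories = defaultdict(list)
for user, ts, lat, lon in cdr_rows:
    trajectories[user].append((datetime.strptime(ts, "%Y-%m-%d %H:%M:%S"), lat, lon))

for user, points in trajectories.items():
    points.sort()                 # each user's locations, observed only at call/SMS events
    print(user, points)
```

The trace only contains locations at communication events, which is exactly the shortcoming the abstract points out.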
Methods of speech recognition have been the subject of several studies over the past decade. Speech recognition has been one of the most exciting areas of signal processing. The mixed transform is a useful tool for speech signal processing; it was developed for its ability to improve feature extraction. Speech recognition includes three important stages: preprocessing, feature extraction, and classification. Recognition accuracy is strongly affected by the feature extraction stage; therefore, different models of the mixed transform were proposed for feature extraction. The recorded isolated words are 1-D signals, so each 1-D word is converted into a 2-D form. The second step of the word recognizer requires the
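The sketch below shows one common way to convert a 1-D recorded word into a 2-D form by framing, followed by a 2-D transform for feature extraction. The frame length and the use of a 2-D DCT are illustrative assumptions and not the paper's exact mixed transform.

```python
# Hedged sketch: reshape a 1-D speech signal into a 2-D matrix of frames, then
# apply a 2-D transform (here a separable DCT, as a stand-in for the mixed
# transform) for feature extraction. Frame length is an assumed value.
import numpy as np
from scipy.fft import dct

def to_2d(signal: np.ndarray, frame_len: int = 256) -> np.ndarray:
    # pad so the signal divides evenly into frames, then stack frames as rows
    n_frames = int(np.ceil(len(signal) / frame_len))
    padded = np.pad(signal, (0, n_frames * frame_len - len(signal)))
    return padded.reshape(n_frames, frame_len)

word = np.random.randn(16000)        # stand-in for one recorded isolated word
frames = to_2d(word)                 # 2-D representation (frames x samples)
features = dct(dct(frames, axis=0, norm="ortho"), axis=1, norm="ortho")
print(frames.shape, features.shape)
```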
Steganography can be defined as the art and science of hiding information within data that can be read by a computer. Neither the eye nor the computer can distinguish the stego-cover from the original when examining statistical samples. This paper presents a new method to hide text within text characters. The systematic method uses the structure of an invisible character to hide and extract secret texts. The creation of the secret message comprises four main stages: using the letters from the original message, selecting a suitable cover text, dividing the cover text into blocks, and hiding the secret text using the invisible character; the cover text and stego-object are then compared. This study uses an invisible character (white space
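To illustrate the general idea of hiding text with invisible characters, the sketch below encodes each secret bit as a zero-width character appended to a cover word. The specific characters and the block scheme in the paper may differ; this is only an assumed, simplified variant.

```python
# Hedged sketch of text-in-text steganography using invisible characters:
# each secret bit is encoded as a zero-width space (0) or zero-width
# non-joiner (1) appended after a cover word. A simplified stand-in, not the
# paper's exact scheme.
ZW0, ZW1 = "\u200b", "\u200c"

def hide(cover: str, secret: str) -> str:
    bits = "".join(f"{ord(c):08b}" for c in secret)
    words = cover.split(" ")
    assert len(bits) <= len(words), "cover text too short for this secret"
    out = [w + (ZW0 if b == "0" else ZW1) for w, b in zip(words, bits)]
    return " ".join(out + words[len(bits):])

def extract(stego: str) -> str:
    bits = "".join("0" if ZW0 in w else "1"
                   for w in stego.split(" ") if ZW0 in w or ZW1 in w)
    return "".join(chr(int(bits[i:i + 8], 2)) for i in range(0, len(bits), 8))

cover = "this is a sufficiently long cover text " * 8
print(extract(hide(cover, "hi")))    # -> "hi"; the stego text looks identical on screen
```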
Fraud includes acts involving deception by multiple parties inside and outside companies in order to obtain economic benefits at the expense of those companies. Fraud is committed when three factors are present: opportunity, motivation, and rationalization. Detecting fraud requires indications of its possible existence. Here, Benford's law can play an important role in directing attention toward possible financial fraud in a company's accounting records, saving the effort and time required to detect and prevent it.
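As a simple illustration of a Benford's-law screen, the sketch below compares the observed leading-digit shares of a set of amounts with the distribution the law predicts. The sample amounts are made up, and the chi-square comparison is one common choice for flagging records that deserve a closer look rather than a proof of fraud.

```python
# Illustrative Benford's-law check: compare observed first-digit frequencies
# of accounting amounts against log10(1 + 1/d). Sample amounts are made up.
import math
from collections import Counter

def benford_expected(d: int) -> float:
    return math.log10(1 + 1 / d)          # expected share of leading digit d

def first_digit_shares(amounts):
    digits = [int(str(abs(a)).lstrip("0.")[0]) for a in amounts if a != 0]
    counts = Counter(digits)
    return {d: counts.get(d, 0) / len(digits) for d in range(1, 10)}

amounts = [123.4, 187.0, 245.9, 310.0, 1450.2, 96.5, 112.0, 880.3, 133.7, 209.1]
observed = first_digit_shares(amounts)
chi2 = len(amounts) * sum((observed[d] - benford_expected(d)) ** 2 / benford_expected(d)
                          for d in range(1, 10))
print({d: round(observed[d], 2) for d in range(1, 10)}, "chi2 =", round(chi2, 2))
```

A large chi-square value signals a departure from the expected digit distribution and points the auditor toward records worth examining in more detail.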