The dynamic development of computer and software technology in recent years has been accompanied by the expansion and widespread adoption of artificial intelligence (AI) methods in many aspects of human life. A prominent field of rapid progress is high-throughput biology, which generates large amounts of data that must be processed and analyzed. AI methods are therefore increasingly applied in the biomedical field, among others for predicting RNA-protein binding sites, DNA sequence function, and protein-protein interactions, and for biomedical image classification. Stem cells are widely used in biomedical research, e.g., in studies of leukemia and other diseases. Our proposed Deep Bayesian Neural Network (DBNN) approach for the personalized treatment of leukemia showed significant tested accuracy: the DBNN used in this study classified images with an accuracy exceeding 98.73%. This study shows that a DBNN can classify cell cultures based solely on unstained light-microscope images, which permits their further use. A Bayesian-based model could therefore be of great help in commercial cell culturing, and a possible first step toward an automated or semi-automated neural-network model for classifying good- and bad-quality cultures once images of such cultures become available.
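The distinguishing feature of a Bayesian network, the ability to attach uncertainty to each prediction rather than emit a single point estimate, can be illustrated with a minimal, hypothetical sketch using Monte Carlo dropout (averaging many stochastic forward passes) on a toy one-layer model. This is not the study's DBNN architecture; all weights and constants below are illustrative assumptions:

```python
import math
import random
import statistics

def stochastic_forward(x, weights, drop_p=0.5, rng=random):
    """One forward pass in which each weight is randomly dropped."""
    kept = [w for w in weights if rng.random() > drop_p]
    s = sum(w * x for w in kept)
    return 1.0 / (1.0 + math.exp(-s))  # logistic output in (0, 1)

def predict_with_uncertainty(x, weights, n_samples=200, seed=0):
    """Mean and spread over many stochastic passes approximate
    the predictive distribution of a Bayesian network."""
    rng = random.Random(seed)
    samples = [stochastic_forward(x, weights, rng=rng) for _ in range(n_samples)]
    return statistics.mean(samples), statistics.stdev(samples)

mean, spread = predict_with_uncertainty(1.0, [0.4, -0.2, 0.9])
```

A large spread flags inputs on which the classifier should not be trusted, which is exactly the property that matters when deciding whether a culture image needs human review.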
An artificial pancreas is simulated to manage Type I diabetic patients under intensive care by automatically controlling the insulin infusion rate. A backstepping technique is used to apply the effect of a PID controller to the blood glucose level, since there is no direct relation between insulin infusion (the manipulated variable) and the glucose level in Bergman's system model; the model is subjected to an oral glucose tolerance test by applying a meal translated into a disturbance. The backstepping technique is commonly recommended to stabilize and control the states of Bergman's class of nonlinear systems. The results showed very satisfactory behavior of the glucose deviation in response to a sudden rise caused by the meal, which increases the blood glucose level.
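The closed loop described above can be sketched as a plain Euler simulation of Bergman's minimal model with a PID controller driving the insulin infusion rate u(t). The parameter values, controller gains, and meal-disturbance shape below are illustrative assumptions, not those of the cited study (which additionally uses backstepping rather than direct PID actuation):

```python
import math

P1, P2, P3 = 0.028735, 0.028344, 5.035e-5  # minimal-model parameters (assumed)
N_I = 0.09                                  # insulin clearance rate (assumed)
GB, IB = 81.0, 15.0                         # basal glucose and insulin levels

def simulate(minutes=400, dt=1.0, kp=0.02, ki=0.0001, kd=0.05):
    """Euler-integrate Bergman's model under a decaying meal disturbance."""
    G, X, I = GB, 0.0, IB
    integral, prev_err = 0.0, 0.0
    trace = []
    for step in range(int(minutes / dt)):
        meal = 1.5 * math.exp(-0.05 * step * dt)  # meal disturbance D(t)
        err = G - GB
        integral += err * dt
        # PID output, clipped: infusion rate cannot be negative.
        u = max(0.0, kp * err + ki * integral + kd * (err - prev_err) / dt)
        prev_err = err
        dG = -P1 * (G - GB) - X * G + meal
        dX = -P2 * X + P3 * (I - IB)
        dI = -N_I * (I - IB) + u
        G, X, I = G + dt * dG, X + dt * dX, I + dt * dI
        trace.append(G)
    return trace

glucose = simulate()
```

The trace rises with the meal, then is pulled back toward the basal level as insulin action builds up.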
The Fourth Industrial Revolution represents an advanced stage of technological development, characterized by the integration of digital, physical, and biological technologies, with a strong focus on smart connectivity and advanced data analysis. At the core of this revolution stands Artificial Intelligence (AI), which enables the processing of vast amounts of data, decision-making with speed and accuracy, automation of processes, and enhancement of productivity and quality. This research examines the transformative role of AI in the humanities, particularly in archaeological, historical, and geographical studies, where traditional methods face limitations in handling complex and extensive datasets. The study aims to highlight these l
Estimating the semantic similarity between short texts plays an increasingly prominent role in many fields related to text mining and natural language processing applications, especially with the large increase in the volume of textual data that is produced daily. Traditional approaches for calculating the degree of similarity between two texts, based on the words they share, do not perform well with short texts because two similar texts may be written in different terms by employing synonyms. As a result, short texts should be semantically compared. In this paper, a semantic similarity measurement method between texts is presented which combines knowledge-based and corpus-based semantic information to build a semantic network that repre
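The limitation described above is easy to demonstrate: a plain bag-of-words cosine misses synonym matches, while a small knowledge-based normalization step recovers them. The synonym table below is a toy stand-in for a real lexical resource (e.g., WordNet), not the paper's semantic network:

```python
import math
from collections import Counter

# Toy knowledge base (assumption): maps words to a canonical synonym.
SYNONYMS = {"car": "automobile", "buy": "purchase", "big": "large"}

def cosine(tokens_a, tokens_b):
    """Corpus-style cosine similarity over raw token counts."""
    ca, cb = Counter(tokens_a), Counter(tokens_b)
    dot = sum(ca[t] * cb[t] for t in ca)
    na = math.sqrt(sum(v * v for v in ca.values()))
    nb = math.sqrt(sum(v * v for v in cb.values()))
    return dot / (na * nb) if na and nb else 0.0

def semantic_similarity(s1, s2):
    """Normalize synonyms first, then compare: rewards shared meaning."""
    norm = lambda s: [SYNONYMS.get(t, t) for t in s.lower().split()]
    return cosine(norm(s1), norm(s2))

plain = cosine("I buy a car".lower().split(),
               "I purchase an automobile".lower().split())
semantic = semantic_similarity("I buy a car", "I purchase an automobile")
```

The two sentences share only "I" literally, so the plain cosine is low, while the synonym-normalized score is much higher.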
The utilization of artificial intelligence techniques has garnered significant interest in recent research due to their pivotal role in enhancing the quality of educational offerings. This study investigated the impact of employing artificial intelligence techniques on improving the quality of educational services, as perceived by students enrolled in the College of Pharmacy at the University of Baghdad. The study sample comprised 379 male and female students. A descriptive-analytical approach was used, with a questionnaire as the primary tool for data collection. The findings indicated that the application of artificial intelligence methods was highly effective, and the educational services provided to students were of exceptional
Face recognition is a crucial biometric technology used in various security and identification applications. Ensuring accuracy and reliability in facial recognition systems requires robust feature extraction and secure processing methods. This study presents an accurate facial recognition model using a feature extraction approach within a cloud environment. First, the facial images undergo preprocessing, including grayscale conversion, histogram equalization, Viola-Jones face detection, and resizing. Then, features are extracted using a hybrid approach that combines Linear Discriminant Analysis (LDA) and Gray-Level Co-occurrence Matrix (GLCM). The extracted features are encrypted using the Data Encryption Standard (DES) for security
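Of the two feature extractors named, the Gray-Level Co-occurrence Matrix is simple enough to sketch directly. A minimal version for a tiny 4-level image, computing the classic contrast and energy texture features; the offsets, level count, and sample image are illustrative, and the real pipeline would operate on the preprocessed face images:

```python
def glcm(image, dx=1, dy=0, levels=4):
    """Normalized co-occurrence counts of gray-level pairs at offset (dx, dy)."""
    m = [[0] * levels for _ in range(levels)]
    h, w = len(image), len(image[0])
    for y in range(h):
        for x in range(w):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w:
                m[image[y][x]][image[ny][nx]] += 1
    total = sum(sum(row) for row in m)
    return [[v / total for v in row] for row in m]

def contrast(p):
    """High when co-occurring levels differ strongly."""
    n = len(p)
    return sum(p[i][j] * (i - j) ** 2 for i in range(n) for j in range(n))

def energy(p):
    """High for uniform, orderly textures."""
    return sum(v * v for row in p for v in row)

img = [[0, 0, 1, 1],
       [0, 0, 1, 1],
       [2, 2, 3, 3],
       [2, 2, 3, 3]]
p = glcm(img)
features = [contrast(p), energy(p)]
```

In practice several offsets (dx, dy) are computed and their texture features concatenated into the feature vector that LDA then projects.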
Focal adhesion kinase (FAK), ephrin receptor type A4 (EphA4), and adiponectin (ADPN) are important indicators of inflammation, tumor growth, migration, and angiogenesis in some cancers. The prognostic impact of their concentrations in acute myeloid leukemia (AML) patients remains to be identified. The research sought to explore the role of FAK, EphA4, and ADPN as prognostic biomarkers, their influence on patient survival, and any potential correlation between their levels and hematological parameters in AML patients.
In this research, the semiparametric Bayesian method is compared with the classical method for estimating the reliability function of three systems: a k-out-of-n system, a series system, and a parallel system. Each system consists of three components: the first is a parametric component whose failure times follow an exponential distribution, whereas the second and third are nonparametric components whose reliability estimates depend on the kernel method, with two methods used to estimate the bandwidth parameter h, and on the Kaplan-Meier method. To indicate the better method for estimating the system reliability function, it has be
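However each component reliability R_i is estimated (parametrically, by kernel smoothing, or by Kaplan-Meier), the system-level reliability for the three structures named above combines in the standard way, assuming independent components. A small sketch of those combination rules (not the paper's estimators):

```python
from math import comb

def series(rs):
    """Series system: all components must work."""
    out = 1.0
    for r in rs:
        out *= r
    return out

def parallel(rs):
    """Parallel system: at least one component works."""
    out = 1.0
    for r in rs:
        out *= (1.0 - r)
    return 1.0 - out

def k_out_of_n(k, n, r):
    """At least k of n identical components (reliability r each) work."""
    return sum(comb(n, i) * r**i * (1 - r)**(n - i) for i in range(k, n + 1))
```

Note that a 3-out-of-3 system reduces to a series system and a 1-out-of-3 system to a parallel one, which gives a quick sanity check on any implementation.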
Support vector machines (SVMs) are supervised learning models that analyze data for classification or regression. For classification, SVM is widely used, selecting an optimal hyperplane that separates two classes. SVM has very good accuracy and is extremely robust compared with some other classification methods such as logistic regression, random forest, k-nearest neighbors, and the naïve model. However, working with large datasets can cause problems such as long computation times and inefficient results. In this paper, the SVM has been modified by using a stochastic gradient descent process. The modified method, stochastic gradient descent SVM (SGD-SVM), was checked using two simulated datasets. Since the classification of different ca
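The idea behind SGD-SVM can be sketched as per-sample subgradient steps on the L2-regularized hinge loss, which is what makes it cheap on large datasets (each update touches one sample). The constants and toy data below are illustrative assumptions, not the paper's setup:

```python
import random

def sgd_svm(X, y, lam=0.01, lr=0.1, epochs=100, seed=0):
    """Linear SVM trained by stochastic (sub)gradient descent on hinge loss.

    y must be in {-1, +1}; lam is the L2 regularization strength.
    """
    rng = random.Random(seed)
    d = len(X[0])
    w, b = [0.0] * d, 0.0
    for _ in range(epochs):
        order = list(range(len(X)))
        rng.shuffle(order)
        for i in order:
            margin = y[i] * (sum(wj * xj for wj, xj in zip(w, X[i])) + b)
            for j in range(d):
                # Subgradient of lam/2 * ||w||^2 + max(0, 1 - margin).
                g = lam * w[j] - (y[i] * X[i][j] if margin < 1 else 0.0)
                w[j] -= lr * g
            if margin < 1:
                b += lr * y[i]
    return w, b

def predict(w, b, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b > 0 else -1

X = [[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -3.0]]  # toy, separable
y = [1, 1, -1, -1]
w, b = sgd_svm(X, y)
```

Compared with solving the full quadratic program, each step here costs O(d), which is the source of the speedup on large datasets.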
This article proposes a new technique for determining the rate of contamination. First, a generative adversarial network (GAN) parallel-processing technique is constructed and trained using real and secret images. Then, after the model stabilizes, the real image is passed to the generator. Finally, the generator creates an image that is visually similar to the secret image, thus achieving the same effect as transmitting the secret image. Experimental results show that this technique is effective for the security of secret-information transmission and increases information-hiding capacity. The signal-to-noise metric and the structural similarity index measure were used to determine the success of colour image-hiding t
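Of the quality metrics mentioned, the signal-to-noise measure can be sketched as a plain PSNR computation over pixel values (flattened here for simplicity). This is an illustrative helper with made-up pixel data, not the paper's evaluation code:

```python
import math

def psnr(original, modified, max_val=255):
    """Peak signal-to-noise ratio between two equal-length pixel sequences."""
    mse = sum((a - b) ** 2 for a, b in zip(original, modified)) / len(original)
    if mse == 0:
        return float("inf")  # identical images
    return 10 * math.log10(max_val ** 2 / mse)

cover = [120, 130, 140, 150]
stego = [121, 130, 141, 150]  # e.g. after embedding hidden bits
quality = psnr(cover, stego)
```

Higher PSNR means the stego image is closer to the cover; values above roughly 40 dB are generally considered imperceptible distortion.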