Wireless Body Area Network (WBAN) technology improves real-time patient health monitoring in hospitals, care facilities, and especially at home. WBAN has grown in popularity in recent years due to its critical role and vast range of medical applications. Because of the sensitive nature of the patient information transmitted through a WBAN, security is of paramount importance, and a high level of security is required to guarantee the safe movement of data between sensor nodes and across WBAN networks. This research introduces a novel technique named Integrated Grasshopper Optimization Algorithm with Artificial Neural Network (IGO-ANN) for distinguishing trusted nodes in WBAN networks by means of a classification approach, thereby strengthening the security of such networks. Feature extraction is performed using Linear Regression-Based Principal Component Analysis (LR-PCA). The test results demonstrate that the proposed IGO-ANN method outperforms several existing methods in accuracy, end-to-end delay, and packet delivery ratio for trusted WBAN node classification.
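A minimal sketch of the classification stage described above: PCA-based feature reduction followed by an ANN classifier. The node features, labels, layer sizes, and the plain PCA stand-in for LR-PCA are assumptions for illustration; the grasshopper-optimized hyperparameters of the paper are not reproduced.

```python
# Sketch of PCA feature extraction + ANN classification of trusted WBAN nodes.
# All data and hyperparameters here are hypothetical placeholders.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 12))        # hypothetical per-node features (energy, RSSI, forwarding rate, ...)
y = rng.integers(0, 2, size=500)      # 1 = trusted node, 0 = untrusted node (synthetic labels)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

pca = PCA(n_components=6)             # stands in for the LR-PCA feature-extraction step
ann = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)

ann.fit(pca.fit_transform(X_tr), y_tr)
print("accuracy:", accuracy_score(y_te, ann.predict(pca.transform(X_te))))
```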
The present work is concerned with finding the optimum conditions for biochemical wastewater treatment for a local tannery. The water samples were taken from the outlet areas (the wastewater of the chrome and vegetable tanneries) in equal volumes and subjected to sedimentation, biological treatment, and chemical and natural sedimentation treatment.
The Box-Wilson method of experimental design was adopted to find useful relationships between three operating variables affecting the treatment process (temperature, aeration period, and phosphate concentration) and the Biochemical Oxygen Demand (BOD5).
The experimental data collected by this method were successfully fitted to a second-order polynomial model. The most fa
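A minimal sketch of the fitting step: a second-order (quadratic) polynomial response model of BOD5 versus temperature, aeration period, and phosphate concentration. The factor levels and BOD5 values below are synthetic placeholders, not the study's measurements.

```python
# Fit a quadratic response-surface polynomial to a 3-factor design.
# Design points and responses are hypothetical stand-ins for the study's data.
import itertools
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

levels_T, levels_aer, levels_PO4 = [20, 27.5, 35], [2, 5, 8], [5, 12.5, 20]
X = np.array(list(itertools.product(levels_T, levels_aer, levels_PO4)), dtype=float)

# synthetic BOD5 response with a quadratic trend plus noise (placeholder data)
rng = np.random.default_rng(0)
bod5 = (200 - 2.0 * X[:, 0] - 6.0 * X[:, 1] - 1.5 * X[:, 2]
        + 0.04 * X[:, 0] ** 2 + rng.normal(0, 3, len(X)))

quad = PolynomialFeatures(degree=2, include_bias=True)   # linear, squared, and interaction terms
model = LinearRegression(fit_intercept=False).fit(quad.fit_transform(X), bod5)

names = quad.get_feature_names_out(["T", "t_aer", "PO4"])
print(dict(zip(names, model.coef_.round(3))))
```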
Drilling well design optimization reduces the total Authorization for Expenditure (AFE) by decreasing well construction time and expense. Well design is not a constant pattern during the life cycle of the field; it should be optimized through continuous improvement of all aspects of redesigning the well depending on the actual field conditions and problems. The core objective of this study is to deliver a general review of well design optimization processes and of the available studies and applications that employ well design optimization to solve problems encountered in well design, so that cost-effectiveness and strong drilling performance are achievable. Well design optimization processes include unconventional design (slimhole) co
In this paper, we study the Bayesian method using the modified exponential growth model, as this model is widely used to represent growth phenomena. We focus on three prior functions (informative, natural conjugate, and a prior that depends on previous experiments) for use in the Bayesian method. Most observations of growth phenomena depend on one another, which in turn leads to correlation between those observations; this problem, called autocorrelation, must be treated, and the Bayesian method has been used to address it.
The goal of this study is to determine the effect of autocorrelation on estimation using the Bayesian method. F
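A worked form of the model helps fix ideas. One common parameterization of the modified exponential growth model, with a first-order autoregressive error term representing the autocorrelation described above, is the following; this specific form is an assumption for illustration, not necessarily the parameterization used in the paper.

```latex
% Modified exponential growth model (a common parameterization; assumed for illustration)
y_t = \alpha + \beta \gamma^{t} + \varepsilon_t, \qquad 0 < \gamma < 1,
% first-order autocorrelated (AR(1)) errors
\varepsilon_t = \rho \, \varepsilon_{t-1} + u_t, \qquad u_t \sim N(0, \sigma_u^{2}).
```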
The efficiency of railway line performance is evaluated through a set of indicators and criteria, the most important of which are transport density, employee productivity, passenger vehicle productivity, freight wagon productivity, and locomotive productivity. This study attempts to calculate the most important of these indicators, the transport density index, from the other four productivity indicators using artificial neural network techniques. Two neural network software packages were used in this study, Simulnet and Neuframe, and the results of the second program were adopted. Training and testing results for the neural network data used in the study, which were obtained from the international in
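A minimal sketch of the idea: a small neural network that maps four productivity indicators to the transport density index. scikit-learn stands in for the Simulnet and Neuframe packages mentioned in the abstract, and the data are synthetic placeholders.

```python
# Regress transport density on four productivity indicators with a small MLP.
# Inputs, targets, and network size are hypothetical, not the study's data.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
X = rng.uniform(0.1, 1.0, size=(200, 4))   # hypothetical: employee, passenger-vehicle, wagon, locomotive productivity
y = X @ np.array([0.4, 0.3, 0.2, 0.1]) + rng.normal(0, 0.02, 200)   # synthetic transport density index

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=1))
model.fit(X, y)
print("R^2 on training data:", round(model.score(X, y), 3))
```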
When optimizing the performance of neural network-based chatbots, the choice of optimizer is one of the most important aspects. Optimizers primarily control the adjustment of model parameters such as weights and biases to minimize a loss function during training. Adaptive optimizers such as ADAM have become a standard choice and are widely used because their parameter-update magnitudes are invariant to gradient scale variations, but they often pose generalization problems. Alternatively, Stochastic Gradient Descent (SGD) with momentum and ADAMW, an extension of ADAM, offer several advantages. This study aims to compare and examine the effects of these optimizers on the chatbot CST dataset. The effectiveness of each optimizer is evaluat
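A minimal sketch of such a comparison setup: the same small model trained with Adam, SGD with momentum, and AdamW. The model architecture, learning rates, and synthetic data are assumptions; the CST chatbot dataset and the paper's evaluation protocol are not reproduced here.

```python
# Train an identical small classifier with three optimizers and compare final loss.
import torch
import torch.nn as nn

def make_model():
    return nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))

x = torch.randn(256, 32)                 # synthetic stand-in for encoded chatbot inputs
y = torch.randint(0, 10, (256,))         # synthetic intent labels
loss_fn = nn.CrossEntropyLoss()

optimizers = {
    "Adam":         lambda p: torch.optim.Adam(p, lr=1e-3),
    "SGD+Momentum": lambda p: torch.optim.SGD(p, lr=1e-2, momentum=0.9),
    "AdamW":        lambda p: torch.optim.AdamW(p, lr=1e-3, weight_decay=0.01),
}

for name, make_opt in optimizers.items():
    torch.manual_seed(0)                 # same initialization for a fair comparison
    model = make_model()
    opt = make_opt(model.parameters())
    for _ in range(50):                  # short illustrative training run
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()
    print(f"{name}: final loss {loss.item():.4f}")
```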
This research aims to estimate stock returns according to the Rough Set Theory approach, to test its effectiveness and accuracy in predicting stock returns and its potential in the field of financial markets, and to rationalize investor decisions. The research sample comprises (10) companies traded on the Iraq Stock Exchange. The results showed a remarkable application of Rough Set Theory in data reduction, contributing to the rationalization of investment decisions. The most prominent conclusion is the capability of rough set theory to deal with financial data and to apply it for forecasting stock returns. The research provides those interested in investing stocks in financial
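A minimal sketch of the rough-set machinery behind such data reduction: indiscernibility classes over condition attributes and lower/upper approximations of a decision class. The toy decision table below is hypothetical, not the study's stock data.

```python
# Compute lower/upper approximations of a decision class from a toy decision table.
from collections import defaultdict

# rows: (condition attributes, decision), e.g. (P/E level, volume level) -> return class
table = [
    (("high", "low"),  "up"),
    (("high", "low"),  "up"),
    (("low",  "high"), "down"),
    (("low",  "high"), "up"),     # conflicting row -> boundary region
    (("high", "high"), "up"),
]

# group objects (row indices) by identical condition-attribute values
classes = defaultdict(set)
for i, (cond, _) in enumerate(table):
    classes[cond].add(i)

target = {i for i, (_, d) in enumerate(table) if d == "up"}   # decision class "up"

lower = set().union(*(c for c in classes.values() if c <= target))
upper = set().union(*(c for c in classes.values() if c & target))

print("lower approximation:", sorted(lower))   # objects certainly classified "up"
print("upper approximation:", sorted(upper))   # objects possibly classified "up"
print("boundary region:", sorted(upper - lower))
```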
Medicine is one of the fields where the advancement of computer science is making significant progress. Some diseases require an immediate diagnosis in order to improve patient outcomes. The usage of computers in medicine improves precision and accelerates data processing and diagnosis. In this research, hybrid machine learning, a combination of various deep learning approaches, was utilized to categorize biological images, and a meta-heuristic algorithm was provided. In addition, two different medical datasets were introduced, one covering magnetic resonance imaging (MRI) of brain tumors and the other dealing with chest X-rays (CXRs) of COVID-19. These datasets were introduced to the combination network that contained deep lea
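A minimal sketch of one common hybrid pattern: a CNN backbone extracts deep features and a classical classifier makes the final prediction. The ResNet backbone, the SVM head, and the synthetic images are assumptions for illustration; the paper's specific combination network and its meta-heuristic tuning step are not reproduced.

```python
# CNN feature extraction + classical classifier on placeholder medical images.
import torch
from torchvision import models
from sklearn.svm import SVC

backbone = models.resnet18(weights=None)      # use pretrained weights in practice
backbone.fc = torch.nn.Identity()             # drop the classification head -> 512-d features
backbone.eval()

images = torch.randn(20, 3, 224, 224)         # stand-ins for MRI / CXR images
labels = torch.randint(0, 2, (20,)).numpy()   # e.g. tumor vs. no tumor (synthetic)

with torch.no_grad():
    features = backbone(images).numpy()       # (20, 512) deep feature vectors

clf = SVC(kernel="rbf").fit(features, labels) # classical head on top of deep features
print("training accuracy:", clf.score(features, labels))
```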
The consensus algorithm is the core mechanism of blockchain and is used to ensure data consistency among blockchain nodes. The PBFT (Practical Byzantine Fault Tolerance) consensus algorithm is widely used in alliance chains because it is resistant to Byzantine errors. However, PBFT still suffers from random master node selection and complicated communication. This study proposes an enhanced consensus technique, IBFT, based on node trust values and BLS (Boneh-Lynn-Shacham) aggregate signatures. In IBFT, multi-level indicators are used to calculate the trust value of each node, and some nodes are selected to take part in network consensus as a result of this calculation. The master node is chosen
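A minimal sketch of the trust-value idea: each node's trust is a weighted combination of several indicators, the top-scoring nodes join the consensus set, and the highest-trust node acts as master. The indicator names, weights, and values are assumptions, not the paper's actual formula, and the BLS aggregate-signature step is omitted.

```python
# Trust-weighted selection of consensus nodes and master (illustrative only).
nodes = {
    "n1": {"uptime": 0.99, "valid_votes": 0.97, "latency_score": 0.90},
    "n2": {"uptime": 0.90, "valid_votes": 0.80, "latency_score": 0.70},
    "n3": {"uptime": 0.95, "valid_votes": 0.99, "latency_score": 0.85},
    "n4": {"uptime": 0.60, "valid_votes": 0.50, "latency_score": 0.40},
}
weights = {"uptime": 0.4, "valid_votes": 0.4, "latency_score": 0.2}   # hypothetical multi-level weights

trust = {nid: sum(weights[k] * v for k, v in ind.items()) for nid, ind in nodes.items()}

consensus_set = sorted(trust, key=trust.get, reverse=True)[:3]   # top-trust nodes take part in consensus
master = consensus_set[0]                                        # highest-trust node acts as master

print("trust values:", {n: round(t, 3) for n, t in trust.items()})
print("consensus set:", consensus_set, "master:", master)
```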
In this study, graphene oxide (GO) sheets were decorated with nickel cobaltite (NiCo2O4, NC) nanoparticles by in-situ deposition, and the prepared composite (NC:GO) was used as an adsorbent surface for the removal of methyl green (MG) dye from aqueous solutions. The successful coverage of graphene oxide with nickel cobaltite nanoparticles (NC) was confirmed using FT-IR and X-ray diffraction (XRD) studies. The particle sizes were
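For context, dye-removal performance of such an adsorbent is typically quantified by the removal efficiency and equilibrium adsorption capacity; the standard expressions below are assumed for illustration, since the truncated abstract does not state which equations were used.

```latex
% Standard removal efficiency and adsorption capacity (assumed; not stated in the truncated abstract)
R\,(\%) = \frac{C_0 - C_e}{C_0} \times 100, \qquad
q_e = \frac{(C_0 - C_e)\, V}{m},
```

where $C_0$ and $C_e$ are the initial and equilibrium MG concentrations (mg/L), $V$ is the solution volume (L), and $m$ is the adsorbent mass (g).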