Wireless Body Area Sensor Networks (WBASNs) are gaining significant attention due to their applications in smart health, offering cost-effective, efficient, ubiquitous, and unobtrusive telemedicine. WBASNs face challenges including interference, Quality of Service (QoS), transmit power, and resource constraints. Recognizing these challenges, this paper presents an energy- and QoS-aware routing algorithm. The proposed algorithm uses each node's Collaboratively Evaluated Value (CEV) to select the most suitable cluster head (CH). The Collaborative Value (CV) is derived from three factors: the node's residual energy, the distance vector between nodes and the personal device, and the sensor density in each CH. The CEV algorithm operates as follows: CHs are dynamically selected in each transmission round based on the nodes' CVs. The algorithm also classifies the patient's condition to guarantee safety and attain a response speed appropriate for the patient's current state. Data is therefore categorized into Very-Critical, Critical, and Normal classes using a supervised learning vector quantization (LVQ) classifier: Very-Critical data is sent to the emergency center to dispatch an ambulance, Critical data is transmitted to a doctor, and Normal data is sent to a data center. This methodology promotes efficient and reliable intra-network communication, ensuring prompt and precise data transmission and reducing the need for frequent recharging. Comparative analyses, conducted in the OMNeT++ simulator, reveal that the proposed algorithm outperforms ERRS (Energy-Efficient and Reliable Routing Scheme) and LEACH (Low-Energy Adaptive Clustering Hierarchy), extending network longevity by 27% and 33% and improving network stability by 12% and 45% over the two protocols, respectively.
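The per-round CH selection described above can be sketched as follows. The abstract does not give the CEV formula, so the weighted combination below (and the weights `w_e`, `w_d`, `w_n`) is an illustrative assumption: residual energy and local density raise a node's score, while distance to the personal device lowers it.

```python
def cev_score(residual_energy, dist_to_pd, local_density,
              w_e=0.5, w_d=0.3, w_n=0.2):
    """Collaboratively Evaluated Value: higher means a better CH candidate.
    The weights w_e, w_d, w_n are illustrative assumptions, not the paper's."""
    # Distance to the personal device (PD) hurts, so it enters inversely.
    return w_e * residual_energy + w_d / (1.0 + dist_to_pd) + w_n * local_density

def select_cluster_head(nodes):
    """Pick the node with the highest CEV for this transmission round."""
    return max(nodes, key=lambda n: cev_score(n["energy"], n["dist_pd"], n["density"]))

nodes = [
    {"id": 1, "energy": 0.9, "dist_pd": 4.0, "density": 3},
    {"id": 2, "energy": 0.4, "dist_pd": 1.0, "density": 5},
    {"id": 3, "energy": 0.8, "dist_pd": 9.0, "density": 2},
]
ch = select_cluster_head(nodes)  # node 2 wins: close to the PD and dense
```

Re-running the selection each round with updated residual energies is what rotates the CH role and spreads the energy drain across the network.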
An intrusion detection system (IDS) is key to a comprehensive cybersecurity solution against any attack, and artificial intelligence techniques have been combined with the features of the IoT to improve security. In response, this research formulates an IDS technique driven by a modified random forest algorithm to improve IoT security. To this end, the target variable is one-hot encoded, bootstrapping is performed with less redundancy, a hybrid feature-selection method is added to the random forest algorithm, and the ranking stage of the random forest algorithm is modified. Three datasets are used in this research: IoTID20, UNSW-NB15, and IoT-23. The results are compared across the three datasets mentioned …
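Two of the modifications named above can be sketched in a few lines. The one-hot encoding of the target is standard; the variance-based filter below is only a stand-in for the paper's unspecified hybrid feature-selection method, and the column names are hypothetical.

```python
def one_hot(labels):
    """One-hot encode class labels (e.g. attack categories in IoTID20)."""
    classes = sorted(set(labels))
    index = {c: i for i, c in enumerate(classes)}
    return ([[1 if index[l] == j else 0 for j in range(len(classes))]
             for l in labels], classes)

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def hybrid_select(features, names, k=2):
    """Filter stage of a hybrid feature selection: rank columns by variance
    and keep the top-k (a stand-in for the paper's actual hybrid method)."""
    cols = list(zip(*features))
    ranked = sorted(range(len(cols)), key=lambda j: variance(cols[j]), reverse=True)
    return [names[j] for j in sorted(ranked[:k])]

encoded, classes = one_hot(["dos", "normal", "dos", "scan"])
kept = hybrid_select([[1, 10, 0], [2, 20, 0], [3, 30, 0]], ["a", "b", "c"])
```

A real hybrid selector would typically combine such a filter stage with a wrapper stage that re-scores the surviving features using the forest itself.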
The aim of the current study is to identify the level of goal conflict among twelfth-grade students in South Sharqiah, Sultanate of Oman, according to gender and specialization. The study used the descriptive method. A scale of (28) items was developed, divided into six dimensions: time pressure, goal achievement, limit of power, limit of budget, incompatible strategies, and unclear task. To validate the scale, it was piloted with (40) students. The scale was then administered to a sample of (402) students, (209) of them males, in the Governorate of South Sharqiah. The results showed a high conflict level in “unclear task” and an average conflict level in “limit of power”. Other dimensions (goal achievement, time pressure, limit of power …
Most recognition systems for human facial emotions are assessed solely on accuracy, even though other performance criteria, such as sensitivity, precision, F-measure, and G-mean, are also important in the evaluation process. Moreover, the most common problem to be resolved in face emotion recognition systems is feature extraction: traditional manual feature extraction methods cannot extract features efficiently, producing a redundant set of insignificant features that degrades classification performance. In this work, a new system to recognize human facial emotions from images is proposed. The HOG (Histograms of Oriented Gradients) …
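The HOG descriptor mentioned above is built from per-cell orientation histograms of image gradients. A minimal sketch of that building block (one cell, unsigned orientations, central-difference gradients) is shown below; the full descriptor additionally tiles the face into cells and block-normalizes the histograms.

```python
import math

def hog_cell(cell, bins=9):
    """Orientation histogram for one cell of a grayscale image, the core
    HOG building block: gradients by central differences, magnitudes
    accumulated into unsigned orientation bins over [0, 180) degrees."""
    h, w = len(cell), len(cell[0])
    hist = [0.0] * bins
    for y in range(1, h - 1):          # skip the border, where the
        for x in range(1, w - 1):      # central difference is undefined
            gx = cell[y][x + 1] - cell[y][x - 1]
            gy = cell[y + 1][x] - cell[y - 1][x]
            mag = math.hypot(gx, gy)
            ang = math.degrees(math.atan2(gy, gx)) % 180.0
            hist[int(ang // (180.0 / bins)) % bins] += mag
    return hist

# A vertical edge: all gradient energy lands in the 0-degree bin.
edge = [[0, 0, 255, 255]] * 4
h = hog_cell(edge)
```

Because the histogram summarizes orientations rather than raw pixels, it is far more compact than the manual pixel-level features the paragraph criticizes.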
Upscaling grayscale images plays a central role in various fields such as medicine, satellite imagery, and photography. This paper presents a technique for improving the upscaling of gray images using a new mixed wavelet generated by the tensor product. The proposed technique employs the multi-resolution analysis provided by a new mixed wavelet transform algorithm to decompose the input image into different frequency components. After processing, the low-resolution input image is effectively transformed into a higher-resolution representation by adding a zero matrix. A discrete wavelet transform (Daubechies and Haar wavelets) is used as a 2D matrix, mixed via the tensor product with another wavelet matrix of a different size. MATLAB R2021…
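The tensor-product mixing of two wavelet matrices is the Kronecker product; a minimal sketch is below. Mixing the 2x2 Haar matrix with itself is shown only as a small worked example; the paper mixes matrices of different sizes (e.g. Haar with a larger Daubechies filter matrix).

```python
def kron(A, B):
    """Kronecker (tensor) product of two matrices, as used to mix two
    wavelet filter matrices of different sizes into one 2-D transform."""
    return [[a * b for a in row_a for b in row_b]
            for row_a in A for row_b in B]

# Example: unnormalized 2x2 Haar mixed with itself gives the 4x4
# transform whose rows are the LL, LH, HL, HH analysis filters.
haar = [[1, 1], [1, -1]]
mixed = kron(haar, haar)
```

Applying such a product of an m x m and an n x n matrix yields an mn x mn transform, which is what lets two wavelets of different support lengths act on the image jointly.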
Interest in intellectual capital and its development is a necessity imposed by the requirements of the times: an advanced society cannot realize its productive potential with inefficient human capital. Because the work environment changes constantly, the management of financial companies faces a continuous challenge in coping with new developments in this changing environment, a challenge that can only be met when these companies possess qualified human resources and provide an organizing culture. This manifests itself in the research problem through the following two questions:
- Did the intellectual capital value specific financial and …
The demand for single-photon sources in quantum key distribution (QKD) systems has necessitated the use of weak coherent pulses (WCPs), characterized by a Poissonian photon-number distribution. Ensuring security against eavesdropping attacks requires keeping the mean photon number (µ) small and known to the legitimate partners. However, accurately determining µ poses challenges due to discrepancies between theoretical calculations and practical implementation. This paper introduces two experiments. The first involves theoretical calculations of µ using several filters to generate the WCPs. The second utilizes a variable attenuator to generate the WCPs, and the value of µ was estimated from the photons detected by the BB84 …
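The Poissonian statistics behind the security argument above can be written out directly. This is a sketch of the standard WCP relations, not the paper's calculations: the photon-number distribution, the multiphoton fraction that an eavesdropper could exploit, and the effect of a variable attenuator on µ.

```python
import math

def poisson_pn(mu, n):
    """Photon-number distribution of a weak coherent pulse with mean mu:
    P(n) = e^-mu * mu^n / n!"""
    return math.exp(-mu) * mu ** n / math.factorial(n)

def multiphoton_prob(mu):
    """P(n >= 2): fraction of pulses vulnerable to photon-number splitting;
    keeping mu small keeps this fraction negligible."""
    return 1.0 - poisson_pn(mu, 0) - poisson_pn(mu, 1)

def mu_after_attenuation(mu_in, atten_db):
    """Mean photon number after a variable attenuator of atten_db decibels."""
    return mu_in * 10 ** (-atten_db / 10.0)

mu = mu_after_attenuation(1000.0, 30.0)   # 30 dB brings 1000 photons to ~1
```

At µ = 0.1, for example, fewer than 0.5% of pulses carry two or more photons, which is why typical QKD links run with µ well below 1.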
A 3D geological model is an essential step in revealing reservoir heterogeneity and the distribution of reservoir properties. In the present study, a three-dimensional geological model of the Mishrif reservoir was built from data obtained from seven wells together with core data. The methodology includes building a 3D grid and populating it with petrophysical properties (facies, porosity, water saturation, and net-to-gross ratio). The structural model was built from a base contour map obtained from 2D seismic interpretation along with well tops from the seven wells. A simple grid method was used to build the structural framework with 234x278x91 grid cells in the X, Y, and Z directions, respectively, with cell lengths of 150 meters. The to…
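The grid-building and property-population steps can be sketched generically. The study's grid is 234x278x91 cells of 150 m; a tiny grid is used below for illustration, and the facies coding (1 = reservoir-quality rock) is an assumption, not the paper's convention.

```python
def make_grid(nx, ny, nz, cell_len=150.0):
    """Simple-grid skeleton: (x, y, layer) cell-centre coordinates for an
    nx*ny*nz model; x and y use the 150 m cell length, z is the layer index
    since layer thickness varies with the structural surfaces."""
    return [((i + 0.5) * cell_len, (j + 0.5) * cell_len, k)
            for k in range(nz) for j in range(ny) for i in range(nx)]

def net_to_gross(facies):
    """Net-to-gross ratio: fraction of reservoir-quality cells,
    assuming facies code 1 marks pay rock (hypothetical coding)."""
    return sum(1 for f in facies if f == 1) / len(facies)

grid = make_grid(2, 2, 2)              # the study uses make_grid(234, 278, 91)
ntg = net_to_gross([1, 0, 1, 1])       # 3 pay cells out of 4
```

In practice each cell is then populated with facies, porosity, and water saturation interpolated geostatistically from the seven wells.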
Among the many problems that reduce network performance, especially in Wide Area Networks, is congestion, which occurs when traffic demand reaches or exceeds the available capacity of a route, resulting in blocking and lower throughput per unit time. Congestion management mechanisms try to handle such cases. The work presented in this paper deals with an important issue: Quality of Service (QoS) techniques. QoS is the combined effect on service level that determines the user's degree of satisfaction with the service. In this paper, packet schedulers (FIFO, WFQ, CQ, and PQ) were implemented and evaluated under different applications with different priorities. The results show that the WFQ scheduler gives acceptable results …
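The difference between three of the schedulers named above can be sketched by the order in which they serve a small batch of queued packets. The WFQ version below is a simplified single-round sketch (per-packet virtual finish time, ignoring per-flow backlog), not a full virtual-clock implementation.

```python
import heapq

def fifo(packets):
    """First-in, first-out: serve in arrival order regardless of priority."""
    return [p["id"] for p in packets]

def pq(packets):
    """Strict priority queuing: lower priority number first, FIFO within a class;
    low-priority traffic can starve."""
    return [p["id"] for p in sorted(packets, key=lambda p: (p["prio"], p["arrival"]))]

def wfq(packets, weights):
    """Weighted fair queuing sketch: serve by virtual finish time
    F = arrival + size / class_weight, so heavy classes drain faster
    without starving light ones (simplified single-round model)."""
    heap = []
    for p in packets:
        finish = p["arrival"] + p["size"] / weights[p["prio"]]
        heapq.heappush(heap, (finish, p["arrival"], p["id"]))
    return [pid for _, _, pid in sorted(heap)]

pkts = [{"id": "a", "prio": 2, "arrival": 0, "size": 100},
        {"id": "b", "prio": 1, "arrival": 1, "size": 100}]
order = wfq(pkts, {1: 10, 2: 1})   # class 1 has 10x the weight of class 2
```

CQ (custom queuing) would instead serve the classes round-robin with a byte quota per turn, sitting between the PQ and WFQ extremes.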
A novel artificial neural network (ANN) model was constructed to calibrate a multivariate model for the simultaneous quantitative analysis of a quaternary mixture composed of carbamazepine, carvedilol, diazepam, and furosemide. Eighty-four mixture formulations were prepared and analyzed spectrophotometrically. Each analyte was formulated in six samples at different concentrations; thus twenty-four samples for the four analytes were tested. A neural network with 10 hidden neurons was capable of fitting the data 100%. The suggested model can be applied to the quantitative chemical analysis of the proposed quaternary mixture.
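The calibration network's forward pass can be sketched as below: an absorbance spectrum in, the four analyte concentrations out, through one hidden layer of 10 neurons. The sigmoid hidden activation and linear output layer are assumptions; the abstract specifies only the hidden-layer size.

```python
import math

def forward(spectrum, W1, b1, W2, b2):
    """Forward pass of the calibration ANN: absorbance spectrum in, four
    concentrations out (carbamazepine, carvedilol, diazepam, furosemide).
    One hidden layer of 10 sigmoid neurons, linear output (assumed topology)."""
    hidden = [1.0 / (1.0 + math.exp(-(sum(w * x for w, x in zip(row, spectrum)) + b)))
              for row, b in zip(W1, b1)]
    return [sum(w * h for w, h in zip(row, hidden)) + b
            for row, b in zip(W2, b2)]

# Degenerate check with zero weights: sigmoid(0) = 0.5 everywhere, so the
# output reduces to the output biases.
out = forward([0.1, 0.2],
              [[0.0, 0.0]] * 10, [0.0] * 10,     # W1 (10x2), b1
              [[0.0] * 10] * 4, [1.0, 2.0, 3.0, 4.0])  # W2 (4x10), b2
```

Training would fit W1, b1, W2, b2 on the eighty-four measured mixtures so that the network inverts the overlapping spectra into individual concentrations.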