Wireless Sensor Networks (WSNs) are extending Internet connectivity to devices in all areas of life, which makes them a promising technology for the future. As attack techniques continue to improve, security will play an increasingly important role in WSNs. Quantum computers already pose a significant risk to the encryption schemes that work in tandem with intrusion detection systems, and the resource limitations of sensor nodes make it difficult to implement quantum-based protections directly on the sensors. In this paper, quantum computing concepts are used to develop a future-proof, robust, lightweight and resource-conscious security approach for sensor networks, with particular emphasis on combining the BB84 quantum key distribution protocol with the AES algorithm for WSN security. The analysis indicated a high level of data security based on the generation of secure keys, with the generated keys passing the NIST statistical tests at a rate of about 80-95%. Applying the Quantum Bit Error Rate (QBER) equation raised the efficiency of the scheme to 0.704, ultimately improving network performance. This reduces the overall energy consumption and shortens the time required to perform the key exchange during the encryption and decryption processes.
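The abstract does not include an implementation, so the following is a minimal classical sketch of the BB84-plus-AES flow it describes: random bases are sifted, a public sample of the sifted key is compared to estimate the QBER, and the remaining bits are reduced to a 256-bit AES key. The qubit count, sampling fraction, and the single SHA-256 step standing in for privacy amplification are illustrative assumptions, not the authors' design.

```python
# Minimal classical sketch of BB84 sifting, QBER estimation, and AES key derivation.
# The quantum channel is simulated with random bits/bases; this illustrates the
# protocol flow described in the abstract, not the authors' implementation.
import hashlib
import secrets

N_QUBITS = 1024          # raw qubits sent (assumption)
SAMPLE_FRACTION = 0.25   # fraction of sifted bits sacrificed to estimate QBER (assumption)

# Sender (Alice): random bits encoded in random bases (0 = rectilinear, 1 = diagonal).
alice_bits  = [secrets.randbelow(2) for _ in range(N_QUBITS)]
alice_bases = [secrets.randbelow(2) for _ in range(N_QUBITS)]

# Receiver (Bob): measures each qubit in a randomly chosen basis.
# Matching basis reproduces Alice's bit; a mismatched basis yields a random outcome.
bob_bases = [secrets.randbelow(2) for _ in range(N_QUBITS)]
bob_bits  = [a if ab == bb else secrets.randbelow(2)
             for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Sifting: keep only positions where both parties used the same basis.
sifted_alice = [a for a, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
sifted_bob   = [b for b, ab, bb in zip(bob_bits,  alice_bases, bob_bases) if ab == bb]

# QBER estimate over a publicly compared sample: QBER = errors / compared bits.
n_sample = max(1, int(len(sifted_alice) * SAMPLE_FRACTION))
errors = sum(a != b for a, b in zip(sifted_alice[:n_sample], sifted_bob[:n_sample]))
qber = errors / n_sample
print(f"sifted key length: {len(sifted_alice)}, estimated QBER: {qber:.3f}")

# Remaining sifted bits are hashed down to a 256-bit AES key (privacy amplification
# is collapsed to a single SHA-256 call purely for illustration).
key_bits = "".join(map(str, sifted_alice[n_sample:]))
aes_key = hashlib.sha256(key_bits.encode()).digest()   # 32 bytes -> AES-256 key
print("derived AES-256 key:", aes_key.hex())
```

With no eavesdropper and a noiseless channel the estimated QBER stays near zero; in the paper's setting the QBER value is what feeds the efficiency figure reported above, and the derived key would then drive AES encryption on the sensor nodes.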
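The 80-95% figure refers to the NIST statistical test suite (SP 800-22). As one hedged illustration of how a derived key could be checked, the sketch below implements only the suite's frequency (monobit) test; the 0.01 significance level is the suite's standard threshold, and using a SHA-256 digest as the stand-in key is an assumption for the example.

```python
# Sketch of the NIST SP 800-22 frequency (monobit) test applied to a key bit-string.
import hashlib
import math

def monobit_test(bits: str, alpha: float = 0.01) -> bool:
    """Return True if the bit sequence passes the frequency test at significance alpha."""
    n = len(bits)
    s = sum(1 if b == "1" else -1 for b in bits)   # map 1 -> +1, 0 -> -1 and sum
    s_obs = abs(s) / math.sqrt(n)
    p_value = math.erfc(s_obs / math.sqrt(2))
    return p_value >= alpha

# Example: test the 256 bits of a SHA-256 digest standing in for a derived AES key.
demo_key = hashlib.sha256(b"demo BB84 sifted bits").digest()
key_bits = bin(int.from_bytes(demo_key, "big"))[2:].zfill(256)
print("monobit test passed:", monobit_test(key_bits))
```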