In this work, plasma parameters such as the electron temperature (Te), electron density (ne), plasma frequency (fp) and Debye length (λD) were studied using spectral analysis techniques. The plasma spectrum was recorded at different laser energy values for SnO2 and ZnO mixed at different ratios (X = 0.2, 0.4 and 0.6), and the spectral study of these mixtures was carried out in air. The results showed that the electron density and electron temperature increase in the zinc oxide : tin oxide alloy targets. It was found that the intensity of the spectral lines increases as the laser peak power increases and then decreases when the power continues to increase.
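For context, once ne and Te are extracted from the spectrum, fp and λD follow directly from them. The sketch below uses the standard plasma-formulary approximations (ne in cm^-3, Te in eV); the numerical values are placeholders for illustration, not measurements from this work.

```python
import math

def plasma_frequency_hz(n_e_cm3):
    """Electron plasma frequency, f_p ~ 8.98e3 * sqrt(n_e) Hz with n_e in cm^-3."""
    return 8.98e3 * math.sqrt(n_e_cm3)

def debye_length_cm(T_e_eV, n_e_cm3):
    """Debye length, lambda_D ~ 743 * sqrt(T_e / n_e) cm with T_e in eV, n_e in cm^-3."""
    return 743.0 * math.sqrt(T_e_eV / n_e_cm3)

# Illustrative values only (not results from this study):
n_e = 1.0e17   # electron density, cm^-3
T_e = 1.0      # electron temperature, eV
print(f"f_p      = {plasma_frequency_hz(n_e):.3e} Hz")
print(f"lambda_D = {debye_length_cm(T_e, n_e):.3e} cm")
```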
The Elliptic Curve Cryptography (ECC) algorithm meets the requirements for multimedia encryption since the encipher operation of the ECC algorithm is applied to points only, which offers significant computational advantages. The encoding/decoding operations for converting a text message into points on the curve, and vice versa, are not always a simple process. In this paper, a new mapping method is investigated for converting a text message into a point on the curve, or a point back into a text message, in an efficient and secure manner; it depends on the repeated values in a coordinate to establish a lookup table for the encoding/decoding operations. The proposed method for the mapping process is ...
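The general idea of table-based message-to-point mapping can be sketched as below. This is a generic illustration only, not the paper's specific repeated-coordinate lookup-table construction; the tiny curve parameters and the brute-force point enumeration are placeholders chosen purely for readability.

```python
# Toy curve y^2 = x^3 + a*x + b (mod p); parameters are illustrative only.
p, a, b = 521, 0, 7

def curve_points(p, a, b):
    """Enumerate all affine points on the curve by brute force (fine for a toy field)."""
    roots = {}
    for y in range(p):
        roots.setdefault((y * y) % p, []).append(y)
    pts = []
    for x in range(p):
        rhs = (x * x * x + a * x + b) % p
        for y in roots.get(rhs, []):
            pts.append((x, y))
    return pts

points = curve_points(p, a, b)
assert len(points) >= 256              # enough points to map every byte value

encode_table = {i: pt for i, pt in enumerate(points[:256])}   # byte  -> point
decode_table = {pt: i for i, pt in encode_table.items()}      # point -> byte

msg = b"ECC"
mapped = [encode_table[byte] for byte in msg]        # points ready for ECC encryption
recovered = bytes(decode_table[pt] for pt in mapped)
assert recovered == msg
```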
In this paper the wind data measured over 12 months (January to December 2011) in the Al-Hay district of Wasit province, southern Iraq, have been analyzed statistically. The wind speed at a height of 10 m above ground level was measured at 10-minute intervals. The statistical analysis of the wind data was performed using the WAsP software, which is based on Weibull distributions. The Weibull shape and scale parameters were obtained and used in the statistics of this paper. The results demonstrated that the study area has an Annual Mean Energy Production (AMEP) of about 219.002 MWh. The computations were performed for a turbine hub height of 70 m and for surface roughness lengths of (0.0, 0.03, 0.1, 0.4, 1.5) m, respectively.
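The Weibull shape and scale parameters can be fitted to 10-minute wind-speed records as sketched below. This does not reproduce WAsP's own fitting procedure; it is a maximum-likelihood fit with SciPy, and the wind-speed samples are synthetic placeholders rather than the Al-Hay measurements.

```python
import math
import numpy as np
from scipy import stats

# Synthetic stand-in for one year of 10-minute wind-speed samples (365*24*6 = 52560).
rng = np.random.default_rng(0)
wind_speed = rng.weibull(2.0, size=52560) * 6.0   # m/s

# Maximum-likelihood Weibull fit with the location fixed at zero.
k, _, c = stats.weibull_min.fit(wind_speed, floc=0)   # shape k, scale c (m/s)
mean_speed = c * math.gamma(1.0 + 1.0 / k)            # Weibull mean wind speed

print(f"shape k = {k:.2f}, scale c = {c:.2f} m/s, mean speed = {mean_speed:.2f} m/s")
```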
Corpus linguistics is a methodology for studying language through corpus-based research. It differs from the traditional approach to studying a language (the prescriptive approach) in its insistence on the systematic study of authentic examples of language in use (the descriptive approach). A "corpus" is a large body of machine-readable, structurally collected, naturally occurring linguistic data, either written texts or a transcription of recorded speech, which can be used as a starting point for linguistic description or as a means of verifying hypotheses about a language. In the past decade, interest has grown tremendously in the use of language corpora for language education. The ways in which corpora have been employed in language pedagogy ...
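As a concrete illustration of the descriptive, corpus-based querying described above, a minimal keyword-in-context (KWIC) concordance is sketched below. The sample text and search word are invented placeholders, not material from any corpus discussed in the paper.

```python
def kwic(corpus_text, keyword, window=4):
    """Return each occurrence of keyword with `window` words of context on each side."""
    tokens = corpus_text.lower().split()
    lines = []
    for i, tok in enumerate(tokens):
        if tok.strip('.,;:!?"') == keyword:
            left = " ".join(tokens[max(0, i - window):i])
            right = " ".join(tokens[i + 1:i + 1 + window])
            lines.append(f"{left:>40} | {tokens[i]} | {right}")
    return lines

sample = ("Corpus linguistics studies language through corpora. "
          "A corpus is a large body of naturally occurring language data.")
print("\n".join(kwic(sample, "corpus")))
```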
In the present work, a dynamic analysis technique has been developed to investigate and characterize the amount of elastic-modulus degradation in cracked cantilever plates due to the presence of a defect, such as a surface or internal crack, under free vibration. The new generalized technique represents a first step in developing a health monitoring system; the effect of such defects on the modal frequencies has been the main key to quantifying the elastic moduli in the presence of any type of invisible defect. In this paper the finite element method has been used to determine the free vibration characteristics of a cracked cantilever plate (internal flaws), and the present work was carried out for different crack positions. Stiffness re ...
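The underlying idea of inferring stiffness degradation from a drop in modal frequency can be sketched very simply: for an otherwise unchanged plate the natural frequency scales with the square root of the elastic modulus, so the modulus ratio is roughly the squared frequency ratio. This is a simplified global estimate, not the paper's finite-element procedure, and the frequencies below are placeholders.

```python
def modulus_degradation(f_intact_hz, f_cracked_hz):
    """Fractional loss of effective elastic modulus estimated from first-mode frequencies."""
    return 1.0 - (f_cracked_hz / f_intact_hz) ** 2

# Placeholder first-mode frequencies for an intact and a cracked plate.
print(f"estimated degradation = {modulus_degradation(120.0, 114.0):.1%}")   # ~9.8%
```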
Institutions and companies are looking to reduce spending on buildings and services according to scientific methods, provided they achieve the same purpose at a lower cost. On this basis, this paper proposes a model to measure and reduce maintenance costs in one of the public-sector institutions in Iraq by using performance indicators that fit the nature of the institution's work and the available data. The paper relied on studying the nature of the institution's work in the maintenance field and on examining the type of data available, in order to determine the type and number of indicators appropriate for building the model. Maintenance data were collected for the previous six years by reviewing the maintenance and financial departments ...
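For illustration only, the sketch below computes two generic maintenance performance indicators of the kind such a model might use from yearly records. The indicator names and the figures are placeholders; the paper's own indicators and data are not listed in this excerpt.

```python
# (year, total maintenance cost, replacement asset value, corrective-maintenance cost)
# All figures are invented placeholders in the same currency unit.
yearly = [
    (2016, 420.0, 9800.0, 260.0),
    (2017, 395.0, 9800.0, 230.0),
    (2018, 470.0, 10200.0, 310.0),
]

for year, cost, asset_value, corrective in yearly:
    cost_to_rav = 100.0 * cost / asset_value        # maintenance cost as % of asset value
    corrective_share = 100.0 * corrective / cost    # share of corrective (reactive) work
    print(f"{year}: cost/RAV = {cost_to_rav:.1f}%, corrective share = {corrective_share:.1f}%")
```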
In this study, we design a narrow band-pass filter for the (3–5) µm window based on the needle optimization method, with a comparison against published global designs. The effect of changing the design parameters on the optical performance of the filter was also studied, making it possible to overcome the difficulties of the design. In this study, homogeneous optical materials were adopted as thin films deposited on a germanium substrate at the design wavelength (λ = 4 µm). To design this kind of filter we used an advanced computer program (Matlab) to build a design model based on both the characteristic matrix and the needle technique. In this paper we refer to the type of merit function, which is used for correct optical performance acces ...
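The characteristic-matrix calculation that such design models are built on can be sketched as below for normal incidence. The quarter-wave stack here is a generic illustration, not the paper's optimized needle design, and the refractive indices are non-dispersive placeholders.

```python
import numpy as np

def stack_transmittance(n0, n_sub, layers, wavelength_um):
    """Transmittance of a thin-film stack; layers = [(refractive index, thickness in um), ...]."""
    M = np.eye(2, dtype=complex)
    for n, d in layers:
        delta = 2 * np.pi * n * d / wavelength_um        # phase thickness of the layer
        M = M @ np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                          [1j * n * np.sin(delta), np.cos(delta)]])
    B, C = M @ np.array([1.0, n_sub])                    # characteristic-matrix output
    return 4 * n0 * n_sub / abs(n0 * B + C) ** 2         # transmittance into the substrate

# Quarter-wave layers at the 4 um design wavelength on a germanium substrate (n ~ 4).
lam0 = 4.0
hi_n, lo_n = 2.35, 1.45                                  # placeholder layer indices
layers = [(hi_n, lam0 / (4 * hi_n)), (lo_n, lam0 / (4 * lo_n))] * 4
print(f"T(4 um) = {stack_transmittance(1.0, 4.0, layers, lam0):.3f}")
```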
This research is devoted to the design and implementation of a Supervisory Control and Data Acquisition (SCADA) system for monitoring and controlling the corrosion of a carbon steel pipe buried in soil. A smart technique equipped with a microcontroller, a collection of sensors and a communication system was applied to monitor and control the operation of an impressed current cathodic protection (ICCP) process for a carbon steel pipe. The integration of the built hardware, LabVIEW graphical programming and a PC interface produces an effective SCADA system with two types of control, namely a Proportional Integral Derivative (PID) controller that supports a closed loop, and a traditional open-loop control. Through this work, under an environmental temperature of 30°C, an evaluation and comparison were done for ...
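The closed-loop mode mentioned above follows the textbook discrete PID law. The paper's controller is implemented in LabVIEW; the Python sketch below is only an illustration of that law, and the gains, setpoint and crude plant response are placeholders, not values from the work.

```python
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

pid = PID(kp=2.0, ki=0.2, kd=0.1, dt=1.0)
potential = -0.65                          # measured pipe-to-soil potential, V (placeholder)
for _ in range(5):
    u = pid.step(-0.85, potential)         # drive toward a -0.85 V protection setpoint
    potential += 0.2 * u                   # crude first-order plant response, illustrative only
    print(f"control output = {u:+.3f}, potential = {potential:.3f} V")
```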
Software-defined networking (SDN) presents novel security and privacy risks, including distributed denial-of-service (DDoS) attacks. In response to these threats, machine learning (ML) and deep learning (DL) have emerged as effective approaches for quickly identifying and mitigating anomalies. To this end, this research employs various classification methods, including support vector machines (SVMs), K-nearest neighbors (KNNs), decision trees (DTs), the multilayer perceptron (MLP), and convolutional neural networks (CNNs), and compares their performance. The CNN exhibits the highest training accuracy at 97.808%, yet the lowest prediction accuracy at 90.08%. In contrast, the SVM demonstrates the highest prediction accuracy of 95.5%. As such, an ...
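A comparison of this kind can be set up with scikit-learn as sketched below. The DDoS dataset and feature set are not given in this excerpt, so synthetic placeholder data stand in, the CNN is omitted (it would require a deep-learning framework), and the accuracies printed here are not the paper's reported figures.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in for labeled SDN traffic features (benign vs. DDoS).
X, y = make_classification(n_samples=3000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "SVM": SVC(),
    "KNN": KNeighborsClassifier(),
    "DT":  DecisionTreeClassifier(random_state=0),
    "MLP": MLPClassifier(max_iter=500, random_state=0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(f"{name}: train = {model.score(X_tr, y_tr):.3f}, test = {model.score(X_te, y_te):.3f}")
```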
The effect of the initial pressure on the laminar flame speed for methane-air mixtures has been determined experimentally over a wide range of equivalence ratios. In this work, a measurement system was designed to measure the laminar flame speed using a constant-volume method with a thermocouple technique. The laminar burning velocity was measured using the density ratio method. The comparison of the present results with previous ones shows good agreement between them, which indicates that the measurements and calculations employed in the present work are successful and precise.
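The density-ratio relation behind the burning-velocity measurement follows from mass continuity across the flame front, rho_u * S_u = rho_b * S_b, so S_u = S_b * (rho_b / rho_u). The sketch below shows only this textbook form, not the paper's full constant-volume procedure, and the numbers are placeholders of realistic magnitude.

```python
def burning_velocity(flame_speed_m_s, rho_unburned, rho_burned):
    """Laminar burning velocity S_u from the observed flame-front speed S_b."""
    return flame_speed_m_s * rho_burned / rho_unburned

S_b = 2.8                    # observed flame-front speed, m/s (placeholder)
rho_u, rho_b = 1.12, 0.16    # unburned / burned gas densities, kg/m^3 (placeholders)
print(f"S_u = {burning_velocity(S_b, rho_u, rho_b):.2f} m/s")   # ~0.40 m/s
```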
Self-repairing technology based on microcapsules is an efficient solution for repairing cracked cementitious composites. Self-repairing based on microcapsules begins with the occurrence of cracks and proceeds by releasing self-repairing factors into the cracks formed in the concrete. Based on previous comprehensive studies, this paper provides an overview of the various repairing factors and investigative methodologies. There has recently been a lack of consensus on the most efficient criteria for assessing microcapsule-based self-repairing and on smart solutions for improving capsule survival ratios during mixing. The most commonly utilized self-repairing efficiency assessment indicators are mechanical resistance and durability ...
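One commonly used mechanical-recovery index for assessing self-repairing efficiency is the fraction of lost load capacity recovered after healing, sketched below. This is offered only as an illustration of such an indicator; the specific indicators surveyed by the paper are not listed in this excerpt, and the strength values are placeholders.

```python
def recovery_index(initial, damaged, healed):
    """Share of the lost capacity that healing restored (0 = none, 1 = full recovery)."""
    return (healed - damaged) / (initial - damaged)

# Placeholder flexural strengths (MPa) for an intact, cracked, and healed specimen.
print(f"recovery = {recovery_index(initial=6.0, damaged=2.5, healed=4.6):.0%}")   # 60%
```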