Objective: To evaluate the national standards for exposure to chemical materials and dusts at the State Company for Drugs Industry in Samarra.
Methodology: A descriptive evaluation design was employed in the present study from 25th May 2011 to 30th November 2011 in order to evaluate the national standards for exposure to chemical materials and dusts at the State Company for Drugs Industry in Samarra. A purposive (non-probability) sample of (110) workers from the State Company for Drugs Industry in Samarra was selected for the study. Data were gathered by interviewing the workers according to the nature of the work they perform. The evaluation questionnaire comprised three parts: the workers' demographic characteristics and two further parts concerning the national standards for exposure to chemical materials and dusts in the workplace. The reliability and validity of this tool were determined through a pilot study and a panel of experts. Data were analyzed using descriptive statistics (frequencies and percentages) and inferential statistics (mean of scores).
Results: The findings of the study show that the standards for exposure to chemical materials and the standards for exposure to dusts that are applied in the workplace can be adopted as national standards. Accordingly, there is no significant impact of occupational hazards on workers or the work environment as a result of applying these standards.
Recommendations: The study recommends that awareness, training, and health education programs be provided for all workers regularly and periodically in order to help them comply with the standards for exposure to chemical materials and dusts, so as to avoid hazards affecting their health and the work environment.
In this work, a simple new method is proposed to simultaneously improve the physical-layer security and the transmission performance of the optical orthogonal frequency division multiplexing (OFDM) system by combining the OFDM technique with principles of chaos theory. In the system, a 2-D chaotic map is employed. The introduced system replaces complex operations such as matrix multiplication with simple operations such as multiplexing and inverting. The system performance in terms of bit error rate (BER) and peak-to-average power ratio (PAPR) is enhanced. The system is simulated using OptiSystem 15 together with MATLAB 2016 for different constellations. The simulation results showed that the BER ...
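As an illustration of the scrambling idea only (the paper's exact 2-D map and system design are not reproduced above), the sketch below derives a secret permutation of the OFDM subcarriers from an assumed Hénon map and applies it before the IFFT; the map parameters, QPSK mapping, and 64-subcarrier size are illustrative assumptions.

```python
# Hedged sketch: chaotic-map scrambling of OFDM subcarriers (Henon map assumed).
import numpy as np

def henon_sequence(n, a=1.4, b=0.3, x0=0.1, y0=0.1):
    """Generate n points of a 2-D Henon chaotic map (x component only)."""
    x, y = x0, y0
    xs = np.empty(n)
    for i in range(n):
        x, y = 1.0 - a * x * x + y, b * x
        xs[i] = x
    return xs

def chaotic_permutation(n, **kwargs):
    """Turn the chaotic sequence into a subcarrier permutation via argsort."""
    return np.argsort(henon_sequence(n, **kwargs))

n_sc = 64                                    # number of subcarriers (assumed)
bits = np.random.randint(0, 2, 2 * n_sc)     # random payload bits
qpsk = (1 - 2 * bits[0::2]) + 1j * (1 - 2 * bits[1::2])   # simple QPSK mapping

perm = chaotic_permutation(n_sc)             # secret permutation from the chaotic key
scrambled = qpsk[perm]                       # apply the secret permutation
tx = np.fft.ifft(scrambled)                  # OFDM modulation (IFFT)

# A receiver holding the same key inverts the permutation after the FFT.
rx = np.fft.fft(tx)
descrambled = np.empty_like(rx)
descrambled[perm] = rx
assert np.allclose(descrambled, qpsk)
```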
In the present research, a crane frame has been investigated using the finite element method. Damage is simulated by reducing the stiffness of selected elements by ratios of (10% and 20%) at the mid-span of the vertical column of the crane frame. A cracked beam with a single-edge, non-propagating crack has been used. Six damage cases are modeled for the crane frame by introducing cracked elements at different locations with ratios of crack depth to beam height (a/h) of 0.1 and 0.20. A FEM program coded in MATLAB 6.5 was used to model the numerical simulation of the damage scenarios. The results showed a decrease in the five natural frequencies relative to the undamaged beam, which means ...
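A minimal sketch of the kind of simulation described: an Euler-Bernoulli cantilever assembled from beam elements (standing in for the crane column), with one mid-span element's flexural stiffness reduced by 10% and 20%, and the resulting drop in the first natural frequencies observed. The section, material, and mesh values are assumptions, not the paper's data.

```python
# Hedged FE sketch: stiffness-reduction damage lowers the natural frequencies.
import numpy as np
from scipy.linalg import eigh

E, I, rho, A, Ltot, n_el = 210e9, 8.33e-6, 7850.0, 0.01, 3.0, 10   # assumed values
le = Ltot / n_el

def k_elem(scale=1.0):
    """Euler-Bernoulli beam element stiffness, optionally scaled to model damage."""
    EI, l = scale * E * I, le
    return EI / l**3 * np.array([[ 12,   6*l,  -12,   6*l],
                                 [6*l, 4*l*l, -6*l, 2*l*l],
                                 [-12,  -6*l,   12,  -6*l],
                                 [6*l, 2*l*l, -6*l, 4*l*l]])

def m_elem():
    """Consistent mass matrix of the beam element."""
    l = le
    return rho * A * l / 420 * np.array([[156,    22*l,   54,   -13*l],
                                         [22*l,  4*l*l,  13*l, -3*l*l],
                                         [54,     13*l,  156,  -22*l],
                                         [-13*l, -3*l*l, -22*l, 4*l*l]])

def frequencies(damaged_el=None, reduction=0.0, n_modes=5):
    ndof = 2 * (n_el + 1)
    K, M = np.zeros((ndof, ndof)), np.zeros((ndof, ndof))
    for e in range(n_el):
        scale = 1.0 - reduction if e == damaged_el else 1.0
        dofs = np.arange(2 * e, 2 * e + 4)
        K[np.ix_(dofs, dofs)] += k_elem(scale)
        M[np.ix_(dofs, dofs)] += m_elem()
    free = np.arange(2, ndof)                    # clamp the root node (cantilever)
    w2 = eigh(K[np.ix_(free, free)], M[np.ix_(free, free)], eigvals_only=True)
    return np.sqrt(w2[:n_modes]) / (2 * np.pi)   # Hz

f_intact = frequencies()
for r in (0.10, 0.20):
    f_dam = frequencies(damaged_el=n_el // 2, reduction=r)
    print(f"{int(r*100)}% stiffness cut, frequency drop:", f_intact - f_dam)
```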
Today, the use of iris recognition is expanding globally as the most accurate and reliable biometric feature in terms of uniqueness and robustness. This makes the reduction or compression of the large databases of iris images an urgent requirement. In general, image compression is the process of removing insignificant or redundant information from the image details, which implicitly makes efficient use of the redundancy embedded within the image itself. In addition, it may exploit the limitations of human vision or perception to reduce imperceptible information.
This paper deals with reducing the size of the image, namely reducing the number of bits required to represent the ...
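The paper's actual compression scheme is not detailed above, so the sketch below only illustrates the trade-off it refers to: representing each pixel with fewer bits (uniform requantization) and measuring the fidelity loss with PSNR. The random stand-in image and the bit depths tried are assumptions.

```python
# Hedged sketch: fewer bits per pixel vs. reconstruction quality (PSNR).
import numpy as np

rng = np.random.default_rng(0)
img = rng.integers(0, 256, (128, 128)).astype(np.uint8)   # stand-in iris image

def requantize(img, bits):
    """Keep only 2**bits gray levels per 8-bit pixel (uniform mid-rise quantizer)."""
    step = 2 ** (8 - bits)
    return (img // step) * step + step // 2

def psnr(a, b):
    mse = np.mean((a.astype(float) - b.astype(float)) ** 2)
    return 10 * np.log10(255.0 ** 2 / mse)

for bits in (6, 4, 2):
    print(f"{bits} bits/pixel -> PSNR = {psnr(img, requantize(img, bits)):.1f} dB")
```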
Growth curves of children are among the most commonly used tools to assess the general welfare of a society, particularly since the child is one of the pillars of its development; through these tools we can track a child's growth physiology. The centile line is one of the important tools for building these curves, giving an accurate interpretation of the population information while responding to the explanatory variable, age. To build standard growth curves for BMI, we use BMI as an index. The LMSP method is used to find the centile lines; it depends on four curves representing the median, coefficient of variation, skewness, and kurtosis. These are obtained by modeling the four parameters as nonparametric smoothing functions of the explanatory variable. ...
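A hedged sketch of how centile lines are read off such curves, using only the LMS part of the model (median, coefficient of variation, and skewness); the kurtosis curve of the full LMSP model is omitted, and the smooth age curves below are toy functions rather than fitted ones.

```python
# Hedged sketch: BMI centile lines from assumed smooth L, M, S curves of age.
import numpy as np
from scipy.stats import norm

age = np.linspace(2, 18, 100)        # age in years (assumed range)
M = 15.5 + 0.35 * age                # median BMI curve (toy)
S = 0.08 + 0.002 * age               # coefficient of variation curve (toy)
L = -1.5 + 0.02 * age                # Box-Cox power / skewness curve (toy)

def centile(alpha):
    """BMI value below which 100*alpha percent of children fall (LMS formula, L != 0)."""
    z = norm.ppf(alpha)
    return M * (1.0 + L * S * z) ** (1.0 / L)

for p in (0.03, 0.50, 0.97):
    print(f"{int(p*100)}th centile at age 10:",
          round(float(np.interp(10, age, centile(p))), 2))
```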
Scheduling timetables for courses in large university departments is a very hard problem that has often been solved in previous works with only partially optimal results. This work implements the principle of an evolutionary algorithm, using genetic theory, to solve the timetabling problem and obtain a randomized, fully optimized timetable, with the ability to generate multiple candidate timetables for each stage in the college. The main idea is to generate course timetables automatically while exploring the space of constraints to obtain an optimal and flexible schedule with no redundancy through modification of a feasible course timetable. The main contribution of this work is indicated by increasing the flexibility of generating optimal ...
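A toy genetic-algorithm sketch of the approach described: chromosomes assign courses to timeslots, fitness counts hard clashes (same lecturer or same student group in one slot), and truncation selection, one-point crossover, and mutation evolve the population. The course list, constraints, and GA settings are illustrative assumptions, not the paper's encoding.

```python
# Hedged sketch: genetic algorithm for a tiny course-timetabling instance.
import random

random.seed(1)
N_SLOTS = 10
COURSES = {"C1": ("Dr.A", "G1"), "C2": ("Dr.A", "G2"), "C3": ("Dr.B", "G1"),
           "C4": ("Dr.B", "G2"), "C5": ("Dr.C", "G1"), "C6": ("Dr.C", "G2")}
NAMES = list(COURSES)

def clashes(chrom):
    """Count course pairs sharing a slot and a lecturer or a student group."""
    cost = 0
    for i in range(len(NAMES)):
        for j in range(i + 1, len(NAMES)):
            if chrom[i] == chrom[j]:
                li, gi = COURSES[NAMES[i]]
                lj, gj = COURSES[NAMES[j]]
                cost += (li == lj) + (gi == gj)
    return cost

def evolve(pop_size=40, generations=200, p_mut=0.1):
    pop = [[random.randrange(N_SLOTS) for _ in NAMES] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=clashes)
        if clashes(pop[0]) == 0:
            break
        parents = pop[:pop_size // 2]                 # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(NAMES))
            child = a[:cut] + b[cut:]                 # one-point crossover
            if random.random() < p_mut:
                child[random.randrange(len(NAMES))] = random.randrange(N_SLOTS)
            children.append(child)
        pop = parents + children
    return pop[0]

best = evolve()
print("remaining clashes:", clashes(best))
print(dict(zip(NAMES, best)))
```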
Decision making is a vital and important activity in operations research, engineering, administrative science, and economics for any industrial or service company or organization, because it is the core of the management process and improves its performance. The research addresses the decision-making process when the objective function is a fractional function, solving fractional programming models by using some fractional programming methods together with the goal programming method, with the aid of the WinQSB program. The results explain the effect of using the goal programming method on the decision-making process when the objective function is fractional.
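As a worked illustration of one standard way to handle a fractional objective (not necessarily the research's own method), the sketch below applies the Charnes-Cooper substitution y = t*x, t = 1/(d'x + beta) to turn a small linear-fractional program into an ordinary LP solved with SciPy; the numbers are made up, and the goal-programming and WinQSB parts of the research are not reproduced.

```python
# Hedged sketch: Charnes-Cooper transformation of a linear-fractional program.
import numpy as np
from scipy.optimize import linprog

# maximize (2*x1 + 3*x2) / (x1 + x2 + 1)  subject to  x1 + x2 <= 4,  x >= 0
c, d, beta = np.array([2.0, 3.0]), np.array([1.0, 1.0]), 1.0
A, b = np.array([[1.0, 1.0]]), np.array([4.0])

# variables z = (y1, y2, t): maximize c'y  s.t.  A y - b t <= 0,  d'y + beta*t = 1
obj = np.concatenate([-c, [0.0]])                  # linprog minimizes
A_ub = np.hstack([A, -b.reshape(-1, 1)])
A_eq = np.concatenate([d, [beta]]).reshape(1, -1)
res = linprog(obj, A_ub=A_ub, b_ub=[0.0], A_eq=A_eq, b_eq=[1.0],
              bounds=[(0, None)] * 3)

y, t = res.x[:2], res.x[2]
x = y / t                                          # recover the original variables
print("x* =", x, " objective =", (c @ x) / (d @ x + beta))
```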
In this paper, Volterra Runge-Kutta methods, including methods of order two and four, are applied to general nonlinear Volterra integral equations of the second kind. Moreover, we study the convergence of the Volterra Runge-Kutta algorithms. Finally, programs for each method are written in the MATLAB language, and a comparison between the two types is made based on the least-squares errors.
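The paper's VRK tableaux are not reproduced above, so the sketch below uses a plain second-order (trapezoidal) quadrature scheme as a stand-in to show how a Volterra equation of the second kind, y(t) = g(t) + ∫ K(t,s) y(s) ds, is marched forward in time; the test kernel K = 1, g = 1 with exact solution e^t is an assumed example.

```python
# Hedged sketch: trapezoidal (order-two) scheme for a Volterra equation of the 2nd kind.
import numpy as np

def solve_volterra2(K, g, T=1.0, n=100):
    h = T / n
    t = np.linspace(0.0, T, n + 1)
    y = np.empty(n + 1)
    y[0] = g(t[0])
    for i in range(1, n + 1):
        # trapezoidal weights: h/2 at the endpoints, h at interior nodes
        s = 0.5 * K(t[i], t[0]) * y[0]
        s += sum(K(t[i], t[j]) * y[j] for j in range(1, i))
        y[i] = (g(t[i]) + h * s) / (1.0 - 0.5 * h * K(t[i], t[i]))
    return t, y

t, y = solve_volterra2(lambda t, s: 1.0, lambda t: 1.0)
print("max abs error vs exp(t):", np.abs(y - np.exp(t)).max())   # O(h^2), ~2e-5 here
```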
In this work, results from an optical technique (the laser speckle technique) for measuring surface roughness were obtained using statistical properties of the speckle pattern from the point of view of computer image-texture analysis. Four calibration relationships were used to cover a wide range of measurement with the same laser speckle technique. The first is based on the intensity contrast of the speckle, the second on analysis of the speckle binary image, the third on the size of the speckle pattern spots, and the last on characterization of the energy feature of the gray-level co-occurrence matrices of the speckle pattern. By these calibration relationships, the surface roughness of an object surface can be evaluated within the ...
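A small sketch of two of the features named above, the intensity contrast of the speckle and the GLCM energy, computed on a synthetic fully developed speckle image (exponentially distributed intensity) that stands in for a real recording; the gray-level count and the horizontal GLCM offset are assumptions.

```python
# Hedged sketch: speckle intensity contrast and GLCM energy as texture features.
import numpy as np

rng = np.random.default_rng(0)
speckle = rng.exponential(scale=1.0, size=(256, 256))   # assumed test image

def intensity_contrast(img):
    """C = sigma_I / <I>; approximately 1 for fully developed speckle."""
    return img.std() / img.mean()

def glcm_energy(img, levels=32):
    """Energy (angular second moment) of a horizontal-offset co-occurrence matrix."""
    q = (img / (img.max() + 1e-12) * (levels - 1)).astype(int)
    glcm = np.zeros((levels, levels))
    np.add.at(glcm, (q[:, :-1].ravel(), q[:, 1:].ravel()), 1.0)
    p = glcm / glcm.sum()
    return float(np.sum(p ** 2))

print("contrast:", round(intensity_contrast(speckle), 3))
print("GLCM energy:", round(glcm_energy(speckle), 5))
```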
A hiding technique for dynamic text encryption using an encoding table and a symmetric encryption method (the AES algorithm) is presented in this paper. The encoding table is generated dynamically from the MSBs of the cover-image points and is used as the first phase of encryption. The Harris corner-point algorithm is applied to the cover image to generate the corner points, which are used to generate a dynamic AES key for the second phase of text encryption. The embedding process uses the LSBs of the image pixels, excluding the Harris corner points, for greater robustness. Experimental results have demonstrated that the proposed scheme achieves good embedding quality, error-free text recovery, and a high PSNR value.
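A hypothetical sketch of the main steps of such a pipeline, assuming OpenCV and PyCryptodome: Harris corners are detected, a 128-bit AES key is hashed from their coordinates (the paper's actual key-derivation rule and MSB encoding table are not given above, so these are placeholders), the text is encrypted, and the ciphertext bits are embedded in the LSBs of non-corner pixels.

```python
# Hedged sketch: Harris-corner-derived AES key plus LSB embedding (details assumed).
import cv2
import hashlib
import numpy as np
from Crypto.Cipher import AES
from Crypto.Util.Padding import pad

def harris_mask(gray, thresh=0.01):
    """Boolean mask of Harris corner pixels, which are kept free of embedding."""
    resp = cv2.cornerHarris(np.float32(gray), 2, 3, 0.04)
    return resp > thresh * resp.max()

def derive_key(mask):
    """Assumed key derivation: hash the corner coordinates into a 128-bit key."""
    return hashlib.sha256(np.argwhere(mask).tobytes()).digest()[:16]

def embed(cover, text):
    gray = cv2.cvtColor(cover, cv2.COLOR_BGR2GRAY)
    mask = harris_mask(gray)
    cipher = AES.new(derive_key(mask), AES.MODE_ECB)      # mode is an assumption
    ct = cipher.encrypt(pad(text.encode(), AES.block_size))
    bits = np.unpackbits(np.frombuffer(ct, dtype=np.uint8))

    stego = cover.copy()
    flat = stego.reshape(-1, 3)
    usable = np.flatnonzero(~mask.ravel())                # skip corner pixels
    for bit, idx in zip(bits, usable):
        flat[idx, 0] = (int(flat[idx, 0]) & 0xFE) | int(bit)   # LSB of one channel
    return stego

cover = np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8)   # stand-in cover
stego = embed(cover, "confidential message")
print("pixel values changed:", int((stego != cover).sum()))
```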
In this paper, we used frequentist and Bayesian approaches to the linear regression model to predict future observations of unemployment rates in Iraq. Parameters are estimated using the ordinary least squares method and, for the Bayesian approach, the Markov chain Monte Carlo (MCMC) method. Calculations are done using the R program. The analysis showed that the linear regression model using the Bayesian approach is better and can be used as an alternative to the frequentist approach. Two criteria, the root mean square error (RMSE) and the median absolute deviation (MAD), were used to compare the performance of the estimates. The results obtained showed that the unemployment rates will continue to increase over the next two decades.
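A minimal sketch contrasting the two approaches on assumed data (the Iraqi unemployment series is not reproduced here): ordinary least squares versus a Bayesian fit drawn with a simple random-walk Metropolis sampler standing in for the MCMC run in R, compared by RMSE and MAD.

```python
# Hedged sketch: OLS vs. Bayesian (Metropolis) linear regression on synthetic rates.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(2000, 2020)
rate = 8.0 + 0.25 * (years - 2000) + rng.normal(0, 0.8, years.size)  # assumed data
X = np.column_stack([np.ones_like(years, dtype=float), years - 2000])

# frequentist: ordinary least squares
beta_ols, *_ = np.linalg.lstsq(X, rate, rcond=None)

# Bayesian: random-walk Metropolis on (intercept, slope, log sigma), flat priors assumed
def log_post(theta):
    b0, b1, log_s = theta
    resid = rate - (b0 + b1 * (years - 2000))
    return -rate.size * log_s - 0.5 * np.sum(resid ** 2) / np.exp(log_s) ** 2

theta, lp, samples = np.array([rate.mean(), 0.0, 0.0]), None, []
lp = log_post(theta)
for _ in range(20000):
    prop = theta + rng.normal(0, [0.3, 0.03, 0.05])
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta)
beta_bayes = np.array(samples[5000:])[:, :2].mean(axis=0)   # drop burn-in

# compare the fits with RMSE and MAD
def rmse(pred): return np.sqrt(np.mean((rate - pred) ** 2))
def mad(pred):  return np.median(np.abs(rate - pred))

for name, b in [("OLS", beta_ols), ("Bayes", beta_bayes)]:
    pred = X @ b
    print(f"{name}: RMSE={rmse(pred):.3f}  MAD={mad(pred):.3f}")
```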