Lasers, thanks to their excellent beam quality, especially their directionality and coherence, are a key tool for many processes that demand high precision. They integrate readily with automated systems, which provides the flexibility to reach zones that are difficult to access. In addition, as a processing tool, a laser can be regarded as a contact-free tool with a precise tip, which has made it attractive for high-precision machining of different materials at the micro- and nanoscale. These advantages may not be enough, however, unless the laser technician or engineer has sufficient knowledge of the mechanism of interaction between the laser light and the processed material. Several sequential phenomena occur when an intense laser beam is incident on the surface of a material: heating, melting, vaporization and plasma formation are all present in the normal interaction of an intense laser beam with matter. These may be followed by additional events such as acoustic and optical emissions, shockwaves, thermal effects, structural defects and residual stresses. The process is governed by many variables that can shift the interaction towards very different behavior, in particular towards colder interactions with fewer side effects, which yield precise features in the processed material. The most crucial variables are the time scale of the interaction and the laser wavelength, relative to the properties of the processed material, as well as the laser fluence. The objective of this chapter is to introduce the fundamental physical and mathematical concepts of laser-matter interaction and its dependence on the different time-scale regimes. Interaction with short and ultra-short laser pulses has attracted significant interest in industry because of its large impact on micro-/nanomachining applications.
The research shows the importance of orientation towards the formulation of a green strategy and its effect in determining the behavior of the green municipal institution in Babel governorate. It highlights green-strategy formulation as an important variable, especially today, given the trend towards protecting the environment and minimizing the damage resulting from the delivery of services, and, through it, the type of green behavior or performance adopted by the municipal institution and the emerging need for a strategy that does not harm the environment. The research drew an intentionally comprehensive sample of 222 personnel of municipal institutions and some formations concerned with t
In the 1980s, the French roads administration (LCPC) developed high modulus mixtures (EME) by using hard binder. This type of mixture showed good resistance to moisture damage and improved mechanical properties for asphalt mixtures, including high modulus, good fatigue behaviour and excellent resistance to rutting. In Iraq, this type of mixture has not been used yet. The main objective of this research is to evaluate the performance of high modulus mixtures and to compare them with the conventional mixture. To achieve this objective, asphalt concrete mixes were prepared and then tested to evaluate their engineering properties, which include moisture damage, resilient modulus, permanent deformation and fatigue characteristics. These pro
An integrated GIS-VBA (Geographical Information System – Visual Basic for Applications) model is developed for selecting the optimum water-harvesting dam location among the available locations in a watershed. The proposed model allows quick and precise estimation of an adopted weighted objective function for each selected location. In addition, for each location a different dam height is used as a nominee for the optimum selection. The VBA model includes an optimization model with a weighted objective function covering beneficial (positive) items, such as the available storage, the dam height allowed by the site as an indicator of the potential for hydroelectric power generation, and the rainfall rate as a source of water. In a
This work evaluates the economic feasibility of various production scenarios for the Zubair reservoir in the Kifl oil field using cash flow and net present value (NPV) calculations. The Kifl field is an exploratory field that has not yet been developed or assessed economically. The first well was drilled in 1960, and three other wells were later drilled to assess the oil accumulation; in this research, different production scenarios were therefore evaluated economically. These scenarios were proposed based on the reservoir model of the Zubair formation in the field. The research methodology used QUE$TOR software to estimate capital expenditures (CapEx) and operating expenditures (OpEx) based on field-level data, production prof
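The NPV calculation underpinning this kind of scenario screening discounts every yearly cash flow back to time zero and sums the result. A minimal sketch with entirely hypothetical figures (not data from the Kifl field study):

```python
def npv(cash_flows, rate):
    # Discount each yearly cash flow to time zero and sum.
    # cash_flows[0] occurs at time zero (e.g. initial CapEx, negative).
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical scenario: CapEx now, then three years of net revenue, 10% discount rate.
flows = [-100.0, 60.0, 60.0, 60.0]
print(round(npv(flows, 0.10), 2))  # a positive NPV favours the scenario
```

A scenario with NPV above zero at the chosen discount rate is economically attractive; competing scenarios are ranked by their NPVs.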
The purpose of the present work is to study the relationship between the deformed shape of the nucleus and the radioactivity of nuclei for the Uranium-238 and Thorium-232 series. To achieve this, we calculated the quadrupole deformation parameter (β2) and the eccentricity (e) and followed the radioactive series against the change of β2 and (e) as indicators of the change in nuclear shape with radioactivity. To obtain the value of the quadrupole deformation parameter (β2), the adopted value of the quadrupole transition probability B(E2; 0+ → 2+) was calculated from the Global Best Fit equation, while the eccentricity (e) was calculated from the values of the minor and major ellipsoid axes (a and b). From the results, it is obvi
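The eccentricity step of the abstract follows directly from the two semi-axes: for a semi-major axis a and semi-minor axis b, e = sqrt(a² − b²)/a, so a sphere (a = b) has e = 0. A minimal sketch with illustrative axis values, not the nuclear data of the study:

```python
import math

def eccentricity(a, b):
    # a: semi-major axis, b: semi-minor axis of the ellipsoid cross-section (a >= b)
    return math.sqrt(a**2 - b**2) / a

print(eccentricity(5.0, 3.0))  # elongated ellipse
print(eccentricity(2.0, 2.0))  # spherical limit, e = 0
```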
The current research aims to adopt production-quality decisions as the most important decisions, because they are accompanied by customer satisfaction, through monitoring the quality of drinking water in Iraq, which is delivered through the pipeline network associated with the water-treatment projects of the Tigris and Euphrates rivers. One of the quality-control indicators used was the C-chart, drawn by specifying the centre line and the upper and lower control limits, diagnosing whether the production system as a whole falls within the scope of quality control, and determining the strength and significance of the correlation between the quantities of water and the actual needs of customers. The research has reached a number o
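The C-chart mentioned above plots the count of nonconformities per inspected sample against a centre line c̄ and 3-sigma limits c̄ ± 3√c̄, using the Poisson assumption that the variance equals the mean. A minimal sketch with made-up counts, not the water-quality data of the study:

```python
def c_chart_limits(counts):
    # counts: number of nonconformities observed in each inspected sample
    c_bar = sum(counts) / len(counts)   # centre line
    sigma = c_bar ** 0.5                # Poisson model: std dev = sqrt(mean)
    ucl = c_bar + 3 * sigma
    lcl = max(0.0, c_bar - 3 * sigma)   # a count cannot be negative
    return lcl, c_bar, ucl

lcl, cl, ucl = c_chart_limits([3, 5, 2, 6, 4])
print(lcl, cl, ucl)  # samples outside [lcl, ucl] signal an out-of-control process
```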
This paper discusses the use of H2 and H∞ robust control approaches for designing control systems. These approaches are applied to elementary control-system designs, and their respective implementations, advantages and drawbacks are introduced. The H∞ control synthesis mainly enforces closed-loop stability while covering some physical constraints and limitations, whereas noise rejection and disturbance attenuation are more naturally expressed as performance optimization, which the H2 control synthesis problem can represent. The paper also applies these two methodologies to multi-plant systems to study the stability and performance of the designed controllers. Simulation results show that the H2 controller tracks a desirable cl
The common normal distribution was transformed, through the generated Kummer Beta model, into the Kummer Beta Generalized Normal Distribution (KBGND). The distribution parameters and hazard function were then estimated using the maximum likelihood (MLE) method, and these estimates were improved by employing a genetic algorithm. Simulation was used, assuming a number of models and different sample sizes. The main finding was that the common maximum likelihood (MLE) method is the best in estimating the parameters of the KBGND, according to the Mean Squared Error (MSE) and Integrated Mean Squared Error (IMSE) criteria, in estimating the hazard function. While the pr
Region-based association analysis has been proposed to capture the collective behavior of sets of variants by testing the association of each set, instead of individual variants, with the disease. Such an analysis typically involves a list of unphased multiple-locus genotypes with potentially sparse frequencies in cases and controls. To tackle the problem of this sparse distribution, a two-stage approach was proposed in the literature: in the first stage, haplotypes are computationally inferred from genotypes, followed by a haplotype coclassification; in the second stage, the association analysis is performed on the inferred haplotype groups. If a haplotype is unevenly distributed between the case and control samples, this haplotype is labeled