In the reverse engineering approach, a massive amount of point data is gathered during data acquisition, which leads to larger file sizes and longer data-handling times. In addition, fitting surfaces to these data points is time-consuming and demands particular skills. In the present work, a method for obtaining the control points of any profile is presented. Several image-modification steps are explained using the SolidWorks program, and a parametric equation of the proposed profile is derived using the Bezier technique with the adopted control points. Finally, the proposed profile was machined on a 3-axis CNC milling machine, and a dimensional comparison between the machined and original parts was carried out to verify the proposed method.
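The Bezier construction used to derive the parametric profile equation can be sketched as follows, assuming the Bernstein-polynomial form; the cubic control points below are hypothetical illustrations, not the ones adopted in the paper.

```python
import numpy as np
from math import comb

def bezier(control_points, t):
    """Evaluate a Bezier curve defined by its control points at parameter
    values t in [0, 1], using the Bernstein polynomial form."""
    P = np.asarray(control_points, dtype=float)
    n = len(P) - 1
    t = np.atleast_1d(t)
    # Bernstein basis: B_{i,n}(t) = C(n, i) * t^i * (1 - t)^(n - i)
    basis = np.array([comb(n, i) * t**i * (1 - t)**(n - i) for i in range(n + 1)])
    return basis.T @ P  # one (x, y) point per parameter value

# Hypothetical control points for a profile-like curve (not from the paper)
ctrl = [(0.0, 0.0), (0.3, 0.8), (0.7, 0.9), (1.0, 0.0)]
curve = bezier(ctrl, np.linspace(0.0, 1.0, 50))
```

A Bezier curve always interpolates its first and last control points, which makes the endpoints of the machined profile easy to anchor.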
Dust storms are typical in arid and semi-arid regions such as the Middle East; the frequency and severity of dust storms have grown dramatically in Iraq in recent years. This paper identifies dust storm sources in Iraq using remotely sensed data from the Meteosat Spinning Enhanced Visible and Infrared Imager (SEVIRI) bands. Extracted combined satellite images and simulated frontal dust storm trajectories, obtained with the Hybrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT) model, are used to identify the most influential sources in the Middle East and Iraq. Out of 132 dust storms in Iraq during 2020–2023, the most frequent occurred in the spring and summer. A dust source frequency percentage map (DSFPM) is generated using ArcGIS so
Tight reservoirs have attracted the interest of the oil industry in recent years owing to their significant impact on global oil production. Several challenges arise when producing from these reservoirs because of their low to ultra-low permeability and very narrow pore-throat radii. Selecting a development strategy for these reservoirs, including horizontal well placement, hydraulic fracture design, well completion, smart production programs, and wellbore stability, requires accurate characterization of their geomechanical parameters. Geomechanical properties, including uniaxial compressive strength (UCS), static Young's modulus (Es), and Poisson's ratio (υs), were measured experimentally using both static and dynamic methods.
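The dynamic method referred to above conventionally derives elastic moduli from acoustic wave velocities and bulk density. A minimal sketch using the standard isotropic-elasticity relations follows; the input velocities and density are illustrative values, not the paper's measurements, and any static-to-dynamic correlation the authors used is not reproduced here.

```python
import numpy as np

def dynamic_moduli(vp, vs, rho):
    """Dynamic Poisson's ratio and Young's modulus from compressional (vp)
    and shear (vs) wave velocities [m/s] and bulk density rho [kg/m^3]."""
    vp2, vs2 = vp**2, vs**2
    nu = (vp2 - 2.0 * vs2) / (2.0 * (vp2 - vs2))            # dynamic Poisson's ratio
    E = rho * vs2 * (3.0 * vp2 - 4.0 * vs2) / (vp2 - vs2)   # dynamic Young's modulus [Pa]
    return nu, E

# Illustrative sonic-log readings (not data from the paper)
nu_d, E_d = dynamic_moduli(vp=4500.0, vs=2600.0, rho=2650.0)
```

For physically meaningful rocks the returned Poisson's ratio lies between 0 and 0.5, which is a quick sanity check on log inputs.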
This paper investigated the treatment of textile wastewater polluted with aniline blue (AB) by an electrocoagulation process using stainless-steel mesh electrodes in a horizontal arrangement. The experimental design applied response surface methodology (RSM) to obtain a mathematical model by adjusting the current density (4–20 mA/cm2), the distance between electrodes (0.5–3 cm), the salt concentration (50–600 mg/L), the initial dye concentration (50–250 mg/L), the pH value (2–12), and the experimental time (5–20 min). The results showed that time is the most important parameter affecting the performance of the electrocoagulation system. Maximum removal efficiency (96%) was obtained at a current density of 20 mA/cm2, distance be
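The RSM fitting step can be sketched as a least-squares fit of a second-order polynomial in coded factor levels. The two factors and the synthetic response below are purely illustrative stand-ins, not the paper's six-factor design or its fitted model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic two-factor design: x1 = current density, x2 = time (coded -1..1).
# The "true" surface below is illustrative, not the paper's fitted model.
x1, x2 = [a.ravel() for a in np.meshgrid(np.linspace(-1, 1, 5), np.linspace(-1, 1, 5))]
y = 80 + 6*x1 + 9*x2 - 3*x1*x2 - 2*x1**2 - 4*x2**2 + rng.normal(0, 0.5, x1.size)

# Second-order RSM design matrix: intercept, linear, interaction, quadratic terms
X = np.column_stack([np.ones_like(x1), x1, x2, x1*x2, x1**2, x2**2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)  # least-squares model coefficients
```

The fitted `beta` recovers the generating coefficients up to the noise level, which is the essence of how RSM turns designed experiments into a predictive removal-efficiency model.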
In developing countries, conventional physico-chemical methods are commonly used for removing contaminants, but these methods are inefficient and very costly. However, a new in-situ strategy with high treatment efficiency and low operating cost, the constructed wetland (CW), has been developed. In this study, Phragmites australis was used in a free-surface batch system to assess its ability to remediate total petroleum hydrocarbons (TPH) and chemical oxygen demand (COD) from Al-Daura refinery wastewater. The system operated in semi-batch mode; fresh wastewater was added to the plants weekly for 42 days. The results showed high removal percentages of 98% for TPH and 62.3% for COD. Additionally, Phragmites australis biomass increased significantly
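The removal percentages quoted above follow the usual influent/effluent definition. A one-line sketch, with illustrative concentrations rather than the study's measured values:

```python
def removal_efficiency(c_in, c_out):
    """Percent removal from influent (c_in) and effluent (c_out)
    concentrations, both in the same units (e.g. mg/L)."""
    return 100.0 * (c_in - c_out) / c_in

# Illustrative concentrations (not the study's measured values)
tph_removal = removal_efficiency(c_in=500.0, c_out=10.0)
```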
The question of estimation has received great interest in engineering, statistical applications, and various applied and human sciences, because the methods it provides help to characterize many random processes accurately.
In this paper, two methods are used to estimate the reliability function, the hazard (risk) function, and the distribution parameters: the method of moments and the maximum likelihood method. An empirical study was conducted using simulation to compare the methods and show which of them is best suited for practical application, based on observations generated from the Rayleigh-logarithmic (RL) distribution with various sample sizes
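The exact form of the Rayleigh-logarithmic compound distribution is not given in this excerpt, so the two estimation methods can only be sketched on the plain Rayleigh distribution as a simpler stand-in; the closed forms below (MLE, method of moments, reliability function) are standard for that case, and the simulated sample is illustrative.

```python
import numpy as np

def rayleigh_mle(x):
    """Maximum likelihood estimate of the Rayleigh scale: sigma^2 = sum(x^2)/(2n)."""
    return np.sqrt(np.sum(x**2) / (2.0 * len(x)))

def rayleigh_moment(x):
    """Method-of-moments estimate, from E[X] = sigma * sqrt(pi/2)."""
    return np.mean(x) * np.sqrt(2.0 / np.pi)

def reliability(t, sigma):
    """Rayleigh reliability (survival) function R(t) = exp(-t^2 / (2 sigma^2))."""
    return np.exp(-t**2 / (2.0 * sigma**2))

# Simulated sample, mirroring the paper's simulation-based comparison
rng = np.random.default_rng(1)
sample = rng.rayleigh(scale=2.0, size=500)
```

Running both estimators on many simulated samples and comparing mean squared errors is exactly the comparison protocol the abstract describes.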
Coronavirus disease (COVID-19) is an acute disease that affects the respiratory system and initially appeared in Wuhan, China. In February 2020 the disease began to spread swiftly across the entire planet, causing significant health, social, and economic problems. Time series analysis is an important statistical method used to study a particular phenomenon, identify its pattern and driving factors, and predict future values. The main focus of the research is the study of SARIMA, NARNN, and hybrid models, under the expectation that the series comprises both linear and non-linear components, so that the SARIMA model can handle the linear component and the NARNN model the non-linear component. The models
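The hybrid idea, fitting a linear model first and then a nonlinear model on its residuals, can be sketched with simple stand-ins: an AR(1) least-squares fit in place of SARIMA and a quadratic residual model in place of the NARNN network. Everything below, including the synthetic series, is illustrative only.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic series with a linear autoregressive part plus a nonlinear term
n = 400
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.7 * y[t-1] + 0.3 * np.sin(y[t-1]) + rng.normal(0, 0.1)

# Stage 1 (linear, stand-in for SARIMA): fit AR(1) by least squares
X = y[:-1]
phi = (X @ y[1:]) / (X @ X)
resid = y[1:] - phi * X                     # what the linear stage cannot explain

# Stage 2 (nonlinear, stand-in for NARNN): quadratic model of the residual
# as a function of the lagged series value
A = np.column_stack([np.ones_like(X), X, X**2])
coef, *_ = np.linalg.lstsq(A, resid, rcond=None)

# Hybrid one-step fit = linear part + nonlinear correction
y_hat = phi * X + A @ coef
```

Because the second stage is fitted to the first stage's residuals, the hybrid in-sample error can never exceed the linear model's error, which is the motivation the abstract gives for combining the two.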
This research studies dimension-reduction methods that overcome the curse of dimensionality, which arises when traditional methods fail to provide good parameter estimates, so the problem must be addressed directly. Two approaches were used to handle high-dimensional data: the non-classical sliced inverse regression (SIR) method together with the proposed weighted standard SIR (WSIR) method, and principal component analysis (PCA), the general method used for reducing dimensions. Both SIR and PCA are based on forming linear combinations of a subset of the original explanatory variables, which may suffer from the problem of heterogeneity and the problem of linear
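A minimal sketch of the two baseline reductions can clarify the contrast: PCA chooses directions from the predictors alone, while SIR uses the response to find directions. The proposed WSIR weighting is the paper's contribution and is not reproduced here; the data below are synthetic.

```python
import numpy as np

def pca_directions(X, k):
    """Top-k principal directions via SVD of the centered data (unsupervised)."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Vt[:k].T

def sir_directions(X, y, k, n_slices=10):
    """Sliced inverse regression: eigen-directions of the covariance of
    slice-wise means of the standardized predictors (supervised)."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / n
    evals, evecs = np.linalg.eigh(cov)
    inv_sqrt = evecs @ np.diag(evals**-0.5) @ evecs.T   # whitening transform
    Z = Xc @ inv_sqrt
    slices = np.array_split(np.argsort(y), n_slices)    # slice by sorted response
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    _, v = np.linalg.eigh(M)
    # Leading eigenvectors, mapped back to the original predictor scale
    return inv_sqrt @ v[:, ::-1][:, :k]

rng = np.random.default_rng(3)
X = rng.normal(size=(500, 6))
b_true = np.array([1.0, -1.0, 0, 0, 0, 0])
y = (X @ b_true)**3 + rng.normal(0, 0.1, 500)
```

On this single-index example SIR recovers the true direction even though the link function is nonlinear, which PCA cannot do since it never looks at `y`.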
This paper deals with the estimation of the reliability function and one shape parameter of the two-parameter Burr-XII distribution, when the other shape parameter is known (taking the values 0.5, 1, and 1.5) and the initial value of the estimated parameter is 1, while different sample sizes (n = 10, 20, 30, 50) are used. The results depend on an empirical study in which simulation experiments are applied to compare four methods of estimation, as well as to compute the reliability function. The mean square error results indicate that the jackknife estimator is better than the other three estimators for all sample sizes and parameter values.
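A sketch of the winning approach: when one Burr-XII shape parameter c is known, the MLE of the other shape parameter k has a closed form, and the jackknife applies a standard leave-one-out bias correction on top of it. The parameter values and sample below are illustrative, and the other three estimators compared in the paper are not reproduced.

```python
import numpy as np

def burr_k_mle(x, c):
    """MLE of the Burr-XII shape k when shape c is known:
    k_hat = n / sum(ln(1 + x^c))."""
    return len(x) / np.sum(np.log1p(x**c))

def jackknife(estimator, x):
    """Jackknife bias-corrected estimate from leave-one-out replicates:
    n*theta_hat - (n-1)*mean(leave-one-out estimates)."""
    n = len(x)
    theta = estimator(x)
    loo = np.array([estimator(np.delete(x, i)) for i in range(n)])
    return n * theta - (n - 1) * loo.mean()

def reliability(t, c, k):
    """Burr-XII reliability function R(t) = (1 + t^c)^(-k)."""
    return (1.0 + t**c) ** (-k)

# Inverse-CDF sampling from Burr-XII (illustrative parameter values)
rng = np.random.default_rng(4)
c_true, k_true = 1.5, 2.0
u = rng.uniform(size=200)
x = ((1.0 - u) ** (-1.0 / k_true) - 1.0) ** (1.0 / c_true)
```

Plugging the jackknifed shape estimate into `reliability` gives the estimated reliability curve that the simulation study compares across methods.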
The bootstrap is an important re-sampling technique that has received researchers' attention recently. The presence of outliers in the original data set may cause serious problems for the classical bootstrap, since a resample can contain a higher percentage of outliers than the original sample. Many methods have been proposed to overcome this problem, such as the Dynamic Robust Bootstrap for LTS (DRBLTS) and the Weighted Bootstrap with Probability (WBP). This paper examines the accuracy of parameter estimation by comparing the results of both methods. The bias, MSE, and RMSE are considered; the accuracy criterion is based on the RMSE value, since the method that provides a smaller RMSE than the other is considered more accurate.
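The classical bootstrap and the RMSE criterion can be sketched as follows; the robust DRBLTS and WBP variants compared in the paper are not reproduced, and the contaminated sample below is synthetic, contrasting a non-robust estimator (the mean) with a robust one (the median) to show how outliers inflate RMSE.

```python
import numpy as np

def bootstrap_rmse(x, estimator, theta_true, n_boot=2000, rng=None):
    """Classical bootstrap: resample with replacement, re-estimate, and
    summarize bias, MSE, and RMSE of the replicates against theta_true."""
    rng = rng or np.random.default_rng()
    n = len(x)
    reps = np.array([estimator(x[rng.integers(0, n, n)]) for _ in range(n_boot)])
    bias = reps.mean() - theta_true
    mse = np.mean((reps - theta_true) ** 2)
    return bias, mse, np.sqrt(mse)

# Synthetic sample with 5% outlier contamination (true location = 10)
rng = np.random.default_rng(5)
data = np.concatenate([rng.normal(10.0, 1.0, 95), rng.normal(30.0, 1.0, 5)])

_, _, rmse_mean = bootstrap_rmse(data, np.mean, 10.0, rng=rng)
_, _, rmse_med = bootstrap_rmse(data, np.median, 10.0, rng=rng)
```

By the paper's criterion the estimator with the smaller bootstrap RMSE is preferred; here the median wins because the outliers drag every resampled mean away from the true location.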
The need for robotic systems has become an urgent necessity in various fields, especially in video surveillance and live-broadcasting systems. The main goal of this work is to design and implement a rover robotic monitoring system based on a Raspberry Pi 4 Model B, which controls the overall system and displays live video using a webcam (USB camera), and which uses the You Only Look Once version 5 (YOLOv5) algorithm to detect, recognize, and display objects in real time. This deep-learning algorithm is highly accurate and fast and is implemented with Python, OpenCV, and PyTorch code and the Common Objects in Context (COCO) 2020 dataset. This robot can move in all directions and in different places, especially in