Fractal image compression offers desirable properties such as fast decoding and good rate-distortion curves, but suffers from a long encoding time. Fractal compression requires partitioning the image into range blocks. In this work, we introduce an improved partitioning process based on a merge approach, exploiting the fact that some range blocks are connected to others. This paper presents a method that reduces the encoding time of the technique by reducing the number of range blocks, based on computing statistical measures between them. Experimental results on standard images show that the proposed method decreases the encoding time while keeping the visual quality acceptable.
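The abstract does not specify which statistical measures drive the merge; a minimal sketch, assuming block mean and variance are the measures and with hypothetical similarity thresholds:

```python
import numpy as np

def block_stats(block):
    """Mean and variance of one range block."""
    return float(np.mean(block)), float(np.var(block))

def similar(b1, b2, mean_tol=2.0, var_tol=5.0):
    """Two range blocks are merge candidates when their statistics are close
    (tolerances are illustrative, not from the paper)."""
    m1, v1 = block_stats(b1)
    m2, v2 = block_stats(b2)
    return abs(m1 - m2) <= mean_tol and abs(v1 - v2) <= var_tol

# Toy 8x8 image split into four 4x4 range blocks.
img = np.zeros((8, 8))
img[:, 4:] = 100.0          # right half bright, left half dark
blocks = [img[r:r+4, c:c+4] for r in (0, 4) for c in (0, 4)]

# Merge pass: count pairs of adjacent blocks with matching statistics;
# each merged pair is one fewer range block to search during encoding.
merged_pairs = sum(similar(blocks[i], blocks[j])
                   for i, j in [(0, 1), (2, 3), (0, 2), (1, 3)])
```

Fewer surviving range blocks means fewer range-domain comparisons, which is where the claimed encoding-time reduction would come from.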
Dust storms are typical in arid and semi-arid regions such as the Middle East; the frequency and severity of dust storms have grown dramatically in Iraq in recent years. This paper identifies dust storm sources in Iraq using remotely sensed data from the Meteosat Spinning Enhanced Visible and Infrared Imager (SEVIRI) bands. Composite satellite images and frontal dust storm trajectories simulated with the Hybrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT) model are used to identify the most influential sources in the Middle East and Iraq. Of 132 dust storms in Iraq during 2020–2023, the most frequent occurred in spring and summer. A dust source frequency percentage map (DSFPM) is generated using ArcGIS so
Tight reservoirs have attracted the interest of the oil industry in recent years owing to their significant impact on global oil production. Several challenges arise when producing from these reservoirs because of their low to extra-low permeability and very narrow pore-throat radii. Selecting a development strategy for these reservoirs, covering horizontal well placement, hydraulic fracture design, well completion, a smart production program, and wellbore stability, requires accurate characterization of their geomechanical parameters. Geomechanical properties, including uniaxial compressive strength (UCS), static Young’s modulus (Es), and Poisson’s ratio (υs), were measured experimentally using both static and dynamic methods
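Dynamic elastic properties are conventionally derived from bulk density and acoustic velocities via the standard isotropic elasticity relations, then correlated to the static laboratory values; a minimal sketch (the input rock values below are illustrative, not from the study):

```python
def dynamic_moduli(rho, vp, vs):
    """Dynamic Poisson's ratio (dimensionless) and Young's modulus (Pa)
    from bulk density rho (kg/m^3) and P-/S-wave velocities vp, vs (m/s),
    using the standard isotropic elasticity relations."""
    nu = (vp**2 - 2*vs**2) / (2*(vp**2 - vs**2))
    E = rho * vs**2 * (3*vp**2 - 4*vs**2) / (vp**2 - vs**2)
    return nu, E

# Hypothetical tight-rock log readings.
nu_d, E_d = dynamic_moduli(rho=2500.0, vp=4000.0, vs=2500.0)
```

The returned dynamic values are what empirical correlations then convert to the static Es and υs needed for wellbore-stability and fracture-design work.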
This paper investigated the treatment of textile wastewater polluted with aniline blue (AB) by an electrocoagulation process using stainless steel mesh electrodes in a horizontal arrangement. The experimental design applied response surface methodology (RSM) to find the mathematical model by adjusting the current density (4-20 mA/cm2), distance between electrodes (0.5-3 cm), salt concentration (50-600 mg/l), initial dye concentration (50-250 mg/l), pH value (2-12), and experimental time (5-20 min). The results showed that time is the most important parameter affecting the performance of the electrocoagulation system. Maximum removal efficiency (96%) was obtained at a current density of 20 mA/cm2, distance be
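RSM fits a low-order polynomial to the response over the design factors and then optimizes over it; a minimal sketch with just two of the six factors (current density j and time t) and a synthetic, hypothetical removal-efficiency surface:

```python
import numpy as np

# Hypothetical design points inside the stated ranges: current density
# j (mA/cm^2) and time t (min), with a synthetic noiseless response.
rng = np.random.default_rng(0)
j = rng.uniform(4, 20, 30)
t = rng.uniform(5, 20, 30)
y = 40 + 1.5*j + 2.0*t - 0.04*j*t   # assumed "true" surface, for illustration

# Second-order RSM design matrix: 1, j, t, j*t, j^2, t^2.
X = np.column_stack([np.ones_like(j), j, t, j*t, j**2, t**2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # least-squares coefficients
pred = X @ beta
```

Because the synthetic surface lies exactly in the model's span, the fit recovers the generating coefficients; with real data the same fit yields the regression model whose stationary point RSM then interrogates.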
In developing countries, conventional physico-chemical methods are commonly used for removing contaminants. These methods are inefficient and very costly. However, a new in-situ strategy with high treatment efficiency and low operating cost, the constructed wetland (CW), has been developed. In this study, Phragmites australis was used in a free-surface batch system to estimate its ability to remediate total petroleum hydrocarbons (TPH) and chemical oxygen demand (COD) from Al-Daura refinery wastewater. The system operated in semi-batch mode: new wastewater was added to the plants weekly for 42 days. The results showed high removal percentages of 98% for TPH and 62.3% for COD. Additionally, Phragmites australis biomass increased significantly
The question of estimation has attracted great interest in engineering, statistical applications, and various applied and human sciences; its methods help to characterize many random processes accurately.
In this paper, methods were used to estimate the reliability function, the hazard (risk) function, and the distribution parameters, namely the moment method and the maximum likelihood method. An experimental study was conducted using simulation to compare the methods and show which is most suitable in practical application, based on observations generated from the Rayleigh logarithmic (RL) distribution with various sample sizes
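For the ordinary Rayleigh distribution (the paper's logarithmic variant is not detailed in the abstract, so the closed forms below are for the standard scale parameter only), both estimators and the two functions being estimated can be sketched as:

```python
import math

def mle_sigma(sample):
    """Maximum-likelihood estimate of the Rayleigh scale:
    sigma^2 = sum(x^2) / (2n)."""
    n = len(sample)
    return math.sqrt(sum(x*x for x in sample) / (2*n))

def moment_sigma(sample):
    """Method-of-moments estimate, from E[X] = sigma*sqrt(pi/2)."""
    xbar = sum(sample) / len(sample)
    return xbar * math.sqrt(2 / math.pi)

def reliability(t, sigma):
    """Rayleigh reliability function R(t) = exp(-t^2 / (2 sigma^2))."""
    return math.exp(-t*t / (2*sigma*sigma))

def hazard(t, sigma):
    """Rayleigh hazard (risk) function h(t) = t / sigma^2."""
    return t / (sigma*sigma)
```

A simulation comparison like the paper's would generate samples of each size, apply both estimators, and tally mean squared errors of the resulting reliability estimates.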
Coronavirus disease (COVID-19) is an acute disease that affects the respiratory system; it initially appeared in Wuhan, China. In early 2020 the sickness began to spread swiftly throughout the entire planet, causing significant health, social, and economic problems. Time series analysis is an important statistical method used to study a particular phenomenon, identify its pattern and factors, and predict future values. The main focus of the research is the study of SARIMA, NARNN, and hybrid models, on the expectation that the series comprises both linear and non-linear components, so that the ARIMA model can handle the linear component and the NARNN model the non-linear component. The models
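The hybrid decomposition (linear model on the series, non-linear model on its residuals) can be sketched with simple stand-ins for both stages: a least-squares AR(1) fit in place of a full SARIMA, and a single non-linear feature fit in place of a trained NARNN; all data below are synthetic:

```python
import numpy as np

# Synthetic series with a linear AR part plus a non-linear term.
rng = np.random.default_rng(1)
n = 200
y = np.zeros(n)
for k in range(1, n):
    y[k] = 0.7*y[k-1] + 0.3*np.sin(y[k-1]) + 0.05*rng.standard_normal()

# Stage 1 (linear stand-in for ARIMA): least-squares AR(1) coefficient.
phi = np.dot(y[:-1], y[1:]) / np.dot(y[:-1], y[:-1])
resid = y[1:] - phi*y[:-1]          # what the linear model cannot explain

# Stage 2 (non-linear stand-in for NARNN): fit residuals on sin(lag).
g = np.sin(y[:-1])
w = np.dot(g, resid) / np.dot(g, g)

hybrid_pred = phi*y[:-1] + w*g      # linear part + non-linear correction
rmse_linear = float(np.sqrt(np.mean(resid**2)))
rmse_hybrid = float(np.sqrt(np.mean((y[1:] - hybrid_pred)**2)))
```

Stage 2 is a least-squares projection of the linear residuals, so the hybrid in-sample error can never exceed the linear one, which is the motivation for hybridizing.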
This research studies dimension-reduction methods that overcome the curse of dimensionality, which arises when traditional methods fail to provide good parameter estimates, so the problem must be dealt with directly. Two approaches were used to handle high-dimensional data: the non-classical sliced inverse regression (SIR) method together with a proposed weighted standard SIR (WSIR), and principal component analysis (PCA), the standard method for dimension reduction. Both SIR and PCA are based on linear combinations of a subset of the original explanatory variables, which may suffer from the problem of heterogeneity and the problem of linear
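A minimal sliced inverse regression, assuming the basic formulation (whiten X, average the standardized predictors within slices of sorted y, and take leading eigenvectors of the between-slice covariance); the weighted WSIR variant is not reproduced here, and the data are synthetic:

```python
import numpy as np

def sir_directions(X, y, n_slices=5, n_dirs=1):
    """Basic sliced inverse regression: leading effective
    dimension-reduction directions."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / n
    evals, evecs = np.linalg.eigh(cov)
    inv_sqrt = evecs @ np.diag(evals**-0.5) @ evecs.T   # cov^(-1/2)
    Z = Xc @ inv_sqrt                                    # whitened predictors
    slices = np.array_split(np.argsort(y), n_slices)     # slice on sorted y
    M = np.zeros((p, p))                                 # between-slice covariance
    for s in slices:
        m = Z[s].mean(axis=0)
        M += (len(s) / n) * np.outer(m, m)
    _, v = np.linalg.eigh(M)                             # ascending eigenvalues
    dirs = inv_sqrt @ v[:, ::-1][:, :n_dirs]             # back to original scale
    return dirs / np.linalg.norm(dirs, axis=0)

# Synthetic check: y depends on X only through the first coordinate,
# so SIR should recover a direction close to (1, 0, 0).
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 3))
y = X[:, 0] + 0.1 * rng.standard_normal(500)
d = sir_directions(X, y)
```

Unlike PCA, which ignores y entirely, the slicing makes the recovered directions supervised by the response.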
This paper deals with estimation of the reliability function and one shape parameter of the two-parameter Burr-XII distribution, when the other shape parameter is known (taking the values 0.5, 1, 1.5) with an initial value of 1 for the estimated parameter, while different sample sizes (n = 10, 20, 30, 50) are used. The results depend on an empirical study: simulation experiments are applied to compare the four methods of estimation, as well as computing the reliability function. The mean square error results indicate that the jackknife estimator is better than the other three estimators, for all sample sizes and parameter values
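With one Burr-XII shape parameter c known, the MLE of the other shape k has a closed form, and a jackknife bias correction can be layered on top of it; a minimal sketch (which of the paper's four estimators these correspond to is an assumption):

```python
import math

def mle_k(sample, c):
    """MLE of the Burr-XII shape k when c is known:
    k = n / sum(ln(1 + x^c))."""
    n = len(sample)
    return n / sum(math.log1p(x**c) for x in sample)

def jackknife_k(sample, c):
    """Jackknife bias-corrected estimate:
    n*k_full - (n-1)*mean(leave-one-out estimates)."""
    n = len(sample)
    k_full = mle_k(sample, c)
    leave_one = [mle_k(sample[:i] + sample[i+1:], c) for i in range(n)]
    return n*k_full - (n - 1)*sum(leave_one)/n

def reliability(t, c, k):
    """Burr-XII reliability function R(t) = (1 + t^c)^(-k)."""
    return (1 + t**c)**(-k)
```

A simulation study like the paper's would plug each estimator of k into `reliability` and compare mean squared errors across the sample sizes.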
Bootstrap is an important re-sampling technique which has attracted the attention of researchers recently. The presence of outliers in the original data set may cause serious problems for the classical bootstrap when the percentage of outliers in a resample is higher than in the original data. Many methods have been proposed to overcome this problem, such as the Dynamic Robust Bootstrap for LTS (DRBLTS) and the Weighted Bootstrap with Probability (WBP). This paper tries to show the accuracy of parameter estimation by comparing the results of both methods. The bias, MSE, and RMSE are considered. The criterion of accuracy is based on the RMSE value, since the method that provides the smaller RMSE is considered the more accurate.
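The classical resampling scheme itself is simple; a minimal sketch (this is the plain bootstrap, not DRBLTS or WBP, which add robust weighting) that also illustrates why an outlier inflates a non-robust statistic's bootstrap variability far more than a robust one's:

```python
import random

def bootstrap_se(sample, stat, B=1000, seed=0):
    """Classical bootstrap standard error: resample with replacement
    B times and take the standard deviation of the statistic."""
    rng = random.Random(seed)
    n = len(sample)
    reps = [stat([sample[rng.randrange(n)] for _ in range(n)])
            for _ in range(B)]
    mean_rep = sum(reps) / B
    return (sum((r - mean_rep)**2 for r in reps) / (B - 1))**0.5

def mean(xs):
    return sum(xs) / len(xs)

def median(xs):
    s = sorted(xs)
    n = len(s)
    return s[n//2] if n % 2 else (s[n//2 - 1] + s[n//2]) / 2

# Hypothetical data: one gross outlier contaminates the sample.
clean = [9.8, 10.1, 10.0, 9.9, 10.2, 10.0, 9.7, 10.3]
contaminated = clean[:-1] + [60.0]

se_mean = bootstrap_se(contaminated, mean)
se_median = bootstrap_se(contaminated, median)   # robust: barely affected
```

The inflated spread of the resampled mean is exactly the failure mode the robust bootstrap variants are designed to suppress.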
In light of the increasing interest in child-rearing in nurseries and kindergartens, the most important experiences the child gains at this stage form the basis for the subsequent stages of her/his physical, mental, and social growth.
The significance of the research centres on the need to assess the variables affecting child growth, so as to create opportunities for her/him to have a sound upbringing.
The research also aims to classify these variables at each age level and highlight their significant role.
The problem of the research is the lack of clarity about the impact of different variables on child growth at different age levels in nurseries and kindergartens