This research concerns the linear mixed model, one of the most important and widely used models for analyzing longitudinal data characterized by repeated measures. The linear mixed model is estimated by two approaches, parametric and nonparametric, which are used to estimate the conditional mean and the marginal mean. A number of models are compared to find the one that best represents mean wind speed in Iraq. The application covers 8 randomly selected meteorological stations in Iraq, from which monthly wind-speed data over ten years were taken and averaged over each month of the corresponding year, yielding clusters of 12 observations representing the mean wind speed for each station. The best models are compared using the mean square error (MSE). For the parametric approach, the model with an additional random effect (the second model) outperformed the model without the additional random effect (the first model) for all stations in general. For the nonparametric approach, the conditional local mixed model outperformed the marginal local mixed model in estimating the conditional and marginal means in general; for the marginal mean specifically, the marginal local mixed model was better for all sampled stations except the fifth, where the conditional local mixed model outperformed the marginal local mixed model.
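As an illustrative sketch only (not the paper's data or exact specification), the two parametric models can be fit in Python with statsmodels' `MixedLM`: a random-intercept model versus one with an additional random slope, compared by MSE of the fitted conditional means. All data below are simulated; names such as `station` and `speed` are placeholders.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
# Simulated data: 8 stations, 12 monthly mean wind speeds each
stations = np.repeat(np.arange(8), 12)
month = np.tile(np.arange(12), 8)
station_effect = rng.normal(0, 1.0, 8)[stations]
speed = 3.0 + 0.1 * month + station_effect + rng.normal(0, 0.3, 96)
df = pd.DataFrame({"station": stations, "month": month, "speed": speed})

# First model: random intercept per station only
m1 = smf.mixedlm("speed ~ month", df, groups=df["station"]).fit()
# Second model: additional random slope on month
m2 = smf.mixedlm("speed ~ month", df, groups=df["station"],
                 re_formula="~month").fit()

# MSE of the fitted conditional means (fixed + random effects)
mse1 = np.mean((df["speed"] - m1.fittedvalues) ** 2)
mse2 = np.mean((df["speed"] - m2.fittedvalues) ** 2)
print(mse1, mse2)
```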
This study investigates asset returns within the Iraq Stock Exchange by employing both the Fama-MacBeth regression model and the Fama-French three-factor model. The research involves the estimation of cross-sectional regressions wherein model parameters are subject to temporal variation, and the independent variables function as proxies. The dataset comprises information from the first quarter of 2010 to the first quarter of 2024, encompassing 22 publicly listed companies across six industrial sectors. The study explores methodological advancements through the application of the Single Index Model (SIM) and Kernel Weighted Regression (KWR) in both time series and cross-sectional analyses. The SIM outperformed the K
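The two-step Fama-MacBeth procedure can be sketched as follows on simulated data; the 22 firms and roughly 57 quarters mirror the sample described, but the returns and the single factor are synthetic, so the estimated premium is illustrative only.

```python
import numpy as np

rng = np.random.default_rng(1)
T, N = 57, 22                                # quarters, firms
factor = rng.normal(0.01, 0.05, T)           # single market factor
beta_true = rng.uniform(0.5, 1.5, N)
returns = 0.002 + np.outer(factor, beta_true) + rng.normal(0, 0.02, (T, N))

# Step 1: time-series regression per firm to estimate each beta
X = np.column_stack([np.ones(T), factor])
betas = np.linalg.lstsq(X, returns, rcond=None)[0][1]   # shape (N,)

# Step 2: cross-sectional regression in every period, then average
Xc = np.column_stack([np.ones(N), betas])
lam = np.array([np.linalg.lstsq(Xc, returns[t], rcond=None)[0]
                for t in range(T)])
risk_premium = lam[:, 1].mean()               # Fama-MacBeth estimate
se = lam[:, 1].std(ddof=1) / np.sqrt(T)       # Fama-MacBeth standard error
print(risk_premium, se)
```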
Human posture estimation is a crucial topic in the computer vision field and has become a research hotspot in much work related to human behavior. Human pose estimation can be understood as the problem of recognizing and connecting human key points. The paper presents an optimized symmetric spatial transformation network, designed to connect with a single-person pose estimation network, that proposes high-quality human target frames from inaccurate human bounding boxes; it introduces parametric pose non-maximal suppression to eliminate redundant pose estimates, and applies an elimination rule to remove similar poses and obtain unique human pose estimation results. The experimental results demonstrate that the proposed technique can pre
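The pose non-maximal suppression step can be illustrated with a much-simplified greedy variant (not the paper's actual parametric formulation): suppress any candidate whose mean keypoint distance to an already-kept, higher-scoring pose falls below a threshold.

```python
import numpy as np

def pose_nms(poses, scores, dist_thresh=0.5):
    """Greedy non-maximal suppression on pose candidates.

    poses: (N, K, 2) keypoint coordinates; scores: (N,) confidences.
    Two poses count as duplicates if their mean per-keypoint distance
    is below dist_thresh; only the higher-scoring one is kept.
    """
    order = np.argsort(scores)[::-1]          # highest score first
    keep = []
    for i in order:
        dup = any(np.linalg.norm(poses[i] - poses[j], axis=1).mean()
                  < dist_thresh for j in keep)
        if not dup:
            keep.append(i)
    return keep

# Two nearly identical candidates plus one distinct pose
poses = np.array([[[0.0, 0.0], [1.0, 1.0]],
                  [[0.05, 0.0], [1.0, 1.05]],
                  [[5.0, 5.0], [6.0, 6.0]]])
scores = np.array([0.9, 0.8, 0.7])
print(pose_nms(poses, scores))   # the duplicate (index 1) is suppressed
```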
Pulsed laser ablation in liquid is considered a green method for the synthesis of nanostructures because no byproducts are formed after the ablation. In this paper, a fiber laser with a wavelength of 1.064 µm, pulse energy of 1 mJ, pulse duration of 120 ns, and repetition rate of 20 kHz was used to produce carbon nanostructures, including carbon nanospheres and carbon nanorods, by ablation of asphalt in ethanol at ablation speeds of 100, 75, 50, and 10 mm/s. The morphology, composition, and optical properties of the synthesized samples were studied experimentally using FESEM, HRTEM, EDS, and UV-Vis spectrophotometry. Results showed that the band gap energy decreased with decreasing ablation speed (increasing ablation time), the mi
In this research we estimated the survival function for data from the Iraq Household Socio-Economic Survey (IHSES II 2012), which suffers from disturbances and confusion, for five-year age groups following the generalized gamma (GG) distribution. Two methods were used for estimation and fitting: the principle of maximum entropy (POME), and a bootstrap method with a nonparametric kernel smoothing function, to overcome the mathematical problems posed by the integrals in this distribution, in particular the incomplete gamma function, along with the traditional method of maximum likelihood (ML). The comparison was made on the basis of the method of the Cen
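A hedged sketch of the maximum-likelihood side of this comparison, using SciPy's generalized gamma distribution on simulated lifetimes (the survey data and the POME/bootstrap estimators are not reproduced here; all parameter values are illustrative):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Simulated lifetimes from a generalized gamma distribution
sample = stats.gengamma.rvs(a=2.0, c=1.5, scale=10.0, size=500,
                            random_state=rng)

# Maximum likelihood fit; location fixed at 0, shapes/scale given
# starting guesses near the truth to help the optimizer
a_hat, c_hat, loc_hat, scale_hat = stats.gengamma.fit(
    sample, 2.0, 1.5, floc=0, scale=10.0)

# Survival function estimate on a grid of ages
ages = np.array([5.0, 15.0, 30.0])
surv = stats.gengamma.sf(ages, a_hat, c_hat, loc=loc_hat, scale=scale_hat)
print(surv)
```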
In this paper, the effect of changes in bank deposits on the money supply in Iraq was studied by estimating an error correction model (ECM) for monthly time series data for the period 2010-2015. The Phillips-Perron test was used to test stationarity, and the Engle-Granger test was used to test for cointegration. Cubic spline and local polynomial estimators were used to estimate the regression function. The results show that the local polynomial estimator performed better than the cubic spline at the first level of cointegration.
The Bayesian approach promises to extend regression analysis via classification trees, taking advantage of prior information on the one hand, and of ensembles of trees over all the explanatory variables together at every stage on the other, in addition to obtaining posterior information at each node during construction of the classification tree. Although Bayesian estimates are generally accurate, the logistic model remains a good competitor for binary responses through its flexibility and mathematical representation. Three methods are therefore used for data processing, namely: the logistic model, the classification and regression tree model, and the Bayesian regression tree mode
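A minimal sketch of two of the three competitors on synthetic binary-response data (the Bayesian regression tree, which typically needs a specialized package, is omitted here):

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic binary-response data; sizes and depth are illustrative
X, y = make_classification(n_samples=500, n_features=6, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

logit = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, y_tr)

# Held-out accuracy of each competitor
print(logit.score(X_te, y_te), tree.score(X_te, y_te))
```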
There is great operational risk in the day-to-day management of water treatment plants, so water companies are looking for ways to predict how the treatment processes may be improved, given the increased pressure to remain competitive. This study focused on mathematical modeling of water treatment processes, with the primary motivation of providing tools that can be used to predict treatment performance and enable better control of uncertainty and risk. This research included choosing the most important variables affecting quality standards using a correlation test. According to this test, it was found that the important parameters of raw water are: Total Hardn
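The correlation-based variable screen can be sketched as follows; the variable names, data, and threshold are illustrative assumptions, not the study's actual raw-water parameters or cut-off:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(4)
n = 200
# Hypothetical raw-water measurements (names are placeholders)
raw = pd.DataFrame({
    "turbidity": rng.gamma(2.0, 5.0, n),
    "total_hardness": rng.normal(300, 40, n),
    "ph": rng.normal(7.8, 0.2, n),
})
# In this toy model, treated-water quality depends on two of the three
treated = (0.5 * raw["turbidity"] + 0.1 * raw["total_hardness"]
           + rng.normal(0, 2, n))

# Pearson correlation screen: keep variables with |r| above a threshold
corr = raw.corrwith(treated)
selected = corr[corr.abs() > 0.3].index.tolist()
print(corr.round(2), selected)
```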
Linear programming currently occupies a prominent position in various fields and has wide applications, as it provides a means of studying the behavior of a large number of systems. It is also the simplest and easiest type of model that can be built to address industrial, commercial, military and other problems, and through it an optimal quantitative value can be obtained. In this research we dealt with the post-optimality solution, also known as sensitivity analysis, using the principle of shadow prices. The scientific treatment of a problem is not complete once the optimal solution is reached, since any change in the values of the model constants, known as the inputs of the model, will chan
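The shadow prices referred to above are the dual values of the binding constraints: the rate at which the optimal objective changes per unit increase in a constraint's right-hand side. A small textbook example (not taken from this research) in SciPy:

```python
import numpy as np
from scipy.optimize import linprog

# Classic example: maximize 3x + 5y subject to
#   x <= 4,  2y <= 12,  3x + 2y <= 18,  x, y >= 0
c = [-3, -5]                       # linprog minimizes, so negate
A = [[1, 0], [0, 2], [3, 2]]
b = [4, 12, 18]
res = linprog(c, A_ub=A, b_ub=b, bounds=[(0, None)] * 2, method="highs")

# Shadow prices of the maximization problem: negate the duals
# reported for the minimization form
shadow = -res.ineqlin.marginals
print(res.x, -res.fun, shadow)   # x=[2, 6], z=36, shadow=[0, 1.5, 1]
```

The first constraint is slack at the optimum, so its shadow price is zero; relaxing either binding constraint by one unit raises the optimal profit by 1.5 and 1 respectively.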
The purpose of this paper is to model and forecast white oil during the period 2012-2019 using GARCH-class volatility models. After showing that the squared returns of white oil have significant long memory in the volatility, return series based on fractional GARCH models are estimated, and the mean and volatility are forecast by quasi maximum likelihood (QML) as the traditional method, while the competing approach uses machine learning via Support Vector Regression (SVR). Results showed that the best appropriate model among many others for forecasting volatility was selected depending on the lowest values of the Akaike information criterion and Schwarz information criterion, with the requirement that the parameters be significant. In addition, the residuals
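The SVR side of the comparison can be sketched on simulated GARCH(1,1)-style returns; the lag count, kernel, and all data below are illustrative assumptions, not the paper's specification:

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(5)
# Simulated GARCH(1,1)-style returns as a stand-in for white-oil returns
n, omega, alpha, beta = 600, 0.05, 0.1, 0.85
r = np.zeros(n)
sig2 = np.full(n, omega / (1 - alpha - beta))   # unconditional variance
for t in range(1, n):
    sig2[t] = omega + alpha * r[t - 1] ** 2 + beta * sig2[t - 1]
    r[t] = np.sqrt(sig2[t]) * rng.standard_normal()

# SVR on lagged squared returns to forecast the next squared return
# (a common volatility proxy)
lags = 5
X = np.column_stack([r[i:n - lags + i] ** 2 for i in range(lags)])
y = r[lags:] ** 2
split = 500 - lags                               # train/test boundary
model = SVR(kernel="rbf", C=1.0).fit(X[:split], y[:split])
pred = model.predict(X[split:])
mse = np.mean((y[split:] - pred) ** 2)           # out-of-sample MSE
print(mse)
```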