The purpose of this paper is to model and forecast the volatility of white oil over the period 2012-2019 using GARCH-class volatility models. After showing that the squared returns of white oil exhibit significant long memory in volatility, fractional GARCH models are estimated for the return series and used to forecast the mean and volatility by quasi-maximum likelihood (QML) as the traditional method, while the competing approach applies machine learning through Support Vector Regression (SVR). The most appropriate model for forecasting volatility was selected from among many candidates on the basis of the lowest Akaike and Schwarz information criteria, significant parameters, residuals free of serial correlation and ARCH effects, and the highest log-likelihood. The SVR-FIGARCH models outperformed the FIGARCH models with normal and Student's t distributions; the SVR-FIGARCH model was statistically significant, with the improved accuracy obtained through the SVM technique. Finally, we evaluate the forecasting performance of the various volatility models and choose the best-fitting model to forecast the volatility of each series according to three forecasting accuracy measures: RMSE, MAE, and MAPE.
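As a rough illustration of this workflow (a sketch under assumptions, not the authors' exact pipeline), the following Python snippet fits a FIGARCH(1,1) model by quasi-maximum likelihood with the `arch` package on a synthetic return series and scores out-of-sample one-step-ahead variance forecasts with RMSE, MAE, and MAPE; the simulated data, the sample split, and the use of squared returns as the volatility proxy are assumptions made only for the example.

```python
# Minimal sketch: FIGARCH(1,1) fit by QML on simulated returns, then
# out-of-sample variance forecasts scored with RMSE / MAE / MAPE.
import numpy as np
from arch import arch_model

rng = np.random.default_rng(0)
returns = rng.standard_t(df=5, size=2000)        # toy daily returns (assumption)
split = 1500                                     # in-sample / out-of-sample cut

am = arch_model(returns, mean="Constant", vol="FIGARCH", p=1, q=1, dist="t")
res = am.fit(last_obs=split, disp="off")         # QML estimation on the first part
print(res.summary())

# One-step-ahead variance forecasts over the hold-out, parameters held fixed.
# A forecast indexed at t predicts the variance of day t+1.
fcast = res.forecast(horizon=1, start=split)
pred = fcast.variance.dropna().to_numpy()[:-1, 0]
actual = returns[split + 1:] ** 2                # squared returns as volatility proxy

rmse = np.sqrt(np.mean((actual - pred) ** 2))
mae = np.mean(np.abs(actual - pred))
mape = 100 * np.mean(np.abs((actual - pred) / (actual + 1e-8)))
print(f"RMSE={rmse:.4f}  MAE={mae:.4f}  MAPE={mape:.2f}%")
```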
The Gumbel distribution has been treated with great care by researchers and statisticians. There are traditional methods to estimate the two parameters of the Gumbel distribution, known as maximum likelihood, the method of moments, and, more recently, the resampling method called the Jackknife. However, these methods suffer from some mathematical difficulties when solved analytically. Accordingly, there are other non-traditional methods, such as the nearest-neighbors principle used in computer science, particularly artificial intelligence algorithms, including the genetic algorithm, the artificial neural network algorithm, and others that may be classified as meta-heuristic methods. Moreover, this nearest-neighbors principle has useful statistical features
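For illustration only (not the paper's comparison), the sketch below estimates the two Gumbel parameters on a simulated sample with the two classical methods mentioned above, using scipy for maximum likelihood and the moment relations mean = mu + gamma*beta and variance = (pi^2/6)*beta^2 for the method of moments.

```python
# Minimal sketch: estimate Gumbel location (mu) and scale (beta) from data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
sample = stats.gumbel_r.rvs(loc=10.0, scale=2.5, size=500, random_state=rng)

# Maximum likelihood: solved numerically by scipy
mu_mle, beta_mle = stats.gumbel_r.fit(sample)

# Method of moments: mean = mu + gamma*beta, var = (pi^2 / 6) * beta^2
beta_mom = np.sqrt(6.0) * sample.std(ddof=1) / np.pi
mu_mom = sample.mean() - np.euler_gamma * beta_mom

print(f"MLE: mu={mu_mle:.3f}, beta={beta_mle:.3f}")
print(f"MoM: mu={mu_mom:.3f}, beta={beta_mom:.3f}")
```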
A new approach for baud time (or baud rate) estimation of a random binary signal is presented. This approach utilizes the spectrum of the signal after nonlinear processing, in such a way that the estimation error can be reduced simply by increasing the number of processed samples instead of increasing the sampling rate. The spectrum of the new signal is shown to give an accurate estimate of the baud time when there is no a priori information and no restrictive assumptions. The performance of the estimator for random binary square waves perturbed by white Gaussian noise and ISI is evaluated and compared with that of the conventional zero-crossing detector estimator.
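A minimal sketch of the general idea, not the authors' exact estimator: for a random binary NRZ waveform, a simple nonlinearity that emphasizes symbol transitions produces a spectral line at the baud rate, which an FFT can locate. The sampling rate, noise level, and the coarse search band are assumptions made only for the example.

```python
# Sketch: locate the baud-rate spectral line after a transition-emphasising
# nonlinearity (magnitude of the first difference).
import numpy as np

rng = np.random.default_rng(1)
fs = 1_000_000                       # sampling rate in Hz (assumption)
baud = 9600                          # true baud rate to be estimated
n_sym = 4000
sps = fs / baud                      # samples per symbol (need not be integer)

# Random binary square wave (+/-1) plus white Gaussian noise
t = np.arange(int(n_sym * sps))
symbols = rng.integers(0, 2, n_sym) * 2 - 1
x = symbols[np.minimum((t / sps).astype(int), n_sym - 1)] \
    + 0.2 * rng.standard_normal(t.size)

# Nonlinear processing: transitions occur only at symbol boundaries, so the
# edge signal has a periodic mean component -> spectral line at the baud rate.
edges = np.abs(np.diff(x))
spectrum = np.abs(np.fft.rfft(edges - edges.mean()))
freqs = np.fft.rfftfreq(edges.size, d=1 / fs)

# Coarse search band assumed here (kept below the second harmonic of the line)
band = (freqs > 1000) & (freqs < 15000)
baud_hat = freqs[band][np.argmax(spectrum[band])]
print(f"estimated baud rate ~ {baud_hat:.1f} Hz (true {baud} Hz)")
```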
In this paper, we introduce three robust fuzzy estimators of a location parameter based on Buckley's approach, in the presence of outliers. These estimates were compared using the variance of fuzzy numbers as a criterion, and all of them outperformed Buckley's estimate. Of these, the fuzzy median was best for small and medium sample sizes, while for large sample sizes the fuzzy trimmed mean was best.
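As a purely crisp (non-fuzzy) illustration of why such estimators resist outliers, the sketch below compares the mean, median, and trimmed mean on contaminated data; it does not reproduce the fuzzy computations of the paper.

```python
# Crisp illustration only: the median and trimmed mean barely move under gross
# outliers, while the ordinary mean is pulled toward them.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
clean = rng.normal(loc=10.0, scale=1.0, size=50)
contaminated = np.concatenate([clean, [60.0, 75.0, 90.0]])   # a few gross outliers

print("mean        :", contaminated.mean())
print("median      :", np.median(contaminated))
print("trimmed mean:", stats.trim_mean(contaminated, 0.1))   # 10% trimmed each tail
```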
Three hundred Iraqi people participated in a demographic and attitudes study about red and white meat consumption. The mean age of the participants was 50 ± 11 (SD) years (range 30-72); 51% were female and 49% male, mostly in their forties, and most had lived in Baghdad for ≥ 5 years. The results showed that 80% of individuals prefer red meat, 90% prefer fresh meat over frozen and processed meat, and 60% buy meat from popular markets. Nearly 87% of respondents believe that improving the livestock sector is essential, and 80% confirmed that there are obstacles to developing this sector. 80% of participants thought the reason for the high prices of local fresh meat is the lack of planning
There continues to be a need for an in-situ sensor system to monitor the engine oil of internal combustion engines. Engine oil needs to be monitored for contaminants and for depletion of additives. While various sensor systems have been designed and evaluated, there is still a need to develop and evaluate new sensing technologies. This study evaluated terahertz time-domain spectroscopy (THz-TDS) for the identification and estimation of glycol contamination of automotive engine oil. Glycol contamination results from a gasket or seal leak that allows coolant to enter an engine and mix with the engine oil. An engine oil intended for use in both diesel and gasoline engines was obtained. Fresh engine oil samples were contaminated with four
The research seeks to identify the image of foreign oil companies operating in Iraq among the public of Basra. It aims to clarify the mental image of foreign oil companies held by the Iraqi public, and to identify the extent to which the Iraqi public benefits from the social responsibility programs offered by foreign oil companies and their contribution to improving the standard of living and services for the residents of nearby areas and society as a whole. The research is classified as descriptive research; the researcher used the survey method with the Iraqi public in Basra governorate, covering the areas in which these companies operate, and used a scale instrument, distributing 600 questionnaires
This paper presents a fast and robust approach to English text encryption and decryption based on the Pascal matrix. The technique encrypts Arabic or English text, or both, and shows the result of applying this method to plain text (the original message): the intelligible plain text is transformed into unintelligible ciphertext in order to secure information from unauthorized access and theft. An encryption scheme usually uses a pseudo-random encryption key generated by an algorithm; here, all of this is done using the Pascal matrix. Encryption and decryption are implemented in MATLAB as the programming language, with Notepad++ used to write the input text.
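The paper implements the method in MATLAB; purely as an illustration of the underlying idea, here is a hedged sketch in Python that mixes blocks of character codes with a lower-triangular Pascal matrix and recovers them with its exact integer inverse. The block size and the plain integer encoding are assumptions, not the authors' exact scheme.

```python
# Illustration only (not the authors' MATLAB implementation): encrypt a short
# ASCII message by multiplying blocks of character codes with a lower-triangular
# Pascal matrix, and decrypt with its exact integer inverse.
import numpy as np
from scipy.linalg import pascal, invpascal

N = 4                                            # block size (assumption)
P = pascal(N, kind="lower").astype(np.int64)     # binomial-coefficient matrix
P_inv = invpascal(N, kind="lower").astype(np.int64)

def encrypt(text: str) -> np.ndarray:
    codes = np.frombuffer(text.encode("ascii"), dtype=np.uint8).astype(np.int64)
    pad = (-len(codes)) % N                      # zero-pad to a multiple of N
    blocks = np.concatenate([codes, np.zeros(pad, dtype=np.int64)]).reshape(-1, N)
    return blocks @ P.T                          # each block mixed by the Pascal matrix

def decrypt(cipher: np.ndarray) -> str:
    blocks = cipher @ P_inv.T                    # exact integer inverse undoes the mixing
    codes = blocks.reshape(-1).astype(np.uint8)
    return codes.tobytes().rstrip(b"\x00").decode("ascii")

cipher = encrypt("ATTACK AT DAWN")
print(cipher)
print(decrypt(cipher))
```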
One of the direct causes that led to the global financial crisis of 2008 is the decrease, or collapse, in the liquidity of large financial institutions, which was reflected in the investments of a considerable number of institutions and individuals.
This study aims, through its three sections, to explain the disclosure level of the financial institutions affected by the financial crisis with respect to liquidity information presented in the statement of cash flows, in terms of timeliness and completeness.
The study reached an important result: the company in the research sample disclosed, in a timely and complete manner, all accounting information related to liquidity as well as information related to the results of operations and the financial position. The more
Cloud computing provides a huge amount of space for data storage, but with the increase in the number of users and the size of their data, the cloud storage environment faces serious problems such as saving storage space, managing this large volume of data, and preserving the security and privacy of the data. One of the important methods for saving space in cloud storage is data deduplication, a compression technique that allows only one copy of the data to be saved and eliminates the extra copies. To offer security and privacy for sensitive data while supporting deduplication, this work identifies attacks that exploit hybrid cloud deduplication, allowing an attacker to gain access to the files of other users based on very small hash signatures of
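As a generic illustration of content-based deduplication (not the specific hybrid-cloud scheme analysed here), the sketch below keys stored objects by a SHA-256 digest of their content so that identical uploads are stored only once; the class and its interface are hypothetical.

```python
# Hypothetical sketch of content-based deduplication: identical uploads share
# one stored copy.  Note that schemes which grant access to anyone who merely
# presents a digest are exactly what hash-signature attacks exploit.
import hashlib

class DedupStore:
    def __init__(self):
        self._blobs = {}     # digest -> file content (single stored copy)
        self._owners = {}    # digest -> set of user ids referencing that copy

    def upload(self, user: str, data: bytes) -> str:
        digest = hashlib.sha256(data).hexdigest()
        if digest not in self._blobs:        # first copy: actually store the bytes
            self._blobs[digest] = data
        self._owners.setdefault(digest, set()).add(user)
        return digest                        # reference returned to the client

    def download(self, user: str, digest: str) -> bytes:
        if user not in self._owners.get(digest, set()):
            raise PermissionError("user does not own this object")
        return self._blobs[digest]

store = DedupStore()
ref1 = store.upload("alice", b"quarterly report v1")
ref2 = store.upload("bob", b"quarterly report v1")   # duplicate: no second copy stored
print(ref1 == ref2, len(store._blobs))                # True 1
```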