Most models commonly used for modeling and forecasting periodic climatic time series cannot handle the periodic variability that characterizes such series. In this paper, the Fourier Autoregressive (FAR) model, which is able to analyze periodic variability, is implemented. FAR(1) and FAR(2) models were tentatively identified based on the Periodic Autocorrelation Function (PeACF) and the Periodic Partial Autocorrelation Function (PePACF). The coefficients of the tentative models were estimated using a discrete Fourier transform estimation method. The FAR(1) model was selected as optimal based on the smallest values of the Periodic Akaike Information Criterion (PAIC) and the Periodic Bayesian Information Criterion (PBIC). The residuals of the fitted model were diagnosed as white noise. The in-sample forecast closely reflected the original rainfall series, while the out-of-sample forecast exhibited a continuous periodic forecast from January 2019 to December 2020 with relatively small values of the Periodic Root Mean Square Error (PRMSE), Periodic Mean Absolute Error (PMAE) and Periodic Mean Absolute Percentage Error (PMAPE). Comparison of the FAR(1) forecast with those of AR(3), ARMA(2,1), ARIMA(2,1,1) and SARIMA(1,1,1)(1,1,1)12 models indicated that FAR(1) outperformed the other models, as it exhibited a continuous periodic forecast. The continuous monthly periodic rainfall forecast indicates that there will be rapid climate change in Nigeria in the coming years, and the Nigerian government needs to put plans in place to curtail its effects.
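As a hypothetical illustration of the periodic error measures named above, PRMSE, PMAE and PMAPE can be read as the usual RMSE, MAE and MAPE evaluated separately within each monthly season; the function below is a sketch under that assumption and is not taken from the paper.

```python
import numpy as np

def periodic_errors(actual, forecast, period=12):
    """Per-season RMSE, MAE and MAPE for a monthly series.

    Hypothetical reading of PRMSE/PMAE/PMAPE: the usual error measures
    evaluated separately within each of the `period` seasons.
    """
    actual = np.asarray(actual, dtype=float)
    forecast = np.asarray(forecast, dtype=float)
    prmse, pmae, pmape = [], [], []
    for s in range(period):
        err = actual[s::period] - forecast[s::period]  # errors for season s
        prmse.append(np.sqrt(np.mean(err ** 2)))
        pmae.append(np.mean(np.abs(err)))
        pmape.append(100.0 * np.mean(np.abs(err / actual[s::period])))
    return np.array(prmse), np.array(pmae), np.array(pmape)
```

Reporting one value per season, rather than one pooled value, is what makes the comparison sensitive to months whose variability the model captures poorly.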
One wide-ranging category of open-source data is that provided by geospatial information web sites. Despite the advantages of such open-source data, including ease of access and freedom from cost, its quality is a potential issue. This article tests the horizontal positional accuracy, and the possible integration, of four web-derived geospatial datasets: OpenStreetMap (OSM), Google Maps, Google Earth and Wikimapia. The evaluation was achieved by comparing the tested data with reference field-survey data for fifty road intersections in Baghdad, Iraq. The results indicate that free geospatial data can be used to enhance authoritative maps, especially small-scale maps.
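A minimal sketch of how such a horizontal-accuracy check can be computed, assuming projected (easting, northing) coordinates in metres; the NSSDA-style 95 % factor of 1.7308 assumes roughly equal error in the two axes, and the function is illustrative rather than the article's actual procedure.

```python
import math

def horizontal_accuracy(web_pts, ref_pts):
    """Radial RMSE and NSSDA-style 95 % horizontal accuracy for paired points.

    Assumes projected (easting, northing) coordinates in metres; the point
    data and function name are hypothetical, not from the study.
    """
    sq = [(e1 - e2) ** 2 + (n1 - n2) ** 2
          for (e1, n1), (e2, n2) in zip(web_pts, ref_pts)]
    rmse_r = math.sqrt(sum(sq) / len(sq))   # radial RMSE over all checkpoints
    return rmse_r, 1.7308 * rmse_r          # 95 % accuracy, RMSE_x ~ RMSE_y
```

For example, a single checkpoint displaced 3 m east and 4 m north yields a radial RMSE of 5 m.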
Tourism plays an important role in Malaysia’s economic development, as it can boost business opportunities in the surrounding economy. Applying data mining to tourism data to predict areas of business opportunity is therefore a good choice. Data mining is the process that takes data as input and produces knowledge as output. The volume of travel in Asian countries has increased over the past few years, and many entrepreneurs have started their own businesses, but problems such as investing in the wrong business fields and poor service quality have affected their business income. The objective of this paper is to use data mining technology to meet the business needs and customer needs of tourism enterprises and find the most effective
With the rapid expansion of the Internet and of worldwide information and communication technology, the growth of significant data volumes boosts the requirement for secure, robust, and trusted techniques based on effective algorithms. Many algorithms and techniques are available for data security. This paper presents a cryptosystem that combines several substitution cipher algorithms with the circular-queue data structure. The two substitution techniques, the Homophonic Substitution Cipher and the Polyalphabetic Substitution Cipher, are merged in a single circular queue with four different keys for each of them, producing eight different outputs for
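As a rough sketch of one half of such a scheme, the snippet below cycles Vigenère-style keys through a circular queue; the homophonic table and the paper's actual key schedule are not reproduced, and the keys and function name are hypothetical.

```python
from collections import deque

def polyalphabetic_encrypt(plaintext, keys):
    """Encrypt with Vigenere-style shifts, cycling keys via a circular queue.

    Illustrative sketch of the circular-queue idea only; not the paper's
    combined homophonic/polyalphabetic construction.
    """
    queue = deque(keys)  # circular queue of substitution keys
    out, i = [], 0
    for ch in plaintext.upper():
        if "A" <= ch <= "Z":
            key = queue[0]                             # key at the front
            shift = ord(key[i % len(key)]) - ord("A")  # key letter -> shift
            out.append(chr((ord(ch) - ord("A") + shift) % 26 + ord("A")))
            queue.rotate(-1)  # advance the circular queue to the next key
            i += 1
        else:
            out.append(ch)    # pass non-letters through unchanged
    return "".join(out)
```

Because each letter can be encrypted under a different key from the queue, identical plaintext letters map to different ciphertext letters, which is the property the multi-key design exploits.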
The research aims to determine the strength of the relationship between time management, work pressure and administrative leadership. A sample of 47 administrative leaders at the Higher Institute of Security and Administrative Development in the Ministry of Interior was taken, and a questionnaire was used as the key tool for collecting data and information. The responses of the surveyed sample were analyzed using the statistical program SPSS, computing the arithmetic mean, the t-test and the correlation coefficient. The research found, as its most important result, a significant positive relationship between both time management and work pressure, on the one hand, and administrative leadership on the other; the leadership of th
The Hartha Formation is an overburden horizon in the X-oilfield that generates a great deal of Non-Productive Time (NPT) associated with drilling-mud losses. This study was conducted to investigate the loss events in this formation and to provide geological interpretations based on datasets from nine wells in the field of interest. The interpretation drew on several analyses, including wireline logs, cuttings descriptions, image logs, and analog data. Seismic and coherency data were also used to formulate the geological interpretations and calibrate them against the loss events of the Hartha Fm.
The results revealed that the upper part of the Hartha Fm. was identified as an interval capable of creating potentia
In many oil-recovery systems, relative permeabilities (kr) are essential flow factors that affect fluid dispersion and output from petroleum resources. Traditionally, obtaining these crucial reservoir properties requires taking rock samples from the reservoir and performing suitable laboratory studies. Although kr is a function of fluid saturation, it is now well established that pore shape and distribution, absolute permeability, wettability, interfacial tension (IFT), and saturation history all influence kr values. These rock/fluid characteristics vary greatly from one reservoir region to the next, and it would be impossible to make kr measurements in all of them. The unsteady-state approach was used to calculate the relat
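For illustration only, a Corey-type parameterization is one common textbook way to express kr as a function of water saturation; the endpoints and exponents below are hypothetical placeholders, not the study's unsteady-state results.

```python
def corey_kr(sw, swc=0.2, sor=0.2, krw_max=0.4, kro_max=0.9, nw=2.0, no=2.0):
    """Corey-type water/oil relative permeability at water saturation sw.

    Generic sketch with hypothetical endpoint saturations (swc, sor),
    endpoint permeabilities and Corey exponents.
    """
    swn = (sw - swc) / (1.0 - swc - sor)  # normalised (movable) saturation
    swn = min(max(swn, 0.0), 1.0)         # clamp to the movable range
    krw = krw_max * swn ** nw             # water curve rises with swn
    kro = kro_max * (1.0 - swn) ** no     # oil curve falls with swn
    return krw, kro
```

At connate water saturation the water curve is zero and the oil curve is at its endpoint, and the roles reverse at residual oil saturation, which is the qualitative behaviour any fitted kr model must reproduce.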
Abstract
The research compared two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the Downhill Simplex algorithm. Two data cases were considered: the first assumed the original (uncontaminated) data, while the second assumed data contamination. Simulation experiments were conducted for different sample sizes, initial parameter values, and levels of contamination. The Downhill Simplex algorithm was found to be the best method for estimating the parameters, the probability function and the reliability function of the compound distribution in the cases of both natural and contaminated data.
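As a simplified stand-in for the four-parameter compound distribution, the sketch below applies the same derivative-free Downhill Simplex (Nelder-Mead) routine to a plain two-parameter Weibull likelihood; the helper names are assumptions, not the paper's code.

```python
import numpy as np
from scipy.optimize import minimize

def weibull_nll(params, x):
    """Negative log-likelihood of a two-parameter Weibull sample."""
    k, lam = params
    if k <= 0 or lam <= 0:
        return np.inf  # keep the simplex inside the valid parameter region
    z = x / lam
    return -np.sum(np.log(k / lam) + (k - 1.0) * np.log(z) - z ** k)

def fit_weibull_simplex(x, start=(1.0, 1.0)):
    """Estimate (shape, scale) by Downhill Simplex (Nelder-Mead).

    Simplified stand-in for the paper's four-parameter compound
    distribution: same derivative-free routine, plain Weibull target.
    """
    res = minimize(weibull_nll, np.asarray(start, float),
                   args=(np.asarray(x, float),), method="Nelder-Mead")
    return res.x
```

Because Nelder-Mead needs no gradients, it tolerates likelihood surfaces that are awkward to differentiate, which is one reason a simplex search can remain usable where analytic maximum-likelihood equations become unwieldy or unstable under contamination.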