Meteorological data mining and hybrid data-intelligence models for reference evaporation simulation: A case study in Iraq
Among metaheuristic algorithms, population-based algorithms are explorative search algorithms, superior to local search algorithms in exploring the search space for globally optimal solutions. Their primary downside, however, is low exploitative capability, which prevents the search from refining the neighborhood of promising solutions. The firefly algorithm (FA) is a population-based algorithm that has been widely used in clustering problems. However, FA suffers from premature convergence when no neighborhood search strategy is employed to improve the quality of clustering solutions in the neighborhood region while exploring the global regions of the search space. On the …
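As a concrete illustration of the exploration/exploitation trade-off discussed above, here is a minimal firefly-algorithm sketch for continuous minimization. The parameter values (`beta0`, `gamma`, the decaying `alpha`) and the sphere test function are illustrative assumptions, not taken from the paper:

```python
import math
import random

def firefly_minimize(f, dim, n=15, iters=100, alpha=0.2, beta0=1.0,
                     gamma=0.01, bounds=(-5.0, 5.0), seed=0):
    """Minimal firefly algorithm sketch for continuous minimization.

    Brightness is the negative objective value; each firefly moves toward
    every brighter one, with attractiveness decaying as exp(-gamma * r^2),
    plus a small random step that shrinks over time (exploitation).
    """
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    fit = [f(x) for x in pop]
    for _ in range(iters):
        for i in range(n):
            for j in range(n):
                if fit[j] < fit[i]:  # firefly j is brighter (lower objective)
                    r2 = sum((a - b) ** 2 for a, b in zip(pop[i], pop[j]))
                    beta = beta0 * math.exp(-gamma * r2)
                    pop[i] = [
                        min(hi, max(lo,
                            a + beta * (b - a) + alpha * (rng.random() - 0.5)))
                        for a, b in zip(pop[i], pop[j])
                    ]
                    fit[i] = f(pop[i])
        alpha *= 0.97  # shrink the random step to aid exploitation
    best = min(range(n), key=lambda i: fit[i])
    return pop[best], fit[best]

# Usage: minimize the 2-D sphere function (optimum 0 at the origin).
sphere = lambda x: sum(v * v for v in x)
x_best, f_best = firefly_minimize(sphere, dim=2)
```

The decaying `alpha` schedule is one common remedy for the low exploitative capability the abstract mentions; neighborhood search strategies would replace or augment that random step.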
A dust storm in Iraq is a climatic phenomenon common in arid and semi-arid regions. The frequency of occurrence has increased drastically in the last decade and continues to rise, and Baghdad, like the rest of Iraq, suffers from this significant increase in dust storms. This research studies the phenomenon of dust storms of all types (suspended dust, rising dust, dust storms) and its relationship with some climate variables (temperature, rainfall, wind speed), and describes the impact of climate change on this phenomenon at the Baghdad station for the period 1981–2012. Time series analysis has been applied to the phenomenon of storms and climate …
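The relationship between storm frequency and a climate variable like the one studied above is commonly quantified with a correlation coefficient. The sketch below uses hypothetical annual series purely for illustration, not the actual Baghdad 1981–2012 data:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical illustrative series (NOT the Baghdad station data):
storm_days = [12, 15, 14, 18, 21, 19, 24, 26]        # dust-storm days/year
rainfall   = [160, 150, 155, 130, 120, 125, 100, 95]  # mm/year
r = pearson_r(storm_days, rainfall)  # strongly negative for these series
```

A strongly negative `r` would be consistent with drier years bringing more dust-storm days, which is the kind of relationship the time-series analysis examines.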
The achievements of the art we know today arose from motives that differ from those art had known before, including the dramatic artistic transformations that came to be called modern art.
Given the enormity of such a topic, its ramifications and its complexity, it was necessary to confine the subject to the origin of the motives behind the transformations of its first pioneers, and then to examine what resulted from that in the vision of composition and drawing exclusively; through this exploration, we came to know the vitality of change from the art of its time.
By examining the prevailing contemporary philosophical concepts, their new standards and their epistemological role in contemporary life, since they include …
Porosity is important because it reflects the presence of oil reserves: the volume of underground reserves, and essential petrophysical parameters such as permeability and saturation, are directly related to connected pores. Porosity also informs the selection of perforation intervals and recommendations for drilling additional infill wells. Two distinct methods are used for the estimation: the first is based on conventional equations that use porosity logs, while the second relies on a statistical approach that builds matrices from rock and fluid composition and solves the equations (matrices) simultaneously. The log records enter as equations, and the matrix is solved …
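The two estimation routes described above can be sketched for the simplest one-mineral case. The matrix and fluid densities used here (sandstone matrix 2.65 g/cc, fresh-water fluid 1.0 g/cc) are standard textbook assumptions, not values from this study:

```python
def density_porosity(rho_bulk, rho_matrix=2.65, rho_fluid=1.0):
    """Conventional density-log porosity:
    phi = (rho_ma - rho_b) / (rho_ma - rho_f)."""
    return (rho_matrix - rho_bulk) / (rho_matrix - rho_fluid)

def two_component_solve(rho_bulk, rho_matrix=2.65, rho_fluid=1.0):
    """Matrix-method sketch for the simplest two-component case: solve
        rho_f * phi + rho_ma * V_ma = rho_bulk   (density-log response)
              phi +          V_ma = 1.0          (material balance)
    for (phi, V_ma) using Cramer's rule on the 2x2 system."""
    a, b = rho_fluid, rho_matrix   # first row of the response matrix
    c, d = 1.0, 1.0                # second row (volumes sum to one)
    det = a * d - b * c
    phi = (rho_bulk * d - b * 1.0) / det
    v_ma = (a * 1.0 - rho_bulk * c) / det
    return phi, v_ma

phi_log = density_porosity(2.35)           # ~0.182 at rho_b = 2.35 g/cc
phi_mat, v_ma = two_component_solve(2.35)  # same answer via the matrix route
```

With more log responses (neutron, sonic, gamma ray) and more mineral components, the same idea extends to larger matrices solved simultaneously, which is what the statistical approach in the study does.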
Multilayer reservoirs are currently modeled as a single-zone system by averaging the reservoir parameters of each zone. However, this type of modeling is rarely accurate, because a single-zone system does not account for the fact that each zone's pressure declines independently. The pressure drop in each zone affects the total output and can result in inter-layer flow and the premature depletion of one of the zones. Understanding reservoir performance requires a precise estimate of each layer's permeability and skin factor. Multilayer Transient Analysis is a well-testing technique designed to determine formation properties in more than one layer, and its effectiveness over the past two decades has been …
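A minimal sketch of why per-layer modeling matters: under a simplifying commingled-production assumption (equal drawdown on every layer, no crossflow, ignoring skin), each layer's share of total flow is proportional to its permeability-thickness product k·h, so a tight layer depletes far more slowly than an averaged single-zone model implies. The layer values below are hypothetical:

```python
def layer_flow_fractions(perms_md, thicknesses_ft):
    """Flow share of each commingled layer, proportional to its k*h product
    (equal drawdown, no crossflow, skin neglected)."""
    kh = [k * h for k, h in zip(perms_md, thicknesses_ft)]
    total = sum(kh)
    return [v / total for v in kh]

# Two equal-thickness layers: 200 mD vs 20 mD.
fractions = layer_flow_fractions([200.0, 20.0], [30.0, 30.0])
# The permeable layer carries ~91% of the flow; averaging the two
# permeabilities into one zone would hide this imbalance entirely.
```

Multilayer Transient Analysis aims to recover the individual k and skin of each layer from test data, rather than only the averaged totals this sketch starts from.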
The Journal of Theoretical and Applied Information Technology is a peer-reviewed electronic journal of research and review papers, with the aim of promoting and publishing original high-quality research dealing with the theoretical and scientific aspects of all disciplines of IT (Information Technology).
Anomaly detection is still a difficult task. To address this problem, we propose to strengthen the DBSCAN algorithm by converting all data to the graph concept frame (CFG). As is well known, DBSCAN groups data points of the same kind into clusters, while points that fall outside cluster behavior are treated as noise or anomalies: DBSCAN can detect abnormal points that lie beyond a certain set threshold (extremes). However, anomalies are not only those cases that are unusual or far from a specific group; there is also a type of data that does not occur repeatedly but is abnormal with respect to the known group. The analysis showed that DBSCAN using the …
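To make the baseline concrete, here is a minimal pure-Python DBSCAN sketch showing how points outside any dense neighborhood end up labelled as noise (label -1), which is the anomaly signal the abstract builds on; the proposed CFG conversion step is not shown:

```python
import math

def dbscan(points, eps=1.0, min_pts=3):
    """Minimal DBSCAN sketch: returns one label per point; -1 marks noise."""
    def neighbors(i):
        return [j for j in range(len(points))
                if math.dist(points[i], points[j]) <= eps]

    labels = [None] * len(points)
    cluster = -1
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        nbrs = neighbors(i)
        if len(nbrs) < min_pts:
            labels[i] = -1  # tentatively noise; may later join a cluster edge
            continue
        cluster += 1
        labels[i] = cluster
        queue = [j for j in nbrs if j != i]
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster      # border point reclaimed from noise
            if labels[j] is not None:
                continue
            labels[j] = cluster
            jn = neighbors(j)
            if len(jn) >= min_pts:       # j is a core point: keep expanding
                queue.extend(jn)
    return labels

# One dense cluster plus a far-away outlier flagged as noise.
pts = [(0, 0), (0.5, 0), (0, 0.5), (0.4, 0.4), (10, 10)]
labels = dbscan(pts, eps=1.0, min_pts=3)
```

The limitation the abstract points at is visible here: only distance-based outliers get the -1 label, while a rarely-occurring but in-range pattern would be absorbed into a cluster.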
Survival analysis is a type of data analysis that describes the period of time until the occurrence of an event of interest, such as death or other events important for determining what will happen to the phenomenon under study. There may be more than one endpoint for the event, in which case they are called competing risks. The purpose of this research is to apply a dynamic approach to the analysis of discrete survival time in order to estimate the effect of covariates over time, and to model the nonlinear relationship between the covariates and the discrete hazard function through the multinomial logistic model and the multivariate Cox model. For the purpose of conducting the estimation process for both the discrete …
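The multinomial-logit formulation of cause-specific discrete hazards mentioned above can be sketched as follows; the linear predictors are hypothetical placeholders and model fitting is omitted entirely:

```python
import math

def discrete_cause_specific_hazards(etas):
    """Multinomial-logit discrete hazards for competing risks in one interval.

    etas[r] is the linear predictor for event cause r; the remaining mass
    1 - sum(hazards) is the probability of surviving the interval.
    """
    denom = 1.0 + sum(math.exp(e) for e in etas)
    return [math.exp(e) / denom for e in etas]

def survival_curve(etas_by_interval):
    """Discrete survival: S(t) = product over intervals of (1 - total hazard)."""
    s, curve = 1.0, []
    for etas in etas_by_interval:
        h = discrete_cause_specific_hazards(etas)
        s *= 1.0 - sum(h)
        curve.append(s)
    return curve

# Two competing causes over three intervals (hypothetical predictors only);
# time-varying etas are how covariate effects change over time in this setup.
curve = survival_curve([(-2.0, -3.0), (-1.5, -2.5), (-1.0, -2.0)])
```

In the actual model the predictors would be functions of the covariates estimated from data; this sketch only shows how the multinomial logit turns them into cause-specific hazards and a discrete survival curve.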