In this paper, we propose using the extreme value distribution as the rate of occurrence of the non-homogeneous Poisson process, in order to improve that rate of occurrence; the resulting model is called the extreme value process. To estimate the parameters of this process, we use the maximum likelihood method, the method of moments, and an intelligent method, the Artificial Bee Colony (ABC) algorithm, in order to reach the estimator that best represents the data. The results of the three methods are compared through a simulation of the model, and it is concluded that the ABC estimator is better than the maximum likelihood and method of moments estimators at estimating the time rate of occurrence of the proposed extreme value process. The research also includes a real application dealing with the operating periods between two successive stops of the raw materials factory of the General Company for Northern Cement / Badush Cement Factories (new) during the period from 1/4/2018 to 31/1/2019, in order to estimate the time rate of factory stops.
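As a minimal sketch of this estimation problem (not the authors' exact formulation: the Gumbel parameterization of the intensity, the simulated event times, and the use of scipy are all assumptions), the parameters of a non-homogeneous Poisson process with an extreme-value-shaped rate of occurrence can be fitted by maximum likelihood as follows:

```python
import numpy as np
from scipy import optimize, stats

# Hypothetical cumulative event times (e.g., successive factory stops)
# observed on the window [0, T].
times = np.array([12.0, 30.0, 41.0, 55.0, 63.0, 70.0, 74.0, 77.0])
T = 80.0

def neg_log_lik(params):
    """Negative NHPP log-likelihood for an intensity shaped like a
    Gumbel (extreme value) density: lambda(t) = a * gumbel_pdf(t; mu, beta)."""
    log_a, mu, log_beta = params            # log-transforms keep a, beta > 0
    a, beta = np.exp(log_a), np.exp(log_beta)
    lam = a * stats.gumbel_r.pdf(times, loc=mu, scale=beta)
    # Expected event count on [0, T] = integral of the intensity.
    m = a * (stats.gumbel_r.cdf(T, loc=mu, scale=beta)
             - stats.gumbel_r.cdf(0.0, loc=mu, scale=beta))
    return m - np.sum(np.log(lam))

res = optimize.minimize(neg_log_lik,
                        x0=[np.log(len(times)), 50.0, np.log(10.0)],
                        method="Nelder-Mead")
a_hat, mu_hat, beta_hat = np.exp(res.x[0]), res.x[1], np.exp(res.x[2])
print(a_hat, mu_hat, beta_hat)
```

The method of moments and the ABC algorithm target the same intensity; ABC would replace the local `optimize.minimize` search with a population-based search over the parameter space.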
It is well known that the rate of penetration (ROP) is a key quantity for drilling engineers, since it is directly related to the final well cost; reducing non-productive time by optimizing the drilling process and its parameters is therefore a target of interest for all oil companies. These drilling parameters include mechanical parameters (RPM, WOB, flow rate, SPP, torque, and hook load) and travel transit time. The big challenge in prediction is the complex interconnection between the drilling parameters, so artificial intelligence techniques have been applied in this study to predict ROP from operational drilling parameters and formation characteristics. In the current study, three AI techniques have been used, which are neural network, fuzzy i…
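As an illustrative sketch of such a predictor (the feature set, data, and network size are assumptions, not the study's model), a neural network can be trained on the mechanical drilling parameters with scikit-learn:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in records: columns play the role of RPM, WOB,
# flow rate, SPP, torque, and hook load.
rng = np.random.default_rng(0)
X = rng.uniform(size=(500, 6))
rop = X @ np.array([40.0, 60.0, 25.0, -10.0, 15.0, 5.0]) + rng.normal(0, 2, 500)

X_tr, X_te, y_tr, y_te = train_test_split(X, rop, random_state=0)
model = make_pipeline(StandardScaler(),                 # scale inputs for SGD
                      MLPRegressor(hidden_layer_sizes=(32, 16),
                                   max_iter=2000, random_state=0))
model.fit(X_tr, y_tr)
print("held-out R^2:", model.score(X_te, y_te))
```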
Improving the "Jackknife Instrumental Variable Estimation" method using a class of immune algorithms, with practical application
In this work, the methods of Moments, Modified Moments, L-Moments, Percentile, Ranked Set Sampling, and Maximum Likelihood were used to estimate the reliability function and the two parameters of the Transmuted Pareto (TP) distribution. Simulation was used to generate the required data for three cases of sample size, each replicated for the real parameter values and for the selected reliability-time values.
Results were compared using the mean square error (MSE); they appear as follows:
The best methods are Modified Moments, Maximum Likelihood, and L-Moments in the first, second, and third cases, respectively.
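For concreteness, a maximum likelihood sketch for the transmuted Pareto follows, assuming the standard transmuted CDF F(x) = (1 + λ)G(x) − λG(x)² with Pareto baseline G(x) = 1 − (θ/x)^α; the known scale θ, the starting values, and the stand-in sample are assumptions.

```python
import numpy as np
from scipy.optimize import minimize

theta = 1.0  # scale, assumed known here; alpha and lam are the two parameters

def G(x, alpha):
    """Baseline Pareto CDF for x >= theta."""
    return 1.0 - (theta / x) ** alpha

def tp_pdf(x, alpha, lam):
    """Transmuted Pareto pdf: f(x) = g(x) * (1 + lam - 2*lam*G(x))."""
    g = alpha * theta**alpha / x ** (alpha + 1)       # baseline Pareto pdf
    return g * (1.0 + lam - 2.0 * lam * G(x, alpha))

def neg_log_lik(params, x):
    alpha, lam = params
    if alpha <= 0 or not -1.0 <= lam <= 1.0:          # parameter constraints
        return np.inf
    return -np.sum(np.log(tp_pdf(x, alpha, lam)))

# Stand-in sample; the study draws replicated samples in a simulation.
x = theta * (1.0 + np.random.default_rng(1).pareto(2.5, size=100))
res = minimize(neg_log_lik, x0=[1.0, 0.0], args=(x,), method="Nelder-Mead")
alpha_hat, lam_hat = res.x

def reliability(t, alpha, lam):
    """R(t) = 1 - F(t) with F(t) = (1 + lam)*G(t) - lam*G(t)**2."""
    Gt = G(t, alpha)
    return 1.0 - ((1.0 + lam) * Gt - lam * Gt**2)

print(alpha_hat, lam_hat, reliability(2.0, alpha_hat, lam_hat))
```

The moment-type estimators compared in the study plug sample statistics into the same F instead of maximizing this likelihood.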
Ad hoc networks are characterized by ease of setup, low cost, and frequent use in the corporate world. They ensure user safety, maintain the confidentiality of the circulated information, and allow communication to continue in areas where the communication infrastructure has been destroyed. The protocols proposed for ad hoc networks often build only one path between the nodes, owing to the constraints of battery depletion and node movement, and this single connection is prone to failure. Multiple alternate paths therefore offer a solution to failing node communications in ad hoc networks. In addition, when looking at the situation where interf…
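As a small illustration of the multipath idea (the topology and the library are assumptions, not the protocol studied here), node-disjoint backup routes between two nodes can be computed like this:

```python
import networkx as nx
from networkx.algorithms.connectivity import node_disjoint_paths

# Hypothetical ad hoc topology: nodes are devices, edges are radio links.
G = nx.Graph()
G.add_edges_from([("A", "B"), ("B", "E"), ("A", "C"), ("C", "E"),
                  ("A", "D"), ("D", "E"), ("B", "C")])

# Node-disjoint paths share no intermediate node, so the failure or
# movement of a single relay breaks at most one of them.
paths = list(node_disjoint_paths(G, "A", "E"))
print(paths)  # e.g. [['A', 'B', 'E'], ['A', 'C', 'E'], ['A', 'D', 'E']]
```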
Variable selection is an essential and necessary task in the statistical modeling field. Several studies have tried to develop and standardize the process of variable selection, but it is difficult to do so. The first question a researcher needs to ask is: what are the most significant variables that should be used to describe a given dataset's response? In this paper, a new method for variable selection using Gibbs sampler techniques has been developed. First, the model is defined and the posterior distributions of all the parameters are derived. The new variable selection method is tested using four simulation datasets, and the new approach is compared with some existing techniques: Ordinary Least Squares (OLS), the Least Absolute Shrinkage and Selection Operator (LASSO), …
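The abstract does not give the model details, but one common Gibbs-sampler formulation of variable selection is stochastic search variable selection, where each coefficient gets a spike-or-slab normal prior and a binary inclusion indicator is resampled every iteration. The sketch below (priors, hyperparameters, fixed noise variance, and data are all assumptions) illustrates the idea:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: only the first two of five predictors matter.
n, p = 200, 5
X = rng.normal(size=(n, p))
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(size=n)

tau0, tau1 = 0.01, 10.0   # spike / slab prior variances
q, sigma2 = 0.5, 1.0      # prior inclusion probability; noise variance (fixed)
gamma = np.ones(p, dtype=int)
draws = []

def norm_pdf(b, var):
    return np.exp(-b * b / (2.0 * var)) / np.sqrt(2.0 * np.pi * var)

for it in range(2000):
    # 1) beta | gamma: ridge-type Gaussian with indicator-dependent prior var.
    D_inv = np.diag(1.0 / np.where(gamma == 1, tau1, tau0))
    A = np.linalg.inv(X.T @ X / sigma2 + D_inv)
    beta = rng.multivariate_normal(A @ X.T @ y / sigma2, A)
    # 2) gamma_j | beta_j: Bernoulli with posterior odds of slab vs spike.
    for j in range(p):
        w1 = q * norm_pdf(beta[j], tau1)
        w0 = (1.0 - q) * norm_pdf(beta[j], tau0)
        gamma[j] = rng.random() < w1 / (w0 + w1)
    if it >= 500:                       # discard burn-in
        draws.append(gamma.copy())

print("posterior inclusion probabilities:", np.mean(draws, axis=0))
```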
The current study, titled (The credibility of the digital image and its reflection on the process of cognition of press photo releases), is a scientific effort designed to detect how press photo releases are perceived and the extent to which they are affected by the credibility of the digital image, by identifying the relationship between the digital photo and its credibility on the one hand, and between the process of cognition and the press photo on the other. Accordingly, the researcher collected material serving the research topic in three chapters. The first combines the methodological framework of the research: the research problem and its significance, the objective to be achieved, and the definition of the most im…
In order to specify the features of the higher education process and its quantitative and qualitative development in Iraq, one should look back at its historical course and the importance of taking interest in it.
Accordingly, there will be a chance to examine the demand of Iraqi society in light of the political, social, and cultural changes, especially during the national governance period (1932–1958).
Since no previous study has tackled this topic, this paper sets out to depict the most important quantitative and qualitative developments of this kind of education during the period 1932–1958.
After historical…
Support Vector Machines (SVMs) are supervised learning models used to examine data sets in order to classify or predict dependent variables. An SVM is typically used for classification by determining the best hyperplane between two classes. However, working with huge datasets can lead to a number of problems, including time-consuming and inefficient solutions. This research updates the SVM by employing a stochastic gradient descent method. The new approach, the extended stochastic gradient descent SVM (ESGD-SVM), was tested on two simulation datasets. The proposed method was compared with other classification approaches such as logistic regression, the naive model, K-Nearest Neighbors, and Random Forest. The results show that the ESGD-SVM has a…
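For intuition (this uses scikit-learn's generic SGD-trained linear SVM, not the authors' ESGD-SVM), a hinge-loss classifier fitted by stochastic gradient descent looks like this:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Simulated two-class data as a stand-in for the paper's simulation datasets.
X, y = make_classification(n_samples=5000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# loss="hinge" makes SGDClassifier a linear SVM whose weights are updated
# one sample at a time, which is what lets it scale to huge datasets.
clf = make_pipeline(StandardScaler(),
                    SGDClassifier(loss="hinge", alpha=1e-4, random_state=0))
clf.fit(X_tr, y_tr)
print("accuracy:", clf.score(X_te, y_te))
```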
Conditional logistic regression is often used to study the relationship between event outcomes and specific prognostic factors, bringing logistic regression and its predictive capabilities into environmental studies. This research seeks to demonstrate a novel approach to implementing conditional logistic regression in environmental research, through inference methods predicated on longitudinal data. Statistical analysis of longitudinal data requires methods that properly account for the interdependence of the response measurements within subjects; if this correlation is ignored, inferences such as statistical tests and confidence intervals can be largely invalid.
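As a hedged sketch (the data, variable names, and effect sizes are hypothetical), a conditional logistic regression that respects within-subject dependence can be fitted with statsmodels, which conditions the subject-specific intercepts out of the likelihood:

```python
import numpy as np
from statsmodels.discrete.conditional_models import ConditionalLogit

rng = np.random.default_rng(0)

# Hypothetical longitudinal data: 100 subjects, 5 repeated measurements each.
n_subj, n_rep = 100, 5
groups = np.repeat(np.arange(n_subj), n_rep)
x = rng.normal(size=n_subj * n_rep)               # a prognostic factor
subj = np.repeat(rng.normal(size=n_subj), n_rep)  # within-subject correlation
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-(1.2 * x + subj))))

# ConditionalLogit eliminates each subject's intercept by conditioning
# on that subject's number of events, so the within-subject dependence
# does not invalidate the tests and confidence intervals.
model = ConditionalLogit(y, x[:, None], groups=groups)
print(model.fit().summary())
```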