Improving" Jackknife Instrumental Variable Estimation method" using A class of immun algorithm with practical application
Linear programming currently occupies a prominent position in many fields and has wide application, since it provides a means of studying the behavior of a large number of systems. It is also the simplest kind of model that can be built to address industrial, commercial, military, and other problems, and through which an optimal quantitative value can be obtained. This research deals with the post-optimality solution, also known as sensitivity analysis, using the principle of shadow prices. Reaching the optimal solution does not by itself complete the scientific treatment of a problem: any change in the values of the model's constants, that is, in the model's inputs, will change…
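As a hedged illustration only: the toy script below (not the paper's model; the products and coefficients are invented) solves a small linear program with SciPy's HiGHS backend and reads the duals of the inequality constraints, whose negatives are the shadow prices used in sensitivity analysis.

```python
# A minimal sketch of shadow prices, assuming a toy two-product LP
# (the data here are invented for illustration, not the paper's model).
import numpy as np
from scipy.optimize import linprog

# maximize 3x + 5y  ->  minimize -3x - 5y
c = np.array([-3.0, -5.0])
A_ub = np.array([[1.0, 0.0],   # machine-hours constraint
                 [0.0, 2.0],   # labor-hours constraint
                 [3.0, 2.0]])  # raw-material constraint
b_ub = np.array([4.0, 12.0, 18.0])

res = linprog(c, A_ub=A_ub, b_ub=b_ub, method="highs")

# HiGHS reports the duals of the inequality constraints of the
# minimization; their negatives are the maximization shadow prices.
shadow_prices = -res.ineqlin.marginals
print("optimal x, y:", res.x)            # -> [2. 6.]
print("shadow prices:", shadow_prices)   # -> [0.  1.5 1. ]
```

A binding constraint receives a nonzero shadow price: relaxing its right-hand side by one unit improves the optimum by that amount, which is exactly the post-optimality question the abstract raises.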
To obtain good estimates with more accurate results, an appropriate estimation method must be chosen. Most of the equations in the classical methods are non-linear, and finding analytical solutions to such equations is very difficult; some estimators are inefficient because of problems in solving these equations. In this paper, we estimate the survival function of censored data by using one of the most important artificial intelligence algorithms, the genetic algorithm, to obtain optimal estimates of the parameters of the two-parameter Weibull distribution, which in turn leads to optimal estimates of the survival function. The genetic algorithm is employed within the method of moments, the least squares method, and the weighted least squares method.
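The abstract does not give its implementation, so the sketch below substitutes SciPy's differential evolution (an evolutionary optimizer in the same family as a genetic algorithm) to maximize a two-parameter Weibull likelihood on synthetic, uncensored data; the paper's censoring handling and moment/least-squares variants are omitted.

```python
# Sketch: evolutionary estimation of a two-parameter Weibull model.
# differential_evolution stands in for the authors' genetic algorithm;
# the data below are synthetic, not the paper's censored dataset.
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(0)
t = rng.weibull(1.5, size=200) * 2.0   # true shape = 1.5, scale = 2.0

def neg_log_likelihood(params):
    k, lam = params                    # shape k, scale lam
    return -np.sum(np.log(k / lam)
                   + (k - 1) * np.log(t / lam)
                   - (t / lam) ** k)

result = differential_evolution(neg_log_likelihood,
                                bounds=[(0.1, 10.0), (0.1, 10.0)],
                                seed=1)
k_hat, lam_hat = result.x

def survival(x):
    """Estimated Weibull survival function S(x) = exp(-(x/lam)^k)."""
    return np.exp(-(x / lam_hat) ** k_hat)

print(k_hat, lam_hat, survival(2.0))
```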
Bayesian models are widely used in recent research across many scientific fields. This research presents a new Bayesian model for estimating parameters and forecasting using the Gibbs sampler algorithm. Posterior distributions are generated using the inverse gamma distribution and the multivariate normal distribution as prior distributions. The new method was used to investigate and summarize the posterior distribution of the Bayesian statistics, and the theory and derivation of the posterior distribution are explained in detail in this paper. The proposed approach is applied to three simulated datasets with sample sizes of 100, 300, and 500. Also, the procedure was extended to a real dataset, the rock intensity dataset. The actual dataset is collected…
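A minimal sketch of this kind of Gibbs sampler, assuming a conjugate Bayesian linear regression with a multivariate normal prior on the coefficients and an inverse gamma prior on the noise variance; the hyperparameters and data are illustrative, not the paper's.

```python
# Gibbs sampler sketch: normal prior on beta, inverse-gamma on sigma^2.
import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 3
X = rng.normal(size=(n, p))
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + rng.normal(scale=0.7, size=n)

a0, b0, tau2 = 2.0, 1.0, 10.0        # assumed hyperparameters
beta, sigma2 = np.zeros(p), 1.0
draws = []

for it in range(3000):
    # beta | sigma2, y  ~  N(m, V)
    V = np.linalg.inv(X.T @ X / sigma2 + np.eye(p) / tau2)
    m = V @ (X.T @ y) / sigma2
    beta = rng.multivariate_normal(m, V)
    # sigma2 | beta, y  ~  InvGamma(a0 + n/2, b0 + RSS/2),
    # sampled as the reciprocal of a gamma draw.
    rss = np.sum((y - X @ beta) ** 2)
    sigma2 = 1.0 / rng.gamma(a0 + n / 2, 1.0 / (b0 + rss / 2))
    if it >= 1000:                    # discard burn-in
        draws.append(np.append(beta, sigma2))

print(np.mean(draws, axis=0))         # posterior means of (beta, sigma2)
```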
Estimating the initial oil in place is a crucial task during the exploration, appraisal, and development of a reservoir. In the current work, two conventional methods were used to determine the initial oil in place: a volumetric method and a reservoir simulation method. Each method requires its own type of data: the volumetric method depends on geological, core, well log, and petrophysical data, while the reservoir simulation method additionally needs capillary pressure versus water saturation, fluid production, and static pressure data for all active wells in the Mishrif reservoir. The petrophysical properties of the studied reservoir are calculated using a neural network technique.
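For context, the volumetric method rests on the standard tank equation (in oilfield units):

```latex
N = \frac{7758 \, A \, h \, \phi \, (1 - S_w)}{B_{oi}}
```

where N is the initial oil in place (STB), A the drainage area (acres), h the net pay thickness (ft), φ the porosity and S_w the water saturation (both fractions), B_oi the initial oil formation volume factor (rb/STB), and 7758 the number of barrels per acre-foot.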
This work studies the improvement in Direction of Arrival (DOA) estimation when the received signals impinge on Active-Parasitic Antenna (APA) arrays. An APA array consists of several active antennas together with parasitic antennas. The responses to the received signals are measured at the loaded terminals of the active elements, while the terminals of the parasitic elements are shorted. The effect of the received signals on the parasites, i.e., the induced short-circuit current, is mutually coupled into the active elements. Eigendecomposition of the covariance matrix of the APA array measurements generates a third subspace, in addition to the traditional signal and noise subspaces generated by the all-active antenna array…
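The paper's third-subspace construction is not reproduced here; as a baseline, the sketch below runs standard MUSIC (eigendecomposition of the sample covariance into signal and noise subspaces) on a simulated all-active uniform linear array with one source at an invented angle.

```python
# Baseline subspace DOA sketch: standard MUSIC on an all-active uniform
# linear array. Array geometry and source angle are invented.
import numpy as np

M, d, theta_true = 8, 0.5, 20.0           # sensors, spacing (wavelengths), deg
snapshots = 200
rng = np.random.default_rng(0)

def steering(theta_deg):
    k = 2 * np.pi * d * np.sin(np.deg2rad(theta_deg))
    return np.exp(1j * k * np.arange(M))

s = rng.normal(size=snapshots) + 1j * rng.normal(size=snapshots)
noise = 0.1 * (rng.normal(size=(M, snapshots))
               + 1j * rng.normal(size=(M, snapshots)))
X = np.outer(steering(theta_true), s) + noise

R = X @ X.conj().T / snapshots            # sample covariance
eigval, eigvec = np.linalg.eigh(R)        # eigenvalues in ascending order
En = eigvec[:, :-1]                       # noise subspace (one source assumed)

# MUSIC pseudospectrum: peaks where the steering vector is orthogonal
# to the noise subspace.
grid = np.arange(-90.0, 90.0, 0.1)
spectrum = [1.0 / np.linalg.norm(En.conj().T @ steering(th)) ** 2
            for th in grid]
print("estimated DOA:", grid[int(np.argmax(spectrum))])
```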
Excessive skewness, which sometimes occurs in data, is an obstacle to assuming a normal distribution. Recent studies have therefore been active in studying the skew-normal distribution (SND), which fits skewed data and generalizes the normal distribution with an additional skewness parameter (α) that gives it more flexibility; the normal distribution is recovered when α = 0. When estimating the parameters of the SND by Maximum Likelihood (ML), we face non-linear likelihood equations whose direct solutions are inaccurate and unreliable. To solve this problem, two methods can be used: the genetic algorithm (GA) and the iterative reweighting algorithm (IR) based on the M…
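For reference, the skew-normal density in question is the standard Azzalini form:

```latex
f(x;\alpha) = 2\,\phi(x)\,\Phi(\alpha x)
```

where φ and Φ denote the standard normal pdf and cdf; the location-scale version replaces x by (x − ξ)/ω and divides by ω. The Φ(αx) factor is what makes the likelihood equations non-linear in α.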
The 3-parameter Weibull distribution is used as a failure model, since this distribution is appropriate when the failure rate is somewhat high at the start of operation and decreases with increasing time.
On the practical side, a comparison was made between shrinkage and maximum likelihood estimators of the parameters and the reliability function using simulation. We conclude, using the statistical measures MAPE and MSE over different sample sizes, that the shrinkage estimators are better for the parameters, while the maximum likelihood estimator is better for the reliability function.
Note: ns = small sample size; nm = medium sample size.
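For reference, the reliability (survival) function of the 3-parameter Weibull model and the two comparison measures, in their standard forms (L denotes the number of simulation replications, an assumption since the abstract does not state it):

```latex
R(t) = \exp\!\left[-\left(\frac{t-\gamma}{\eta}\right)^{\beta}\right], \quad t \ge \gamma,
\qquad
\mathrm{MSE} = \frac{1}{L}\sum_{i=1}^{L}\bigl(\hat\theta_i-\theta\bigr)^2,
\qquad
\mathrm{MAPE} = \frac{100}{L}\sum_{i=1}^{L}\left|\frac{\hat\theta_i-\theta}{\theta}\right|
```

where β, η, and γ are the shape, scale, and location (guarantee-time) parameters.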
In this study, the thermal buckling behavior of cross-ply and angle-ply composite laminate plates with all edges simply supported, subjected to a uniform temperature field, is investigated using a simple trigonometric shear deformation theory. The theory involves four unknown variables and satisfies the zero-traction boundary condition on the surfaces without using shear correction factors. Hamilton's principle is used to derive the equations of motion according to the simple four-variable plate theory for cross-ply and angle-ply laminates, which are then solved through Navier's double trigonometric series to obtain the critical buckling temperature of the laminated composite plates. The effect of changing some design parameters, such as ortho…
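For simply supported edges, Navier's double trigonometric series assumes each displacement unknown in the separable form (shown here for the transverse deflection only; the paper's four-variable expansion is analogous):

```latex
w(x,y) = \sum_{m=1}^{\infty}\sum_{n=1}^{\infty} W_{mn}\,
         \sin\frac{m\pi x}{a}\,\sin\frac{n\pi y}{b}
```

which satisfies the simply supported boundary conditions on a plate of dimensions a × b term by term, reducing the equations of motion to an algebraic eigenvalue problem for each (m, n) pair.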
The consensus algorithm is the core mechanism of a blockchain and is used to ensure data consistency among blockchain nodes. The PBFT (Practical Byzantine Fault Tolerance) consensus algorithm is widely used in consortium chains because it is resistant to Byzantine faults. However, present PBFT still suffers from random master-node selection and complicated communication. This study proposes an enhanced consensus algorithm, IBFT, based on node trust values and the BLS (Boneh-Lynn-Shacham) aggregate signature. In IBFT, multi-level indicators are used to calculate the trust value of each node, and, as a result of this calculation, some nodes are selected to take part in network consensus. The master node is chosen…
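The abstract does not define the indicators or weights, so the sketch below is hypothetical: it combines invented per-node indicators into a trust value and picks the highest-trust nodes as the consensus committee, with the top node as master; the BLS aggregate-signature stage is out of scope here.

```python
# Hypothetical sketch of trust-weighted consensus-node selection in the
# spirit of IBFT; the indicator names and weights are invented.
from dataclasses import dataclass

@dataclass
class Node:
    node_id: str
    uptime: float          # fraction of rounds the node was responsive
    valid_votes: float     # fraction of its past votes that were honest
    latency_score: float   # normalized 0..1, higher = faster

def trust_value(n: Node) -> float:
    # Multi-level indicators combined with assumed weights.
    return 0.4 * n.uptime + 0.4 * n.valid_votes + 0.2 * n.latency_score

def select_consensus_set(nodes, quorum):
    """Pick the `quorum` highest-trust nodes; the top node acts as master."""
    ranked = sorted(nodes, key=trust_value, reverse=True)
    return ranked[0], ranked[:quorum]

nodes = [Node("n1", 0.99, 0.97, 0.8), Node("n2", 0.90, 0.99, 0.6),
         Node("n3", 0.70, 0.80, 0.9), Node("n4", 0.95, 0.50, 0.7)]
master, committee = select_consensus_set(nodes, quorum=3)
print("master:", master.node_id, [n.node_id for n in committee])
```

Restricting consensus to high-trust nodes shrinks the committee, which directly reduces PBFT's quadratic message complexity, the "complicated communication" the abstract mentions.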
Reservoir characterization plays a crucial role in understanding the distribution of formation properties and fluids within heterogeneous reservoirs. This knowledge is instrumental in constructing an accurate three-dimensional model of the reservoir, facilitating predictions of porosity, permeability, and fluid flow distribution. Among the various methods employed for reservoir characterization, the hydraulic flow unit stands out as a widely adopted approach. By effectively subdividing the reservoir into distinct zones, each characterized by unique petrophysical and geological properties, hydraulic flow units enable comprehensive reservoir analysis. The concept of the flow unit is closely tied to the flow zone indicator, a crucial…
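The flow zone indicator mentioned here is conventionally computed from the reservoir quality index and the normalized porosity (the standard Amaefule et al. definitions, with permeability k in millidarcies and porosity φ as a fraction):

```latex
\mathrm{RQI} = 0.0314\sqrt{\frac{k}{\phi}}, \qquad
\phi_z = \frac{\phi}{1-\phi}, \qquad
\mathrm{FZI} = \frac{\mathrm{RQI}}{\phi_z}
```

Samples with similar FZI values are then grouped into the same hydraulic flow unit.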