Multiple elimination (de-multiple) is one of the seismic processing steps used to remove the effects of multiples and delineate the correct primary reflections. Applying normal moveout (NMO) to flatten the primaries makes it possible to eliminate multiples by transforming the data to the frequency-wavenumber (f-k) domain: the flattened primaries align with the zero-wavenumber axis of the f-k domain, while all other event types (multiples and random noise) are distributed elsewhere. Applying a dip filter that passes the aligned data and rejects the rest separates primaries from multiples once the data are transformed back from the f-k domain to the time-distance domain. Accordingly, a suggested name for this technique is the normal moveout frequency-wavenumber (NMO-FK) domain method.
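A minimal numpy sketch of the f-k dip-filtering step described above; the function name, the boxcar pass mask, and the k_pass threshold are illustrative assumptions rather than details from the paper (a tapered mask would be used in practice to reduce ringing).

```python
import numpy as np

def fk_dip_filter(gather, dx, k_pass=0.002):
    """Pass near-flat (NMO-corrected) events in the f-k domain.

    gather : 2-D array (time samples x traces), assumed NMO-corrected
    dx     : trace spacing
    k_pass : half-width of the wavenumber band kept around k = 0
             (an illustrative tuning value, not one from the paper)
    """
    spec = np.fft.fft2(gather)                  # to the f-k domain
    k = np.fft.fftfreq(gather.shape[1], d=dx)   # wavenumber axis
    mask = (np.abs(k) <= k_pass).astype(float)  # keep flat events near k = 0
    spec *= mask[np.newaxis, :]                 # reject dipping events (multiples)
    return np.real(np.fft.ifft2(spec))          # back to time-distance domain
```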
This paper deals with estimation of the reliability function and one shape parameter of the two-parameter Burr-XII distribution when the other shape parameter is known (taking the values 0.5, 1, 1.5) and the initial value of the estimated parameter is 1, while different sample sizes (n = 10, 20, 30, 50) are used. The results rest on an empirical study in which simulation experiments are applied to compare four methods of estimation and to compute the reliability function. The mean square error results indicate that the jackknife estimator is better than the other three estimators for all sample sizes and parameter values.
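A small simulation sketch in the spirit of the study, assuming the standard Burr-XII parameterization F(x) = 1 - (1 + x^c)^(-k) with c known; the closed-form MLE of k and the usual leave-one-out jackknife bias correction are shown, with replication counts and seeds chosen arbitrarily.

```python
import numpy as np

rng = np.random.default_rng(0)

def rburr12(n, c, k):
    """Draw Burr-XII samples by inverting F(x) = 1 - (1 + x^c)^(-k)."""
    u = rng.uniform(size=n)
    return ((1.0 - u) ** (-1.0 / k) - 1.0) ** (1.0 / c)

def mle_k(x, c):
    """Closed-form MLE of k when the other shape parameter c is known."""
    return len(x) / np.sum(np.log1p(x ** c))

def jackknife_k(x, c):
    """Bias-corrected jackknife estimator built on the leave-one-out MLEs."""
    n = len(x)
    loo = np.array([mle_k(np.delete(x, i), c) for i in range(n)])
    return n * mle_k(x, c) - (n - 1) * loo.mean()

# Small MSE comparison in the spirit of the paper's simulation study
c, k_true = 0.5, 1.0
for n in (10, 20, 30, 50):
    reps = [(mle_k(s, c), jackknife_k(s, c))
            for s in (rburr12(n, c, k_true) for _ in range(2000))]
    mle, jack = np.array(reps).T
    print(n, np.mean((mle - k_true) ** 2), np.mean((jack - k_true) ** 2))

# The reliability function then follows as R(t) = (1 + t^c)^(-k_hat)
```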
In this paper, we investigate two stress-strength models (Bounded and Series) in system reliability based on the Generalized Inverse Rayleigh distribution. To obtain shrinkage estimators, Bayesian methods under informative and non-informative prior assumptions are used. The presented methods are compared through Monte Carlo simulation based on the mean squared error criterion.
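A hedged Monte Carlo harness for comparing estimators of R = P(Y < X) by MSE; the plain inverse Rayleigh sampler below stands in for the paper's Generalized Inverse Rayleigh distribution, and the empirical plug-in estimator is a placeholder where the Bayes/shrinkage estimators would go.

```python
import numpy as np

rng = np.random.default_rng(1)

def rinv_rayleigh(n, lam):
    """Inverse Rayleigh sampler via F(x) = exp(-lam / x^2); a simplified
    stand-in for the paper's generalized inverse Rayleigh distribution."""
    u = rng.uniform(size=n)
    return np.sqrt(lam / -np.log(u))

def mc_mse(estimator, lam_x, lam_y, n, reps=2000):
    """Monte Carlo MSE of an estimator of R = P(Y < X)."""
    r_true = lam_x / (lam_x + lam_y)   # closed form for this stand-in model
    est = np.array([estimator(rinv_rayleigh(n, lam_x), rinv_rayleigh(n, lam_y))
                    for _ in range(reps)])
    return np.mean((est - r_true) ** 2)

# Empirical (plug-in) estimator of P(Y < X); the paper's Bayes/shrinkage
# estimators would slot in here instead.
empirical = lambda x, y: np.mean(y[:, None] < x[None, :])

print(mc_mse(empirical, lam_x=2.0, lam_y=1.0, n=30))
```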
In this paper, Azzalini's method is used to find a weighted distribution derived from the standard Pareto distribution of type I (SPDTI) by inserting a shape parameter (θ), so that the resulting distribution covers the interval (0, 1], which is neglected by the standard distribution. The proposed distribution is thus a modification of the Pareto distribution of the first type in which the random variable may also take values within (0, 1]. The properties of the modified weighted Pareto distribution of type I (MWPDTI), namely the probability density function, cumulative distribution function, reliability function, moments, and hazard function, are derived. The behaviour of the probability density function of the MWPDTI distribution is also examined.
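For reference, the standard ingredients behind such a construction, stated generically since the abstract does not reproduce the paper's specific weight function:

```latex
% Standard forms only; the paper's specific weight is not given here.
\[
  f_{\mathrm{SPDTI}}(x) \;=\; \frac{\alpha\, x_m^{\alpha}}{x^{\alpha+1}},
  \qquad x \ge x_m > 0,\ \alpha > 0,
\]
\[
  f_w(x) \;=\; \frac{w(x)\, f(x)}{\mathbb{E}\,[\,w(X)\,]}
  \qquad \text{(weighted distribution)},
\]
\[
  f_Z(x) \;=\; 2\, f(x)\, G(\theta x)
  \qquad \text{(Azzalini's construction for a symmetric } f\text{)}.
\]
```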
The aim of this paper is to translate the basic properties of classical complete normed algebras to complete fuzzy normed algebras. To this end, a proof that multiplication is fuzzy continuous is given. A proof that every fuzzy normed algebra without identity can be embedded into a fuzzy normed algebra with identity, in which it is an ideal, is also given. Moreover, a proof that the resolvent set of a nonzero element in a complete fuzzy normed space is equal to the set of complex numbers is given. Finally, basic properties of the resolvent space of a complete fuzzy normed algebra are given.
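For reference, the classical Banach-algebra notions being translated to the fuzzy setting:

```latex
% Classical definitions (stated for a complete normed algebra A with
% identity e), of which the paper develops fuzzy analogues.
\[
  \rho(x) \;=\; \{\, \lambda \in \mathbb{C} \;:\; (\lambda e - x)^{-1}
  \text{ exists in } A \,\}, \qquad
  \sigma(x) \;=\; \mathbb{C} \setminus \rho(x).
\]
```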
A new distribution, the Epsilon Skew Gamma (ESΓ) distribution, first introduced by Abdulah [1], is applied to near-Gamma data. We first redefine the ESΓ distribution, its properties, and its characteristics; we then estimate its parameters using maximum likelihood and moment estimators. Finally, we use these estimators to fit the data with the ESΓ distribution.
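A brief sketch of the two estimation steps on the plain Gamma special case only (the ESΓ density itself is paper-specific and not reproduced here); the method-of-moments formulas and the scipy ML fit shown apply to the ordinary Gamma distribution.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
data = rng.gamma(shape=2.0, scale=1.5, size=500)   # synthetic near-Gamma data

# Method-of-moments estimates for the plain Gamma case:
# mean = k*theta, var = k*theta^2  =>  k = mean^2/var, theta = var/mean
m, v = data.mean(), data.var(ddof=1)
k_mom, theta_mom = m * m / v, v / m

# Maximum likelihood fit with the location fixed at zero
k_mle, _, theta_mle = stats.gamma.fit(data, floc=0)

print(k_mom, theta_mom, k_mle, theta_mle)
```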
In 2020, one of the researchers in this paper, in his first research, derived the Modified Weighted Pareto Distribution of Type I using the Azzalini method for weighted distributions; the distribution contains three parameters, two for scale and the third for shape. This research compares that distribution with two other distributions from the same family, the Standard Pareto Distribution of Type I and the Generalized Pareto Distribution, using the maximum likelihood estimator derived by the researchers for the Modified Weighted Pareto Distribution of Type I. The Monte Carlo method, one of the simulation approaches for generating random sample data, was then used with different sample sizes (n = 10, 30, 50) and different initial values of the parameters.
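A minimal sketch of the simulation step for the Standard Pareto Distribution of Type I only (the MWPDTI likelihood is paper-specific); the closed-form shape MLE with known scale x_m and the sample sizes from the abstract are used, with other settings chosen arbitrarily.

```python
import numpy as np

rng = np.random.default_rng(3)

def rpareto1(n, alpha, xm=1.0):
    """Standard Pareto type-I sampler: X = xm * U^(-1/alpha)."""
    return xm * rng.uniform(size=n) ** (-1.0 / alpha)

def mle_alpha(x, xm=1.0):
    """Closed-form shape MLE: alpha_hat = n / sum(log(x / xm))."""
    return len(x) / np.sum(np.log(x / xm))

alpha_true = 2.0
for n in (10, 30, 50):
    est = np.array([mle_alpha(rpareto1(n, alpha_true)) for _ in range(2000)])
    print(n, est.mean(), np.mean((est - alpha_true) ** 2))
```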
In this study, a fast block-matching search algorithm based on block descriptors and multilevel block filtering is introduced. The descriptors used are the mean and a set of centralized low-order moments. Hierarchical filtering and the MAE similarity measure are adopted to nominate the best similar blocks lying within the pool of neighbouring blocks. As a next step after nomination, the similarity of the mean and moments is used to classify the nominated blocks into one of three sub-pools, each representing a certain nomination priority level (i.e., most, less, and least). The main reason for introducing the nomination and classification steps is the significant reduction in the number of matching operations on the pixels belonging to the candidate blocks.
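A simplified numpy sketch of the two-level idea: cheap descriptor filtering first, full MAE matching only on nominated blocks. The descriptor scaling, the single tolerance, and the collapse of the three priority sub-pools into one pass are illustrative simplifications, not the paper's exact scheme.

```python
import numpy as np

def descriptors(block):
    """Pixel-unit descriptors: mean, RMS deviation, and the signed cube
    root of the third centralized moment (all on a comparable scale)."""
    m = block.mean()
    d = block - m
    return np.array([m, np.sqrt((d ** 2).mean()), np.cbrt((d ** 3).mean())])

def best_match(target, candidates, desc_tol=8.0):
    """Two-level search: descriptor filtering, then MAE on survivors.

    desc_tol is an illustrative nomination threshold; the paper further
    splits the nominated blocks into three priority sub-pools, which is
    collapsed into a single pass here for brevity.
    """
    t_desc = descriptors(target)
    best, best_mae = None, np.inf
    for idx, cand in enumerate(candidates):
        # Level 1: reject blocks whose descriptors differ too much
        if np.abs(descriptors(cand) - t_desc).max() > desc_tol:
            continue
        # Level 2: full MAE match only for nominated blocks
        mae = np.abs(cand - target).mean()
        if mae < best_mae:
            best, best_mae = idx, mae
    return best, best_mae

# Usage: match a noisy 8x8 block against a pool of neighbour blocks
rng = np.random.default_rng(4)
pool = [rng.integers(0, 256, (8, 8)).astype(float) for _ in range(64)]
target = pool[17] + rng.normal(0, 2, (8, 8))
print(best_match(target, pool))   # expected to recover index 17
```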
The two scale parameters of the Exponential-Rayleigh (ER) distribution were estimated using the maximum likelihood estimation method (MLE) for progressively censored data, applied to real COVID-19 data taken from the Iraqi Ministry of Health and Environment, AL-Karkh General Hospital. The chi-square test was then utilized to determine whether the sample (data) conformed to the Exponential-Rayleigh distribution. A nonlinear membership function (s-function) was employed to find fuzzy numbers for these parameter estimators, and a ranking function was then used to transform the fuzzy numbers into crisp numbers. Finally, the mean square error (MSE) was used to compare the outcomes of the survival function estimates.
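A sketch of the fuzzification and ranking steps, assuming Zadeh's standard S-shaped membership function and a centroid-style ranking; the paper's actual bounds and ranking function are not given in the abstract.

```python
import numpy as np

def s_function(x, a, c):
    """Zadeh's S-shaped membership function with midpoint b = (a + c) / 2.

    Returns membership grades in [0, 1]; a and c mark where the grade
    leaves 0 and reaches 1 (illustrative bounds, not the paper's values).
    """
    x = np.asarray(x, dtype=float)
    b = (a + c) / 2.0
    return np.where(x <= a, 0.0,
           np.where(x <= b, 2.0 * ((x - a) / (c - a)) ** 2,
           np.where(x <= c, 1.0 - 2.0 * ((x - c) / (c - a)) ** 2, 1.0)))

# One common ranking (defuzzification) choice: the membership-weighted
# centroid over a grid of candidate parameter values yields a crisp value.
grid = np.linspace(0.0, 2.0, 401)
mu = s_function(grid, a=0.5, c=1.5)
crisp = np.sum(grid * mu) / np.sum(mu)
print(crisp)
```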