The purpose of this article is to enhance the signal and reduce its noise by studying wavelet transforms and showing how to apply the most effective of them for processing and analysis. Since the discrete wavelet transform was used, we outline several transformation techniques, together with the methodology for applying them to remove noise from the signal based on the threshold value and the threshold functions: the lifting transform, the wavelet transform, and the discrete wavelet packet transform. A comparison was made between them using the AMSE criterion, and the best was selected. When the aforementioned techniques were applied to real price data, it became evident that the lifting transformation method (LIFTINGW) and the discrete wavelet transformation method with a soft threshold function and the SURE threshold value (SURESDW) were the best. Consumer prices serve as the dependent variable for the period 2015–2020, and the average price of a barrel of Iraqi oil serves as the explanatory variable. The methods described above proved effective in estimating the nonparametric regression function for the study model. Paper type: Research paper.
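To make the SURESDW idea above concrete, the following is a minimal Python sketch of soft-threshold wavelet denoising with a SURE-selected threshold, assuming the PyWavelets library, a db4 wavelet, three decomposition levels, and a synthetic series standing in for the price data; none of these settings are taken from the paper itself.

import numpy as np
import pywt

def sure_threshold(d, sigma):
    # Pick the threshold minimising Stein's Unbiased Risk Estimate (SureShrink rule).
    d = np.abs(np.asarray(d)) / sigma
    n = d.size
    candidates = np.sort(d)
    risks = [n - 2 * np.sum(d <= t) + np.sum(np.minimum(d, t) ** 2) for t in candidates]
    return sigma * candidates[int(np.argmin(risks))]

def denoise(signal, wavelet="db4", level=3):
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # Robust noise estimate from the finest detail coefficients.
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    shrunk = [coeffs[0]]  # keep the approximation coefficients untouched
    for d in coeffs[1:]:
        shrunk.append(pywt.threshold(d, sure_threshold(d, sigma), mode="soft"))
    return pywt.waverec(shrunk, wavelet)

# Synthetic monthly series standing in for the real price data.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 72)
noisy = np.sin(2 * np.pi * 3 * t) + 0.3 * rng.standard_normal(t.size)
smoothed = denoise(noisy)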
In this paper, we derive an estimator of the reliability function of the two-parameter Laplace distribution using the Bayes method with a squared error loss function, Jeffreys' formula, and the conditional probability of the observed random variable. The main objective of this study is to assess the efficiency of the derived Bayesian estimator relative to the maximum likelihood and moment estimators of this function, using Monte Carlo simulation under different Laplace distribution parameters and sample sizes. The results show that the Bayes estimator is more efficient than the maximum likelihood estimator and the moment estimator for all sample sizes.
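The Monte Carlo comparison can be organised as in the Python sketch below, given for illustration only: the paper's derived Bayes estimator is not reproduced here, so the skeleton compares just the maximum likelihood and moment estimators of R(t) against the true value by mean squared error, with illustrative parameter values and sample size.

import numpy as np

def reliability(t, mu, b):
    # R(t) = P(X > t) for a Laplace(mu, b) distribution.
    z = (t - mu) / b
    return 1 - 0.5 * np.exp(z) if t < mu else 0.5 * np.exp(-z)

def mle(x):
    mu = np.median(x)             # ML estimate of the location parameter
    b = np.mean(np.abs(x - mu))   # ML estimate of the scale parameter
    return mu, b

def moments(x):
    mu = np.mean(x)               # first sample moment
    b = np.sqrt(np.var(x) / 2.0)  # from Var(X) = 2 b^2
    return mu, b

def mc_mse(n, mu=0.0, b=1.0, t=1.0, reps=5000, seed=1):
    rng = np.random.default_rng(seed)
    true_r = reliability(t, mu, b)
    sq_err = {"mle": [], "moments": []}
    for _ in range(reps):
        x = rng.laplace(mu, b, size=n)
        for name, est in (("mle", mle), ("moments", moments)):
            m, s = est(x)
            sq_err[name].append((reliability(t, m, s) - true_r) ** 2)
    return {name: np.mean(errs) for name, errs in sq_err.items()}

print(mc_mse(n=30))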
To obtain good estimates with more accurate results, we must choose an appropriate estimation method. Most of the equations in classical methods are nonlinear, and finding analytical solutions to such equations is very difficult; some estimators are inefficient because of problems in solving these equations. In this paper, we estimate the survival function of censored data using one of the most important artificial intelligence algorithms, the genetic algorithm, to obtain optimal estimates of the parameters of the two-parameter Weibull distribution, which in turn leads to optimal estimates of the survival function. The genetic algorithm is employed with the method of moments, the least squares method, and the weighted …
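As a rough illustration of the idea, the Python sketch below runs a minimal genetic algorithm (truncation selection, blend crossover, Gaussian mutation) to minimise a least-squares distance between the two-parameter Weibull survival function and an empirical survival curve. The encoding, GA settings, objective, and the censoring-free synthetic data are all assumptions, not the paper's scheme.

import numpy as np

rng = np.random.default_rng(2)

def weibull_survival(t, k, lam):
    return np.exp(-((t / lam) ** k))

def objective(params, t_sorted, s_emp):
    k, lam = params
    if k <= 0 or lam <= 0:
        return np.inf
    return np.sum((weibull_survival(t_sorted, k, lam) - s_emp) ** 2)

def genetic_algorithm(fitness, bounds, pop_size=60, generations=200, mut=0.1):
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    pop = rng.uniform(lo, hi, size=(pop_size, len(bounds)))
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        parents = pop[np.argsort(scores)[: pop_size // 2]]   # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            w = rng.uniform(size=len(bounds))
            child = w * a + (1 - w) * b                       # blend crossover
            child += mut * (hi - lo) * rng.standard_normal(len(bounds))  # mutation
            children.append(np.clip(child, lo, hi))
        pop = np.vstack([parents, children])
    scores = np.array([fitness(ind) for ind in pop])
    return pop[int(np.argmin(scores))]

# Censoring-free synthetic example: fit to the empirical survival function.
data = np.sort(rng.weibull(1.8, size=100) * 2.5)              # true k=1.8, lam=2.5
s_emp = 1 - (np.arange(1, data.size + 1) - 0.375) / (data.size + 0.25)
k_hat, lam_hat = genetic_algorithm(lambda p: objective(p, data, s_emp),
                                   bounds=[(0.1, 10.0), (0.1, 10.0)])
survival_hat = weibull_survival(data, k_hat, lam_hat)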
Statistical research is lacking on time series models in which (p) exogenous input variables act as causes of a phenomenon and yield (q) output variables as results, forming a conceptual framework similar to classical linear regression, which studies the relationship between a dependent variable and explanatory variables. This highlights the importance of providing such research for a full analysis of this kind of phenomenon, here consumer price inflation in Iraq. Several influential variables with a direct connection to the phenomenon were taken and analysed after treating the problem of outliers in the observations by the (EM) approach and expanding the sample size (n = 36) to …
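One common way to fit this kind of input-output time series model in Python is an ARMA specification with exogenous regressors, as sketched below with statsmodels; the order, the two illustrative inputs, and the synthetic data are assumptions, and the EM-based outlier treatment described above is not reproduced.

import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(3)
n = 36  # the sample size mentioned in the abstract
exog = pd.DataFrame({"x1": rng.normal(size=n), "x2": rng.normal(size=n)})
y = 0.8 * exog["x1"] - 0.5 * exog["x2"] + rng.normal(scale=0.3, size=n)

# ARMA(1,1) errors plus exogenous inputs (an ARMAX-style specification).
model = SARIMAX(y, exog=exog, order=(1, 0, 1))
result = model.fit(disp=False)
print(result.summary())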
The aim of this study is to propose reliable equations for estimating the in-situ concrete compressive strength from non-destructive tests. Three equations were proposed: the first considers the rebound hammer number only, the second considers the ultrasonic pulse velocity only, and the third combines the rebound hammer number and the ultrasonic pulse velocity. The proposed equations were derived by non-linear regression analysis and calibrated with the test results of 372 concrete specimens compiled from the literature. The performance of the proposed equations was tested by comparing their strength estimations with those of related existing equations from the literature. Comparisons …
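The third, combined equation can be pictured as a SonReb-type power law f_c = a·R^b·V^c calibrated by non-linear regression, as in the Python sketch below; the functional form, parameter names, and synthetic data are assumptions for illustration and are not the paper's calibrated equations.

import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(4)
R = rng.uniform(20, 55, size=120)    # rebound hammer number
V = rng.uniform(3.2, 5.0, size=120)  # ultrasonic pulse velocity (km/s)
fc = 0.02 * R**1.2 * V**1.8 * rng.normal(1.0, 0.08, size=120)  # "measured" strength

def sonreb(X, a, b, c):
    R, V = X
    return a * R**b * V**c

params, _ = curve_fit(sonreb, (R, V), fc, p0=[0.01, 1.0, 1.0])
pred = sonreb((R, V), *params)
rmse = np.sqrt(np.mean((pred - fc) ** 2))
print(params, rmse)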
Logistic regression is one of the statistical methods used to describe and estimate the relationship between a random response variable (Y) and explanatory variables (X), under assumptions that include homogeneity of variance. The dependent variable is a binary response taking two values (one when a specific event occurs and zero when it does not), such as (injured and uninjured, married and unmarried). A large number of explanatory variables leads to the problem of multicollinearity, which makes the estimates inaccurate. The maximum likelihood method and the ridge regression method were used to estimate the binary logistic regression model by adopting the Jackknife …
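Reading the abstract's ridge method and truncated "Jackknife" reference as an L2-penalised logistic regression refitted in a leave-one-out loop, the Python sketch below shows one way such an estimator can be assembled to obtain jackknife coefficient estimates and standard errors. The penalty strength, data, and this reading of the abstract are assumptions.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(5)
n, p = 80, 6
X = rng.normal(size=(n, p))
X[:, 1] = X[:, 0] + 0.05 * rng.normal(size=n)  # induce multicollinearity
y = (X[:, 0] - 0.5 * X[:, 2] + rng.normal(size=n) > 0).astype(int)

ridge_logit = LogisticRegression(penalty="l2", C=1.0, max_iter=1000)

# Jackknife: refit the ridge-penalised model leaving one observation out at a time.
coefs = []
for i in range(n):
    keep = np.arange(n) != i
    coefs.append(ridge_logit.fit(X[keep], y[keep]).coef_.ravel().copy())
coefs = np.array(coefs)

jack_mean = coefs.mean(axis=0)
jack_se = np.sqrt((n - 1) / n * np.sum((coefs - jack_mean) ** 2, axis=0))
print(jack_mean, jack_se)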
Regression testing is a crucial phase in the software development lifecycle that ensures new changes or updates to the software system do not introduce defects or adversely affect existing functionality. However, as software systems grow in complexity, the number of test cases in the regression suite can become large, resulting in longer testing time and greater resource consumption. In addition, the presence of redundant and faulty test cases may reduce the efficiency of the regression testing process. Therefore, this paper presents a new Hybrid Framework to Exclude Similar & Faulty Test Cases in Regression Testing (ETCPM) that utilizes automated code analysis techniques and historical test execution data to …
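The details of the ETCPM framework are cut off above, so the Python sketch below only illustrates one plausible building block of similarity- and history-based test reduction: dropping test cases whose coverage is nearly identical (by Jaccard similarity) to an already kept test and excluding tests with a poor execution history. The thresholds, helper names, and data structures are assumptions, not the framework's actual design.

from typing import Dict, List, Set

def jaccard(a: Set[str], b: Set[str]) -> float:
    return len(a & b) / len(a | b) if a | b else 0.0

def reduce_suite(coverage: Dict[str, Set[str]],
                 flaky_failures: Dict[str, int],
                 sim_threshold: float = 0.9,
                 max_flaky: int = 3) -> List[str]:
    kept: List[str] = []
    # Consider tests with the broadest coverage first.
    for test, covered in sorted(coverage.items(), key=lambda kv: -len(kv[1])):
        if flaky_failures.get(test, 0) > max_flaky:
            continue  # exclude historically faulty/flaky tests
        if any(jaccard(covered, coverage[k]) >= sim_threshold for k in kept):
            continue  # exclude near-duplicate coverage
        kept.append(test)
    return kept

coverage = {
    "test_login": {"auth.py:10", "auth.py:22", "db.py:5"},
    "test_login_copy": {"auth.py:10", "auth.py:22", "db.py:5"},
    "test_report": {"report.py:7", "db.py:5"},
}
print(reduce_suite(coverage, flaky_failures={"test_report": 5}))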
Cancer is one of the dangerous diseases that afflict a person through damage to the cells and tissues of the body. People of any age group are vulnerable to it, and it is not easy to control its multiplication among cells and its spread through the body. Despite the great progress in medical studies concerned with this disease, the options available to those who have it are few and difficult, as they require significant financial costs for health services and for treatment that is difficult to provide.
This study dealt with the determinants of liver cancer, relying on data on cancerous tumours taken from the Iraqi Center for Oncology at the Ministry of Health in 2017. Survival analysis has been used as a m…
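The abstract is truncated before the survival model is named, so the Python sketch below shows only a generic Kaplan-Meier fit for right-censored durations using the lifelines package, with synthetic data standing in for the registry records.

import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(6)
durations = rng.exponential(scale=24.0, size=150)  # months until the event
censor_time = rng.uniform(6, 36, size=150)         # administrative censoring times
observed = durations <= censor_time                # event indicator
times = np.minimum(durations, censor_time)

kmf = KaplanMeierFitter()
kmf.fit(times, event_observed=observed, label="synthetic cohort")
print(kmf.survival_function_.head())
print(kmf.median_survival_time_)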
In this research, a simple experiment in the field of agriculture was studied in terms of the effect of out-of-control noise arising from several causes, including the effect of environmental conditions on the observations of agricultural experiments. Discrete wavelet transforms were used, specifically the Coiflet transforms of order 1 to 2 and the Daubechies transforms of order 2 to 3, based on two levels of transform, (J-4) and (J-5), applying the hard, soft, and non-negative threshold rules and comparing the wavelet transformation methods using real data from an experiment of 26 observations. The application was carried out with a program written in MATLAB. The researcher concluded that …
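The original program was written in MATLAB; as a language-neutral illustration of the same comparison idea, the Python/PyWavelets sketch below denoises a synthetic signal with coif1 and db2 wavelets under hard and soft threshold rules and scores each combination by mean squared error. The signal, decomposition level, and universal threshold are assumptions rather than the study's settings.

import numpy as np
import pywt

rng = np.random.default_rng(7)
t = np.linspace(0, 1, 128)
clean = np.piecewise(t, [t < 0.5, t >= 0.5], [lambda x: 2 * x, lambda x: 1.5 - x])
noisy = clean + 0.1 * rng.standard_normal(t.size)

results = {}
for wavelet in ("coif1", "db2"):
    for mode in ("hard", "soft"):
        coeffs = pywt.wavedec(noisy, wavelet, level=4)
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # robust noise estimate
        thr = sigma * np.sqrt(2 * np.log(noisy.size))         # universal threshold
        shrunk = [coeffs[0]] + [pywt.threshold(d, thr, mode=mode) for d in coeffs[1:]]
        rec = pywt.waverec(shrunk, wavelet)[: noisy.size]
        results[(wavelet, mode)] = np.mean((rec - clean) ** 2)

best = min(results, key=results.get)
print(results, "best:", best)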