Hazard Rate Estimation Using Varying Kernel Function for Censored Data Type I

In this research, several estimators of the hazard function are introduced. They are based on a nonparametric method, namely kernel estimation for type I censored data with varying bandwidth and boundary kernels. Two types of bandwidth are used, local and global, and four boundary kernels are used, namely Rectangle, Epanechnikov, Biquadratic and Triquadratic; the proposed function was employed with all of these kernel functions. Two different simulation techniques are used in two experiments to compare the estimators. In most cases the results show that the local bandwidth is best for all types of boundary kernel, and suggest that the 2xRectangle and 2xEpanechnikov methods give the best results compared with the other estimators.
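
The abstract does not reproduce the estimator itself, so the following is only a minimal sketch under stated assumptions: a kernel-smoothed Nelson-Aalen hazard estimate for right-censored (type I) data with a single global bandwidth and an Epanechnikov kernel, on simulated data. The local-bandwidth and boundary-kernel variants compared in the paper are not implemented here.

```python
# Minimal sketch: kernel-smoothed Nelson-Aalen hazard estimate for type I
# (right-) censored data with a global bandwidth and an Epanechnikov kernel.
import numpy as np

rng = np.random.default_rng(1)
n, censor_time = 200, 2.0
lifetimes = rng.exponential(scale=1.5, size=n)
times = np.minimum(lifetimes, censor_time)          # observed times
delta = (lifetimes <= censor_time).astype(float)    # 1 = event, 0 = censored

order = np.argsort(times)
t_sorted, d_sorted = times[order], delta[order]
at_risk = n - np.arange(n)                          # number at risk at each ordered time
increments = d_sorted / at_risk                     # Nelson-Aalen jump sizes

def epanechnikov(u):
    return np.where(np.abs(u) <= 1, 0.75 * (1 - u**2), 0.0)

def hazard(t, bandwidth=0.3):
    """Kernel-smoothed hazard: kernel-weighted sum of Nelson-Aalen increments."""
    u = (t - t_sorted) / bandwidth
    return np.sum(epanechnikov(u) * increments) / bandwidth

grid = np.linspace(0.05, censor_time, 50)
h_hat = np.array([hazard(t) for t in grid])
print(h_hat[:5])   # true hazard of the Exp(scale=1.5) lifetimes is 1/1.5 ≈ 0.667
```
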

Publication Date: Fri Jan 01 2021
Journal Name: International Journal of Agricultural and Statistical Sciences
A novel SVR estimation of the FIGARCH model and forecasting for white oil data in Iraq

The purpose of this paper is to model and forecast white oil prices over the period 2012-2019 using GARCH-class volatility models. After showing that the squared returns of white oil exhibit significant long memory in volatility, fractional GARCH models for the return series are estimated and their mean and volatility forecasted by quasi maximum likelihood (QML) as the traditional method, while the competing approach applies machine learning through Support Vector Regression (SVR). The most appropriate forecasting model among the candidates is selected according to the lowest Akaike and Schwarz information criteria, together with the significance of the parameters. In addition, the residuals …

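As a rough illustration of the machine-learning side of the comparison, the sketch below fits scikit-learn's SVR to lagged squared returns as a simple volatility proxy; the FIGARCH/QML benchmark is not reproduced, and the series is simulated rather than the Iraqi white-oil data.

```python
# Minimal sketch (not the paper's exact setup): Support Vector Regression on
# lagged squared returns as a simple volatility proxy, on a simulated series.
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(2)
returns = rng.standard_t(df=5, size=500)            # heavy-tailed simulated returns
vol_proxy = returns**2                              # squared returns as volatility proxy

p = 5                                               # number of lags used as features
X = np.column_stack([vol_proxy[i:len(vol_proxy) - p + i] for i in range(p)])
y = vol_proxy[p:]

split = int(0.8 * len(y))
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
model.fit(X[:split], y[:split])

forecast = model.predict(X[split:])
rmse = np.sqrt(np.mean((forecast - y[split:]) ** 2))
print(f"out-of-sample RMSE of the SVR volatility-proxy forecast: {rmse:.3f}")
```
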
Publication Date: Sat Sep 11 2010
Journal Name: Journal of Al-Nahrain University
ESTIMATION OF LAP ACTIVITY IN PATIENTS WITH TYPE 2 DIABETES BY USING LEUCINE AMIDE AS SUBSTRATE

This study was performed on 50 serum specimens from patients with type 2 diabetes; in addition, 50 specimens from normal subjects were investigated as a control group. The LAP activity was (560.46 ± 10.504) I.U/L in patients and (10.58 ± 4.39) I.U/L in healthy subjects. The results reveal that leucine aminopeptidase (LAP) activity in the serum of type 2 diabetes patients shows a highly significant increase (p < 0.001) compared to healthy subjects. In addition, leucine amide was prepared as a substrate for LAP and identified by its melting point and FTIR spectra.
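
A minimal sketch of the kind of two-sample comparison reported above, using simulated values centred on the quoted figures; it is illustrative only and not the study's actual analysis.

```python
# Minimal sketch: a Welch two-sample t-test comparing LAP activity between
# patients and controls.  Values are simulated around the reported figures.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
patients = rng.normal(loc=560.46, scale=10.504, size=50)   # simulated I.U/L
controls = rng.normal(loc=10.58, scale=4.39, size=50)      # simulated I.U/L

t_stat, p_value = stats.ttest_ind(patients, controls, equal_var=False)
print(f"t = {t_stat:.1f}, p = {p_value:.3g}")               # p << 0.001, as reported
```
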

Publication Date: Thu Jun 01 2017
Journal Name: Journal of Economics and Administrative Sciences
Comparison of two estimation methods for a nonparametric function of clustered data on white blood cells of leukemia patients

 

Abstract:

Clustered data are seen in the social, health and behavioral sciences; this type of data has a link between its observations, and the clusters can be expressed through the relationship between measurements on units within the same group.

In this research, I estimate the reliability function for clustered data by using the seemingly unrelated …

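The abstract is truncated; assuming it refers to seemingly unrelated regression (SUR), the sketch below shows a minimal two-equation feasible-GLS fit on simulated data. The link to the reliability function for the white-blood-cell clusters is not reproduced.

```python
# Minimal sketch, assuming seemingly unrelated regression (SUR): a
# two-equation feasible-GLS estimator on simulated correlated-error data.
import numpy as np

rng = np.random.default_rng(4)
n = 100
x1, x2 = rng.normal(size=n), rng.normal(size=n)
# errors correlated across the two equations (this is what SUR exploits)
e = rng.multivariate_normal([0, 0], [[1.0, 0.7], [0.7, 1.0]], size=n)
y1 = 1.0 + 2.0 * x1 + e[:, 0]
y2 = -0.5 + 1.5 * x2 + e[:, 1]

X1 = np.column_stack([np.ones(n), x1])
X2 = np.column_stack([np.ones(n), x2])

# Step 1: equation-by-equation OLS to estimate the cross-equation covariance
b1 = np.linalg.lstsq(X1, y1, rcond=None)[0]
b2 = np.linalg.lstsq(X2, y2, rcond=None)[0]
resid = np.column_stack([y1 - X1 @ b1, y2 - X2 @ b2])
sigma = resid.T @ resid / n                      # 2x2 cross-equation covariance

# Step 2: feasible GLS on the stacked system with Omega = sigma kron I_n
X = np.block([[X1, np.zeros_like(X2)], [np.zeros_like(X1), X2]])
y = np.concatenate([y1, y2])
omega_inv = np.kron(np.linalg.inv(sigma), np.eye(n))
beta_sur = np.linalg.solve(X.T @ omega_inv @ X, X.T @ omega_inv @ y)
print(beta_sur)                                  # [b10, b11, b20, b21]
```
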
Publication Date: Fri Jan 01 2021
Journal Name: Annals of Pure and Applied Mathematics
Linear Regression Model Using Bayesian Approach for Iraqi Unemployment Rate

In this paper we used frequentist and Bayesian approaches to the linear regression model to predict future observations of unemployment rates in Iraq. Parameters are estimated using the ordinary least squares method for the frequentist approach and the Markov Chain Monte Carlo (MCMC) method for the Bayesian approach. Calculations are done using the R program. The analysis showed that the linear regression model under the Bayesian approach is better and can be used as an alternative to the frequentist approach. Two criteria, the root mean square error (RMSE) and the median absolute deviation (MAD), were used to compare the performance of the estimates. The results obtained show that unemployment rates will continue to increase over the next two decades …

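A minimal sketch of the comparison described above, not the paper's R code: ordinary least squares versus a Bayesian fit obtained with a simple random-walk Metropolis sampler, scored by RMSE and MAD on simulated unemployment-like data.

```python
# Minimal sketch: OLS versus a Bayesian linear regression fitted by a
# random-walk Metropolis sampler (flat priors), compared by RMSE and MAD.
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(40, dtype=float)                     # hypothetical yearly index
y = 8.0 + 0.15 * t + rng.normal(0, 0.5, t.size)    # simulated unemployment rate (%)
X = np.column_stack([np.ones_like(t), t])

# Frequentist: ordinary least squares
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# Bayesian: random-walk Metropolis on (b0, b1, log sigma) with flat priors
def log_post(theta):
    b0, b1, log_s = theta
    s = np.exp(log_s)
    resid = y - (b0 + b1 * t)
    return -y.size * np.log(s) - 0.5 * np.sum(resid**2) / s**2

theta, samples = np.array([y.mean(), 0.0, 0.0]), []
lp = log_post(theta)
for _ in range(20000):
    prop = theta + rng.normal(0, [0.1, 0.005, 0.05])
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta.copy())
post = np.array(samples[5000:])                    # discard burn-in
beta_bayes = post[:, :2].mean(axis=0)              # posterior means

def rmse(pred): return np.sqrt(np.mean((y - pred) ** 2))
def mad(pred):  return np.median(np.abs(y - pred))

for name, b in [("OLS", beta_ols), ("Bayes", beta_bayes)]:
    pred = X @ b
    print(f"{name}: RMSE={rmse(pred):.3f}  MAD={mad(pred):.3f}")
```
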
Publication Date: Fri Jan 31 2020
Journal Name: Iraqi Geological Journal
ESTIMATION OF SHEAR WAVE VELOCITY FROM WIRELINE LOGS DATA FOR AMARA OILFIELD, MISHRIF FORMATION, SOUTHERN IRAQ

Shear wave velocity is an important feature in seismic exploration that can be utilized in reservoir characterization and development strategy. Its applications in petrophysics, seismics and geomechanics for predicting the elastic and inelastic properties of rock are essential elements of stability and fracturing orientation and of identifying matrix minerals and gas-bearing formations. However, shear wave velocity is usually obtained from core analysis, which is an expensive and time-consuming process, and the dipole sonic imager tool is not commonly available in all wells. In this study, a statistical method is presented to predict shear wave velocity from wireline log data. The model concentrates on predicting shear wave velocity from …

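As an illustration of a statistical prediction of shear wave velocity from log data, the sketch below fits a least-squares line of Vs against Vp on synthetic values; the paper's actual model is fitted to Mishrif Formation wireline logs and may use different predictors.

```python
# Minimal sketch (not the paper's fitted model): least-squares regression
# predicting shear wave velocity (Vs) from compressional velocity (Vp).
import numpy as np

rng = np.random.default_rng(5)
vp = rng.uniform(3000.0, 5500.0, size=300)                 # m/s, synthetic "log" values
vs_true = 0.55 * vp - 400.0                                # assumed linear trend
vs = vs_true + rng.normal(0.0, 80.0, size=vp.size)         # add measurement scatter

slope, intercept = np.polyfit(vp, vs, deg=1)               # least-squares fit
vs_pred = slope * vp + intercept
r2 = 1 - np.sum((vs - vs_pred) ** 2) / np.sum((vs - vs.mean()) ** 2)
print(f"Vs ≈ {slope:.3f}*Vp {intercept:+.1f}   (R² = {r2:.3f})")
```
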
Publication Date: Wed Oct 17 2018
Journal Name: Journal of Economics and Administrative Sciences
New Robust Estimation in Compound Exponential Weibull-Poisson Distribution for both contaminated and non-contaminated Data

Abstract

The research compares two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the Downhill Simplex algorithm. Two data cases are considered, the first assuming the original (non-contaminated) data and the second assuming data contamination. Simulation experiments were conducted for different sample sizes, initial parameter values and levels of contamination. The Downhill Simplex algorithm was found to be the best method for estimating the parameters, the probability function and the reliability function of the compound distribution in cases of both natural and contaminated data.

 

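The compound exponential Weibull-Poisson density is not given in the excerpt, so the sketch below uses a plain two-parameter Weibull likelihood as a stand-in to show the Downhill Simplex (Nelder-Mead) estimation step, including a contaminated sample.

```python
# Minimal sketch: maximum-likelihood fitting with the Downhill Simplex
# (Nelder-Mead) algorithm via scipy, on a Weibull stand-in likelihood.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(6)
clean = rng.weibull(a=1.8, size=200) * 2.0            # shape 1.8, scale 2.0
outliers = rng.uniform(15.0, 25.0, size=10)           # ~5% contamination
data = np.concatenate([clean, outliers])

def negloglik(params, x):
    log_k, log_lam = params                           # log-parameterization keeps k, lam > 0
    k, lam = np.exp(log_k), np.exp(log_lam)
    z = x / lam
    return -np.sum(np.log(k) - np.log(lam) + (k - 1) * np.log(z) - z**k)

res = minimize(negloglik, x0=[0.0, 0.0], args=(data,), method="Nelder-Mead")
k_hat, lam_hat = np.exp(res.x)
print(f"shape ≈ {k_hat:.2f}, scale ≈ {lam_hat:.2f} (true 1.8 and 2.0 before contamination)")
```
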
Publication Date: Mon Jun 05 2023
Journal Name: Journal of Engineering
Exact Stiffness Matrix for Nonprismatic Beams with Parabolic Varying Depth

In this paper, an exact stiffness matrix and fixed-end load vector for nonprismatic beams having parabolically varying depth are derived. The principle of strain energy is used in the derivation of the stiffness matrix, and the effects of both shear deformation and the coupling between axial force and bending moment are considered. The fixed-end load vector for elements under uniformly distributed or concentrated loads is also derived. The correctness of the derived matrices is verified by numerical examples. It is found that the coupling effect between axial force and bending moment is significant for elements having axial end restraint, and that the decrease in bending moment was in the …

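A minimal numerical sketch of the idea, not the paper's closed-form derivation: the rotational stiffness of a beam with parabolically varying depth obtained by integrating unit-moment flexibility coefficients and inverting. Shear deformation and axial-bending coupling, which the paper includes, are ignored, and all dimensions are illustrative.

```python
# Minimal numerical sketch: rotational stiffness terms of a beam whose depth
# varies parabolically, via unit-moment flexibility integrals and inversion.
import numpy as np
from scipy.integrate import quad

E, L, b = 200e9, 6.0, 0.3            # Young's modulus (Pa), span (m), width (m)
h1, h2 = 0.4, 0.8                    # end depths (m)

def depth(x):
    # parabolic depth variation from h1 at x = 0 to h2 at x = L
    return h1 + (h2 - h1) * (x / L) ** 2

def I(x):
    return b * depth(x) ** 3 / 12.0  # second moment of area of the rectangular section

# bending-moment diagrams for unit moments applied at each end (simply supported)
m = [lambda x: 1.0 - x / L, lambda x: x / L]

F = np.array([[quad(lambda x: m[i](x) * m[j](x) / (E * I(x)), 0.0, L)[0]
               for j in range(2)] for i in range(2)])   # flexibility (rad per N*m)
K = np.linalg.inv(F)                                     # rotational stiffness (N*m per rad)
print(K)
# Sanity check: for a prismatic beam (h1 == h2) this reproduces
# (E*I/L) * [[4, -2], [-2, 4]] in this sign convention.
```
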
Publication Date: Fri Jul 01 2016
Journal Name: Journal of Economics and Administrative Sciences
Comparison of some wavelet estimation methods for a nonparametric regression function with a response variable missing at random

Abstract

The problem of missing data represents a major obstacle for researchers in the process of data analysis, since it recurs in all fields of study, including social, medical, astronomical and clinical experiments.

The presence of such a problem in the data under study may negatively influence the analysis and lead to misleading conclusions, since these conclusions result from the large bias that the problem causes. Despite the efficiency of wavelet methods, they too are affected by missing data, in addition to the problem's impact on the accuracy of estimation …

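A minimal sketch, assuming a simple impute-then-smooth strategy rather than the specific estimators the paper compares: responses missing at random are filled by linear interpolation and the curve is then estimated by soft-thresholded wavelet shrinkage using the PyWavelets package.

```python
# Minimal sketch: wavelet smoothing of a nonparametric regression curve after
# filling responses that are missing at random by linear interpolation.
import numpy as np
import pywt

rng = np.random.default_rng(7)
n = 256
x = np.linspace(0.0, 1.0, n)
y = np.sin(4 * np.pi * x) + rng.normal(0.0, 0.3, n)      # noisy test function

missing = rng.random(n) < 0.2                            # ~20% responses missing at random
y_obs = y.copy()
y_obs[missing] = np.nan

# crude imputation of the missing responses before smoothing
y_filled = np.interp(x, x[~missing], y_obs[~missing])

# discrete wavelet decomposition, universal soft threshold, reconstruction
coeffs = pywt.wavedec(y_filled, "db4", mode="periodization", level=4)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745           # noise scale from finest level
thr = sigma * np.sqrt(2 * np.log(n))
coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
y_hat = pywt.waverec(coeffs, "db4", mode="periodization")[:n]

rmse = np.sqrt(np.mean((y_hat - np.sin(4 * np.pi * x)) ** 2))
print(f"RMSE of the wavelet estimate: {rmse:.3f}")
```
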
Publication Date: Tue Dec 01 2020
Journal Name: Journal of Economics and Administrative Sciences
Using Kernel Density Estimator To Determine the Limits of Multivariate Control Charts.

Quality control is an effective statistical tool for controlling productivity, monitoring manufactured products and confirming that they conform to the standard qualities and certified criteria for products and services; its main purpose is to keep pace with production and industrial development in a competitive business market. Quality control charts are used to monitor the qualitative properties of production processes and to detect abnormal deviations in them. The multivariate Kernel Density Estimator control chart method was used, which is one of the nonparametric methods that does not require any assumptions regarding the distribution of …

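A minimal sketch of a kernel-density control limit for multivariate data: the density is estimated from in-control (Phase I) observations and new points whose estimated density falls below an empirical quantile are signalled. The data and the 1% limit are illustrative, not the paper's application.

```python
# Minimal sketch: a multivariate control limit built from a kernel density
# estimate of in-control (Phase I) data, with no distributional assumption.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(8)
phase1 = rng.multivariate_normal([0, 0], [[1.0, 0.6], [0.6, 1.0]], size=500)

kde = gaussian_kde(phase1.T)                       # gaussian_kde expects shape (d, n)
dens_phase1 = kde(phase1.T)
lcl = np.quantile(dens_phase1, 0.01)               # density control limit (~1% false alarms)

new_points = np.array([[0.2, -0.1],                # typical, in-control point
                       [4.0, -3.5]])               # unusual point
signals = kde(new_points.T) < lcl
print(signals)                                     # [False  True]
```
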
Publication Date: Thu Feb 01 2024
Journal Name: Baghdad Science Journal
Estimating the Parameters of Exponential-Rayleigh Distribution for Progressively Censored Data with S-Function about COVID-19

The two parameters of the Exponential-Rayleigh distribution were estimated using the maximum likelihood estimation method (MLE) for progressively censored data. Estimated values of these two scale parameters were found using real COVID-19 data taken from the Iraqi Ministry of Health and Environment, AL-Karkh General Hospital, and the Chi-square test was then utilized to determine whether the sample data conformed to the Exponential-Rayleigh (ER) distribution. The nonlinear membership function (S-function) was employed to find fuzzy numbers for these parameter estimators, and the ranking function was then used to transform the fuzzy numbers into crisp numbers. Finally, mean square error (MSE) was used to compare the outcomes of the survival …

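A minimal sketch of censored maximum-likelihood estimation for the two parameters, assuming the linear-hazard form h(t) = λ + θt (so S(t) = exp(−λt − θt²/2)), which may differ from the paper's parameterization; simple right censoring on simulated data stands in for the progressive censoring scheme and the COVID-19 records, and the fuzzy S-function step is not shown.

```python
# Minimal sketch: MLE of the two Exponential-Rayleigh parameters from censored
# data, assuming hazard h(t) = lam + theta*t, i.e. S(t) = exp(-lam*t - theta*t**2/2).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(9)
lam_true, theta_true, n = 0.05, 0.02, 300

# simulate via inverse transform: solve lam*t + theta*t^2/2 = -log(U)
u = rng.uniform(size=n)
t_event = (-lam_true + np.sqrt(lam_true**2 + 2 * theta_true * (-np.log(u)))) / theta_true
c = rng.uniform(5.0, 25.0, size=n)                  # censoring times
t = np.minimum(t_event, c)
delta = (t_event <= c).astype(float)                # 1 = observed event, 0 = censored

def negloglik(params):
    lam, theta = np.exp(params)                     # keep both parameters positive
    log_h = np.log(lam + theta * t)                 # log hazard at observed times
    log_S = -(lam * t + theta * t**2 / 2)           # log survival
    return -np.sum(delta * log_h + log_S)           # censored log-likelihood

res = minimize(negloglik, x0=np.log([0.1, 0.1]), method="Nelder-Mead")
lam_hat, theta_hat = np.exp(res.x)
print(f"lambda ≈ {lam_hat:.3f}, theta ≈ {theta_hat:.3f} (true {lam_true}, {theta_true})")
```
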