Nadaraya-Watson Estimation of a Circular Regression Model on Peak Systolic Blood Pressure Data

Purpose: The research aims to estimate models representing phenomena that follow the logic of circular (angular) data, accounting for the 24-hour periodicity in measurement. Theoretical framework: The regression model is developed to account for the periodic nature of the circular scale, considering the periodicity in the dependent variable y, the explanatory variables x, or both. Design/methodology/approach: Two estimation methods were applied: a parametric model, represented by the Simple Circular Regression (SCR) model, and a nonparametric model, represented by the Nadaraya-Watson Circular Regression (NW) model. The analysis used real data from 50 patients at Al-Kindi Teaching Hospital in Baghdad. Findings: The Mean Circular Error (MCE) criterion was used to compare the two models, leading to the conclusion that the Nadaraya-Watson (NW) circular model outperformed the parametric model in estimating the parameters of the circular regression model. Research, Practical & Social Implications: The recommendation emphasized using the Nadaraya-Watson nonparametric smoothing method to capture the nonlinearity in the data. Originality/value: The results indicated that the Nadaraya-Watson circular model (NW) outperformed the parametric model. Paper type: Research paper.
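The Nadaraya-Watson idea extends naturally to angular data: the fitted value at an angle t is a weighted circular mean of the responses, with von Mises weights that decay with angular distance from t. A minimal illustrative sketch of this estimator and of the MCE comparison criterion (variable names and the smoothing parameter kappa are assumptions, not taken from the paper):

```python
import numpy as np

def nw_circular(theta_x, theta_y, grid, kappa=2.0):
    """Nadaraya-Watson estimate of a circular response on a circular
    predictor, using a von Mises kernel (illustrative sketch only)."""
    est = []
    for t in grid:
        w = np.exp(kappa * np.cos(theta_x - t))    # von Mises weights around t
        s = np.sum(w * np.sin(theta_y)) / np.sum(w)
        c = np.sum(w * np.cos(theta_y)) / np.sum(w)
        est.append(np.arctan2(s, c))               # weighted circular mean
    return np.array(est)

def mce(y, y_hat):
    """Mean Circular Error: mean of 1 - cos(residual), wrap-safe."""
    return np.mean(1.0 - np.cos(y - y_hat))
```

Because the error is measured through a cosine, MCE is insensitive to the 2π wrap-around, which is what makes it a suitable criterion for comparing circular models.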

Publication Date
Thu Apr 27 2017
Journal Name
Ibn Al-haitham Journal For Pure And Applied Sciences
Reduce Waiting Times in the Multiple Server queuing model (M, M, C) (FCFS, ∞, ∞) "Model Proposal"

The research aims to propose a plan to reduce waiting times in the multiple-server queuing model (M, M, C) (FCFS, ∞, ∞). The plan acts mainly on the arrival rate (λ): by reducing the arrival rate per service channel, the overall waiting time in the system should decrease. The research has two sections. The first deals with the theory, establishing the proposed method in mathematical equations; the second addresses the practical goal of applying the proposed method and comparing it with the traditional way of calculating the performance measures of this model.
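For reference, the steady-state mean waiting time Wq of the M/M/C (FCFS, ∞, ∞) model follows from the standard Erlang-C formulas, which also make visible why lowering the arrival rate λ lowers the overall waiting time. A generic sketch (the numerical values in the usage note are illustrative, not the paper's data):

```python
import math

def mmc_wq(lam, mu, c):
    """Mean time in queue Wq for the M/M/c (FCFS, inf, inf) model,
    via the standard Erlang-C formulas."""
    r = lam / mu                      # offered load
    rho = r / c                       # server utilisation, must be < 1
    if rho >= 1.0:
        raise ValueError("unstable system: lambda >= c * mu")
    p0 = 1.0 / (sum(r**n / math.factorial(n) for n in range(c))
                + r**c / (math.factorial(c) * (1.0 - rho)))
    lq = p0 * r**c * rho / (math.factorial(c) * (1.0 - rho) ** 2)
    return lq / lam                   # Little's law: Wq = Lq / lambda
```

With illustrative rates, `mmc_wq(8.0, 2.0, 5) < mmc_wq(9.0, 2.0, 5)`: reducing λ while holding μ and C fixed shortens the queue, which is the mechanism the proposed plan exploits.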
Publication Date
Sun Jan 30 2022
Journal Name
Iraqi Journal Of Science
A Survey on Blind De-Blurring of Digital Image

Nowadays, huge numbers of digital images are used and transferred via the Internet, which has become a primary source of information in several domains. Image blur, caused by object movement or camera shake, is one of the most common and difficult challenges in image processing. De-blurring is the process of restoring the sharp original image; many techniques have been proposed and a large number of research papers published on removing blur from images. This paper presents a review of recent papers on de-blurring published in 2017-2020, focusing on various strategies for enhancing software approaches to image de-blurring.
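As a concrete non-blind baseline for the de-blurring problem, Wiener deconvolution inverts a known blur kernel in the frequency domain. A 1-D sketch with an assumed box kernel (blind methods, the survey's focus, must additionally estimate the kernel itself):

```python
import numpy as np

def wiener_deblur(blurred, kernel, noise_power=1e-3):
    """Non-blind Wiener deconvolution sketch: divide out the kernel's
    spectrum, regularised by an assumed noise-power term so that
    near-zero frequencies of the kernel do not blow up."""
    H = np.fft.fft(kernel, n=len(blurred))
    G = np.fft.fft(blurred)
    W = np.conj(H) / (np.abs(H) ** 2 + noise_power)  # Wiener filter
    return np.real(np.fft.ifft(W * G))
```

The regularisation term is the key design choice: with `noise_power=0` this reduces to naive inverse filtering, which amplifies noise at frequencies the kernel suppresses.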

Publication Date
Tue Mar 01 2011
Journal Name
Journal Of Economics And Administrative Sciences
Laplace Distribution And Probabilistic (bi) In Linear Programming Model

The theory of probabilistic programming may be conceived in several different ways. As a method of programming, it analyses the implications of probabilistic variations in the parameter space of a linear or nonlinear programming model. The generating mechanism of such probabilistic variations in economic models may be incomplete information about changes in demand, production and technology; specification errors in the econometric relations presumed for different economic agents; uncertainty of various sorts; or the consequences of imperfect aggregation or disaggregation of economic variables. In this research we discuss the probabilistic programming problem when the coefficient bi is a random variable.
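Given the Laplace assumption in the title, a chance constraint P(a·x ≤ bi) ≥ α with bi ~ Laplace(μ, s) has a simple deterministic equivalent: replace the random bi by its (1−α) quantile. A hedged sketch of that conversion (parameter values are illustrative only):

```python
import math

def laplace_quantile(p, mu, s):
    """Inverse CDF of the Laplace(mu, s) distribution."""
    if p < 0.5:
        return mu + s * math.log(2.0 * p)
    return mu - s * math.log(2.0 * (1.0 - p))

def deterministic_rhs(mu_b, s_b, alpha=0.9):
    """Deterministic equivalent of P(a.x <= b_i) >= alpha when
    b_i ~ Laplace(mu_b, s_b): use the (1 - alpha) quantile of b_i
    as the new right-hand side (illustrative sketch)."""
    return laplace_quantile(1.0 - alpha, mu_b, s_b)
```

The resulting problem is an ordinary LP with tightened right-hand sides; raising α tightens the constraint further, trading objective value for reliability.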

Publication Date
Tue Jun 01 2021
Journal Name
Baghdad Science Journal
Comparing Weibull Stress – Strength Reliability Bayesian Estimators for Singly Type II Censored Data under Different loss Functions

The stress (Y) – strength (X) reliability model, which describes the life of a component with strength X subjected to stress Y (the component fails if and only if, at any time, the applied stress exceeds its strength), has been studied; the reliability R = P(Y < X) can then be considered a measure of the component's performance. In this paper, a Bayesian analysis is carried out for R when X and Y are independent Weibull random variables with common shape parameter α, in order to study the effect of the two different scale parameters β and λ, respectively, using three loss functions [weighted, quadratic and entropy] under two prior functions [Gamma and extension of Jeffreys].
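For the common-shape Weibull setting the abstract describes, R = P(Y < X) has the closed form β^α / (β^α + λ^α), which a quick Monte Carlo simulation confirms. An illustrative sanity check only, not the paper's Bayesian estimators:

```python
import numpy as np

def reliability_mc(alpha, beta, lam, n=200_000, seed=0):
    """Monte Carlo estimate of R = P(Y < X) for independent
    strength X ~ Weibull(alpha, beta) and stress Y ~ Weibull(alpha, lam)
    with common shape alpha (illustrative check)."""
    rng = np.random.default_rng(seed)
    x = beta * rng.weibull(alpha, n)   # numpy's weibull draws at unit scale
    y = lam * rng.weibull(alpha, n)
    return np.mean(y < x)

def reliability_exact(alpha, beta, lam):
    """Closed form of R when the two shape parameters are equal."""
    return beta**alpha / (beta**alpha + lam**alpha)
```

Note that R depends on β and λ only through the ratio (λ/β)^α, which is why the two scale parameters are the natural focus of the sensitivity study.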

Publication Date
Sun Dec 01 2019
Journal Name
Journal Of Economics And Administrative Sciences
Use Generalized Pareto Survival Models to Estimation Optimal Survival Time for Myocardial Infarction Patients

Survival analysis is one of the modern methods of analysis based on the fact that the dependent variable represents the time until the event of interest in the study. Many survival models deal with the impact of explanatory factors on the likelihood of survival, including the model proposed by the statistician David Cox, one of the most important and widely used survival models. It consists of two parts: a parametric function that does not depend on survival time, and a nonparametric function that depends on the survival times, which is why the Cox model is described as semiparametric. Fully parametric models, by contrast, depend on the parameters of the time-to-event distribution, such as
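The semiparametric character of Cox's model shows up directly in its partial likelihood, where the unspecified baseline hazard h0(t) cancels out of every term. A tiny single-covariate sketch (illustrative data and names; tied event times are assumed absent):

```python
import numpy as np

def cox_partial_loglik(beta, times, events, x):
    """Partial log-likelihood of Cox's model h(t|x) = h0(t) * exp(beta*x).
    Each observed event contributes its own relative risk against the
    risk set of subjects still under observation; h0(t) drops out."""
    order = np.argsort(times)
    times, events, x = times[order], events[order], x[order]
    ll = 0.0
    for i in range(len(times)):
        if events[i]:                    # event observed (not censored)
            risk = np.exp(beta * x[i:])  # risk set at this event time
            ll += beta * x[i] - np.log(risk.sum())
    return ll
```

Maximising this function over beta estimates the covariate effect without ever specifying the time-to-event distribution, which is exactly the division of labour the abstract describes.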

Publication Date
Wed Aug 01 2012
Journal Name
International Journal Of Geographical Information Science
Assessing similarity matching for possible integration of feature classifications of geospatial data from official and informal sources

Publication Date
Sun Aug 06 2017
Journal Name
Ibn Al-haitham Journal For Pure And Applied Sciences
Determination of pKa and Thermodynamic Data of Some Schiff Bases Derived From 4,6-Dimethyl 2-Amino Pyrimidine

Acid dissociation constants of some Schiff bases derived from 4,6-dimethyl-2-aminopyrimidine of type (1), in a 50% V/V dioxane-water mixture in 0.003 M KCl, were determined potentiometrically at three different temperatures. The thermodynamic quantities were calculated, and a good linear correlation was obtained between pKa and the IR O–H stretching frequencies.
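The potentiometric route rests on the Henderson-Hasselbalch relation pH = pKa + log([A−]/[HA]); at half-neutralization the ratio is 1 and pKa equals the measured pH. A minimal generic sketch (activity corrections and the mixed dioxane-water medium used in the paper are ignored here):

```python
import math

def pka_from_titration_point(pH, conc_base, conc_acid):
    """Henderson-Hasselbalch: pKa = pH - log10([A-]/[HA]).
    Generic textbook relation; not the paper's calibration procedure."""
    return pH - math.log10(conc_base / conc_acid)
```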

Publication Date
Tue Dec 01 2020
Journal Name
Journal Of Economics And Administrative Sciences
Use The moment method to Estimate the Reliability Function Of The Data Of Truncated Skew Normal Distribution

The estimation of the reliability function depends on the accuracy of the data used to estimate the parameters of the probability distribution. Because some data sets are skewed, estimating the parameters and calculating the reliability function in the presence of skewness requires a distribution flexible enough to handle such data. In the data of Diyala Company for Electrical Industries, a positive skew was observed in the data collected from the Power and Machinery Department, which called for a distribution that accommodates this feature and for estimation methods that lead to accurate estimates of the reliability function,
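The moment method equates sample moments to their theoretical counterparts and solves for the parameters. For the ordinary (untruncated) skew-normal distribution the inversion is closed-form; a hedged sketch of that inversion (the paper's truncated variant has different moment equations, so this is illustrative only):

```python
import math

def skew_normal_mom(mean, var, skew):
    """Method-of-moments estimates (xi, omega, alpha) for a skew-normal
    distribution, from its mean, variance and skewness coefficient.
    Requires |skew| below ~0.995, the skew-normal's attainable maximum."""
    c = (4.0 - math.pi) / 2.0
    b = math.sqrt(2.0 / math.pi)
    u = (abs(skew) / c) ** (2.0 / 3.0)     # invert skew = c*t^3/(1-t^2)^1.5
    t = math.copysign(math.sqrt(u / (1.0 + u)), skew)  # t = b * delta
    delta = t / b
    omega = math.sqrt(var / (1.0 - t * t)) # var = omega^2 * (1 - t^2)
    xi = mean - omega * t                  # mean = xi + omega * t
    alpha = delta / math.sqrt(1.0 - delta * delta)
    return xi, omega, alpha
```

A positive sample skewness, as reported for the Power and Machinery Department data, maps to a positive shape parameter alpha.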

Publication Date
Tue Oct 01 2019
Journal Name
Journal Of Engineering
Effect of laser process an inclined surface cutting of mild steel then analysis data statistically by RSM

Regression analysis is used to study and predict the surface response, using design of experiments (DOE) together with roughness calculation, through the development of a mathematical model. In this study, response surface methodology (RSM) and the particular solution technique are used. Design of experiments applies a structured statistical approach to investigate the relationship between a set of parameters and their responses. Surface roughness is one of the important parameters and plays a central role; it is also found that the cutting speed has only a small effect on surface roughness. This work focuses on all considerations needed to model the interaction between the parameters (position of influence
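A response-surface model of the kind RSM fits is typically a second-order polynomial in the process factors, estimated by least squares from the DOE runs. A generic two-factor sketch (the factor and response names are placeholders, not the paper's laser-cutting variables):

```python
import numpy as np

def fit_rsm(x1, x2, y):
    """Least-squares fit of a second-order response-surface model
    y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2.
    Returns the six coefficients in that order."""
    X = np.column_stack([np.ones_like(x1), x1, x2,
                         x1**2, x2**2, x1 * x2])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef
```

The quadratic and interaction terms are what let RSM capture curvature and factor interactions that a plain linear regression would miss.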

Publication Date
Thu Nov 02 2017
Journal Name
Ibn Al-haitham Journal For Pure And Applied Sciences
A Study of Adsorption of Zr (IV) on Manganese Dioxide

The adsorption of zirconium on manganese dioxide from nitric acid solutions has been studied as a function of shaking time, electrolyte concentration, adsorbate concentration and temperature (25-90°C). Four hours of shaking was appropriate to ensure that the adsorption plateau was reached, and the adsorption of zirconium decreases with increasing nitric acid concentration. The limiting adsorption capacity at 3 molar nitric acid was 0.2 Zr per mole of MnO2. Working at elevated temperature was in favour
