This paper studies two stratified quantile regression models, of the marginal and the conditional varieties. We estimate the quantile functions of these models using two nonparametric methods: smoothing splines (B-splines) and kernel regression (Nadaraya-Watson). The estimates are obtained by solving the nonparametric quantile regression problem, that is, by minimizing the quantile regression objective function within the framework of varying coefficient models. The main goal is to compare the estimators produced by the two nonparametric methods and to identify the better of the two.
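As a brief illustration of the kernel (Nadaraya-Watson type) approach, the sketch below minimizes the kernel-weighted check-function objective at a point to obtain a local conditional quantile. The data, bandwidth, kernel, and function names are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def check_loss(u, tau):
    """Quantile regression check (pinball) loss rho_tau(u)."""
    return u * (tau - (u < 0))

def kernel_quantile(x0, x, y, tau=0.5, h=0.5):
    """Local-constant (Nadaraya-Watson style) conditional quantile at x0:
    minimize the kernel-weighted check-function objective over a scalar q."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)            # Gaussian kernel weights
    obj = lambda q: np.sum(w * check_loss(y - q, tau))
    res = minimize_scalar(obj, bounds=(y.min(), y.max()), method="bounded")
    return res.x

# Illustrative data with heteroscedastic noise
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 10.0, 300)
y = np.sin(x) + (0.2 + 0.1 * x) * rng.standard_normal(300)

grid = np.linspace(0.0, 10.0, 50)
q75 = [kernel_quantile(x0, x, y, tau=0.75, h=0.6) for x0 in grid]
```

A B-spline version would replace the local weighting with a spline basis expansion of the quantile function and minimize the same check-function objective over the basis coefficients.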
Background: Errors in horizontal condylar inclinations and Bennett angles largely affect the articulation of teeth and the pathways of cusps. The aim of this study was to estimate and compare the horizontal condylar (protrusive) angles and Bennett angles of full mouth rehabilitation patients using two different articulator systems. Materials and Methods: Protrusive angles and Bennett angles of 50 adult male and female Iraqi TMD-free full mouth rehabilitation patients were estimated using two different articulator systems. Arbitrary hinge axis location, followed by estimation of protrusive angles and Bennett angles, was done with a semi-adjustable articulator system. A fully adjustable articulator system was utilized to locate th
The logistic regression model is one of the most important non-linear regression models; it aims at obtaining highly efficient estimators and takes a more advanced role in statistical analysis because it is an appropriate model for binary data.
Among the problems that appear as a result of the use of some statistical methods I
This study looks into the many methods used in the risk assessment procedures employed in the construction industry today. As a result of the slow adoption of novel assessment methods, professionals frequently resort to strategies that have previously been validated as successful. When it comes to risk assessment, having a precise analytical tool that uses the cost of risk as a measurement and draws on the knowledge of professionals could help bridge the gap between theory and practice. This step examines the relevant literature, sorts articles by year of publication, and identifies domains and attributes. Consequently, the most significant findings have been presented in a manne
Robust statistics is known as resistance to the errors caused by deviations from the stability assumptions of statistical procedures (reasonable, approximately met, asymptotically unbiased, reasonably small bias, efficient) in data drawn from a wide range of probability distributions, whether they follow a normal distribution or a mixture of other distributions with different standard deviations.
The power spectrum function plays a leading role in the analysis of stationary random processes, which are stable random variables ordered in time and may be either discrete or continuous. It can be described by measuring the total power as a function of frequency.
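As a minimal sketch of this idea (not the paper's own computation), the snippet below simulates a stationary AR(1) process and estimates its power spectral density with Welch's method; integrating the estimated spectrum over frequency approximately recovers the total power (variance).

```python
import numpy as np
from scipy import signal

# Simulate a stationary AR(1) process: x_t = 0.8 * x_{t-1} + e_t
rng = np.random.default_rng(1)
n, phi = 4096, 0.8
x = np.zeros(n)
e = rng.standard_normal(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + e[t]

# Welch estimate of the power spectral density S(f); integrating S(f)
# over frequency approximately recovers the total power (variance).
f, Pxx = signal.welch(x, fs=1.0, nperseg=512)
total_power = np.trapz(Pxx, f)
print(f"sample variance: {x.var():.3f}  integrated spectrum: {total_power:.3f}")
```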
Research topic: (The Epistemological Foundations for Comparison of Religions by al-Amiri)
The research studied the topic with a descriptive methodology, by investigating the components of al-Amiri's approach to interfaith comparison, and with an analytical one, by presenting an applied conception of an objective model for the comparison of religions, in order to answer two questions: What are al-Amiri's epistemological foundations? And what is his approach to establishing an objective comparison between religions?
The research began by introducing Abu al-Hassan al-Amiri, and then presented four topics: an introduction to al-Amiri's efforts in interfaith comparison, his epistemological foundations, an applied model
This study aimed to use response surface methodology (RSM) to evaluate the effects of various experimental conditions on the removal of levofloxacin (LVX) from aqueous solution by the electrocoagulation (EC) technique with stainless steel electrodes. The EC process was achieved successfully, with an LVX removal efficiency of 90%. The regression analysis showed that the experimental data are better fitted by a second-order polynomial model, with a predicted correlation coefficient (pred. R2) of 0.723, an adjusted correlation coefficient (adj. R2) of 0.907, and a correlation coefficient (R2) of 0.952. This shows that the predicted model and the experimental values are in good agreement.
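A hedged sketch of the kind of second-order (full quadratic) response surface fit described here is shown below using statsmodels; the coded factors and the simulated removal-efficiency data are purely hypothetical stand-ins for the paper's EC experiments.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical coded factors x1, x2 (stand-ins for the EC operating
# conditions) and a simulated response y = LVX removal efficiency (%).
rng = np.random.default_rng(2)
df = pd.DataFrame({"x1": rng.uniform(-1, 1, 30), "x2": rng.uniform(-1, 1, 30)})
df["y"] = (80 + 5 * df.x1 + 3 * df.x2
           - 4 * df.x1**2 - 2 * df.x2**2
           + 1.5 * df.x1 * df.x2
           + rng.normal(0, 1.5, 30))

# Full second-order (quadratic) response surface model
model = smf.ols("y ~ x1 + x2 + I(x1**2) + I(x2**2) + x1:x2", data=df).fit()
print(model.rsquared, model.rsquared_adj)   # analogues of R2 and adj. R2
```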
This paper presents a study of gravitational lens time delays for a general family of lensing potentials, including the popular singular isothermal elliptical potential (SIEP) and the singular isothermal elliptical density distribution (SIED), while allowing general angular structure. The first section introduces the selected observations of gravitationally lensed systems. Section two shows that the time delays for the singular isothermal elliptical potential (SIEP) and singular isothermal elliptical density distribution (SIED) have a remarkably simple and elegant form, and that the result for Hubble constant estimation actually holds for a general family of potentials when the analytic results are combined with data for the time delays
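For reference, the "remarkably simple form" usually quoted for isothermal lenses expresses the delay between two images purely through their angular offsets from the lens centre. Assuming that standard form (the symbols below are the conventional ones, not taken verbatim from this paper):

\Delta t_{AB} \;=\; \frac{1+z_l}{2c}\,\frac{D_l D_s}{D_{ls}}\,\bigl(r_B^{2}-r_A^{2}\bigr)

where z_l is the lens redshift, D_l, D_s, and D_ls are angular-diameter distances (through which H_0 enters), and r_A, r_B are the angular offsets of the images from the lens centre.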
In this research, covariance estimates were used to estimate the population mean in stratified random sampling with combined regression estimators. Combined regression estimators employing robust variance-covariance matrix estimates were compared with those employing the traditional variance-covariance matrix estimates when estimating the regression parameter, using two criteria: relative efficiency (RE) and mean squared error (MSE). We found that the robust estimates significantly improved the quality of the combined regression estimates by reducing the effect of outliers, through the use of robust covariance matrix estimates (MCD, MVE) when estimating the regression parameter. In addition, the results of the simulation study proved
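A minimal sketch of the underlying idea, replacing the classical variance-covariance matrix with a robust MCD estimate when forming the regression coefficient, is shown below (scikit-learn provides MCD as MinCovDet; MVE is not shown). The data and contamination are hypothetical, not the paper's simulation design.

```python
import numpy as np
from sklearn.covariance import MinCovDet

# Hypothetical auxiliary variable x and study variable y, with a few
# outliers that distort the classical covariance-based slope.
rng = np.random.default_rng(3)
x = rng.normal(50, 10, 200)
y = 2.0 * x + rng.normal(0, 5, 200)
y[:5] += 150

xy = np.column_stack([x, y])

# Classical versus MCD (robust) variance-covariance estimates of (x, y)
S_classical = np.cov(xy, rowvar=False)
S_mcd = MinCovDet(random_state=0).fit(xy).covariance_

# Regression coefficient b = cov(x, y) / var(x), the quantity used in the
# combined regression estimator of the population mean.
b_classical = S_classical[0, 1] / S_classical[0, 0]
b_robust = S_mcd[0, 1] / S_mcd[0, 0]
print(b_classical, b_robust)   # the robust slope stays close to 2.0
```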
This study aims to conduct an exhaustive comparison between the performance of human translators and artificial intelligence-powered machine translation systems, specifically examining the top three systems: Spider-AI, Metacate, and DeepL. A variety of texts from distinct categories were evaluated to gain a profound understanding of the qualitative differences, as well as the strengths and weaknesses, between human and machine translations. The results demonstrated that human translation significantly outperforms machine translation, with larger gaps in literary texts and texts characterized by high linguistic complexity. However, the performance of machine translation systems, particularly DeepL, has improved and in some contexts
This research aims at a number of objectives, including developing the tax examination process and raising its efficiency without relying on the comprehensive examination method, by using some statistical methods in tax examination, as well as discussing the most important concepts related to the statistical methods used in tax examination and showing their importance and how they are applied. The research is an applied study in the General Commission of Taxes. In order to achieve its objectives, the research used the descriptive (analytical) approach on the theoretical side, and on the practical side some statistical methods applied to a sample of the final accounts of a contracting company (limited) and the pharmaceutical industry (