This article explores the estimation of a semiparametric regression function. We propose a new estimator alongside the other combined estimators and compare them using simulation. The simulation results show that the suggested estimator is best for the first and second models, whereas for the third model the Burman and Chaudhuri (B&C) estimator is best.
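The abstract does not specify the estimators or models compared, so the sketch below only illustrates the general simulation workflow for such a comparison: data are drawn from a hypothetical partially linear model y = βx + m(z) + ε, β is estimated by a kernel-based two-step (Speckman-type) estimator, and accuracy is summarized by mean squared error over replications. The model, bandwidth, and sample size are all illustrative assumptions, not the paper's settings.

```python
import math
import random

random.seed(0)

def nw_smooth(z, v, h=0.1):
    """Nadaraya-Watson kernel smoother of v on z (Gaussian kernel)."""
    out = []
    for zi in z:
        w = [math.exp(-0.5 * ((zi - zj) / h) ** 2) for zj in z]
        s = sum(w)
        out.append(sum(wj * vj for wj, vj in zip(w, v)) / s)
    return out

def speckman_beta(x, z, y):
    """Two-step estimate of beta in y = beta*x + m(z) + eps:
    partial out the nonparametric part, then run least squares."""
    xt = [a - b for a, b in zip(x, nw_smooth(z, x))]
    yt = [a - b for a, b in zip(y, nw_smooth(z, y))]
    return sum(a * b for a, b in zip(xt, yt)) / sum(a * a for a in xt)

beta_true, n, reps = 2.0, 200, 20
sq_errs = []
for _ in range(reps):
    z = [random.random() for _ in range(n)]
    x = [random.gauss(0, 1) for _ in range(n)]
    y = [beta_true * xi + math.sin(2 * math.pi * zi) + random.gauss(0, 0.5)
         for xi, zi in zip(x, z)]
    sq_errs.append((speckman_beta(x, z, y) - beta_true) ** 2)

mse = sum(sq_errs) / reps  # Monte Carlo MSE of the estimator
```

Repeating this for each candidate estimator and ranking by MSE is the usual way such simulation comparisons are scored.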
NiO nanoparticles were synthesized by a chemical method and characterized by XRD, giving a crystallite size of 11.72 nm, and by FESEM imaging, giving a grain size of 13 nm; micro-scale NiO was also used. To evaluate the possibility of producing photodegradable polymers, the two NiO powders were employed as additives, and the practical application of solid-phase photocatalytic degradation of polyvinyl chloride (PVC-NiO composite films) was investigated. PVC has a negative environmental impact since the polymer degrades slowly, yet it has a wide range of industrial applications and the amount used shows no evidence of diminishing. Thus, the synthesis of modified PVC-NiO micro and nano composites was studied with irradiation times of 0, 50, 100, 150, 200, 250, and 300 hours.
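An XRD crystallite size like the 11.72 nm quoted above is typically obtained from the Scherrer equation, D = Kλ/(β cos θ). The sketch below applies it with Cu Kα radiation and a hypothetical peak position and FWHM (roughly the NiO (200) reflection); these input values are illustrative, not the paper's measured data.

```python
import math

def scherrer_size(two_theta_deg, fwhm_deg, wavelength_nm=0.15406, k=0.9):
    """Crystallite size D = K * lambda / (beta * cos(theta)).
    beta is the peak FWHM converted to radians; theta is half of 2-theta."""
    theta = math.radians(two_theta_deg / 2)
    beta = math.radians(fwhm_deg)
    return k * wavelength_nm / (beta * math.cos(theta))

# Hypothetical NiO (200) reflection: 2-theta ~ 43.3 deg, FWHM ~ 0.7 deg
d_nm = scherrer_size(43.3, 0.7)  # crystallite size in nm
```

With these assumed inputs the result lands in the ~12 nm range, the same order as the reported value.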
The current work is characterized by the simplicity, accuracy and high sensitivity of dispersive liquid-liquid microextraction (DLLME). The method was developed to determine Telmesartan (TEL) and Irbesartan (IRB) in standard and pharmaceutical formulations. Telmesartan and Irbesartan are separated prior to treatment with Eriochrome Black T as a reagent, forming an ion-pair reaction dye. The analytical results of the DLLME method gave a linearity range of 0.2-6.0 mg/L for both drugs, molar absorptivities of 1.67 × 10⁵ - 5.6 × 10⁵ L/mol·cm, limits of detection of 0.0242 and 0.0238, limits of quantification of 0.0821 and 0.0711, and the distribution coefficients were
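Figures of merit like the LOD and LOQ above follow from a linear calibration curve: with slope m and residual standard deviation s, LOD = 3.3·s/m and LOQ = 10·s/m. The sketch below shows the computation on invented absorbance readings inside the stated 0.2-6.0 mg/L range; the paper's actual calibration data are not reproduced here.

```python
import math

# Hypothetical calibration points within the stated 0.2-6.0 mg/L range
conc = [0.2, 1.0, 2.0, 3.0, 4.0, 6.0]               # concentration, mg/L
absb = [0.021, 0.101, 0.198, 0.305, 0.402, 0.598]   # absorbance (invented)

n = len(conc)
xbar = sum(conc) / n
ybar = sum(absb) / n
slope = (sum((x - xbar) * (y - ybar) for x, y in zip(conc, absb))
         / sum((x - xbar) ** 2 for x in conc))
intercept = ybar - slope * xbar

resid = [y - (intercept + slope * x) for x, y in zip(conc, absb)]
s = math.sqrt(sum(r * r for r in resid) / (n - 2))  # residual std deviation

lod = 3.3 * s / slope   # limit of detection, mg/L
loq = 10.0 * s / slope  # limit of quantification, mg/L
```

The 3.3 and 10 factors are the conventional ICH-style multipliers; LOQ is always about three times the LOD under this definition.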
This research deals with a shrinkage method for principal components similar to the one used in multiple regression, the "Least Absolute Shrinkage and Selection Operator" (LASSO). The goal is to build uncorrelated linear combinations from only a subset of the explanatory variables, which may suffer from a multicollinearity problem, instead of taking the whole number, say (K), of them. The shrinkage forces some coefficients to equal zero after restricting them by a "tuning parameter", say (t), which on one side balances the amounts of bias and variance and on the other does not exceed the acceptable percent of explained variance of these components. This is shown by the MSE criterion in the regression case and the percent explained variance
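The shrinkage described above acts through the LASSO soft-thresholding operator: each coefficient is pulled toward zero by the tuning parameter t, and any coefficient whose magnitude falls below t is set exactly to zero. A minimal sketch, with hypothetical principal-component coefficients:

```python
def soft_threshold(b, t):
    """LASSO shrinkage operator: sign(b) * max(|b| - t, 0)."""
    if b > t:
        return b - t
    if b < -t:
        return b + t
    return 0.0

# Hypothetical principal-component regression coefficients
coefs = [2.1, -0.4, 0.15, -0.05, 0.8]
t = 0.3  # tuning parameter balancing bias and variance
shrunk = [soft_threshold(b, t) for b in coefs]
# small coefficients are forced exactly to zero; larger ones are shrunk by t
```

This exact-zeroing behaviour is what makes the LASSO-style penalty a selection device as well as a shrinkage device.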
Background: Although asthma is a long-term disease that can be treated, many people are unable to control their symptoms due to a lack of knowledge about their condition. The purpose of this study was therefore to find out whether a pharmacist intervention improved asthma management.
Objective: This study was designed to assess the effect of pharmaceutical care on pulmonary function tests.
Method: The study was completed in three months. The enrolled patients were divided into two groups: Group 1 consists of 23 asthma patients who were randomly assigned to receive conventional therapy for chronic bronchial asthma based on disease stage and severity
Regression analysis is a cornerstone of statistics, mostly relying on the ordinary least squares method. As is well known, this method requires several conditions to operate accurately, and when they fail the results can be unreliable; the absence of certain conditions can even make it impossible to complete the analysis. Among those conditions is the multicollinearity problem, which we detect between the independent variables using the Farrar-Glauber test. In addition, the data must satisfy the linearity requirement, and since this last condition did not hold, we resorted to the
In this paper, a new variable selection method is presented to select the essential variables from large datasets. The new model is a modified version of the Elastic Net model. The modified Elastic Net variable selection model is summarized in an algorithm and applied to the Leukemia dataset, which has 3051 variables (genes) and 72 samples. In practice, working with a dataset of this size is difficult. The modified model is compared to some standard variable selection methods; perfect classification is achieved by applying the modified Elastic Net model because it has the best performance. All the calculations for this paper are in
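The excerpt does not specify the modification, so the sketch below shows only the standard Elastic Net coordinate-descent update, on a tiny synthetic problem where the third feature is pure noise; nothing here should be read as the paper's algorithm. The penalty mixes an L1 part (which zeroes coefficients) with an L2 part (which stabilizes correlated ones).

```python
import random

random.seed(1)

def elastic_net(X, y, lam=0.5, alpha=0.7, iters=200):
    """Coordinate descent for
    (1/2n)||y - Xb||^2 + lam*(alpha*||b||_1 + (1-alpha)/2*||b||_2^2)."""
    n, p = len(X), len(X[0])
    b = [0.0] * p
    for _ in range(iters):
        for j in range(p):
            # correlation of feature j with the partial residual
            rho = sum(X[i][j] * (y[i] - sum(X[i][k] * b[k]
                                            for k in range(p) if k != j))
                      for i in range(n)) / n
            zj = sum(X[i][j] ** 2 for i in range(n)) / n
            thr = lam * alpha
            num = max(abs(rho) - thr, 0.0) * (1.0 if rho >= 0 else -1.0)
            b[j] = num / (zj + lam * (1 - alpha))  # soft-threshold update
    return b

# Synthetic data: y depends on features 0 and 1 only; feature 2 is noise
n = 100
X = [[random.gauss(0, 1) for _ in range(3)] for _ in range(n)]
y = [3 * row[0] - 2 * row[1] + random.gauss(0, 0.5) for row in X]
b = elastic_net(X, y)
```

The informative coefficients survive (shrunk toward zero), while the noise feature's coefficient is driven to or very near zero, which is the selection behaviour the abstract relies on.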
Discriminant analysis is a technique used to distinguish and classify an individual into one of a number of groups based on a linear combination of a set of relevant variables, known as the discriminant function. In this research, discriminant analysis is used to analyze data from a repeated measurements design. We deal with the problem of discrimination and classification in the case of two groups by assuming the compound symmetry covariance structure, under the assumption of normality, for univariate repeated measures data.
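For the two-group case the linear discriminant weights are w = S⁻¹(x̄₁ − x̄₂), and a new observation x is allocated to group 1 when wᵀ(x − (x̄₁ + x̄₂)/2) > 0. A minimal two-variable sketch with invented group means and a compound-symmetry-style pooled covariance (equal variances, common correlation):

```python
# Hypothetical two-group, two-variable setup
mean1 = [2.0, 3.0]
mean2 = [0.0, 1.0]
# Pooled covariance with compound symmetry: sigma^2 = 1, rho = 0.5
S = [[1.0, 0.5],
     [0.5, 1.0]]

# Invert the 2x2 covariance matrix directly
det = S[0][0] * S[1][1] - S[0][1] * S[1][0]
Sinv = [[ S[1][1] / det, -S[0][1] / det],
        [-S[1][0] / det,  S[0][0] / det]]

d = [m1 - m2 for m1, m2 in zip(mean1, mean2)]          # mean difference
w = [sum(Sinv[i][j] * d[j] for j in range(2)) for i in range(2)]
mid = [(m1 + m2) / 2 for m1, m2 in zip(mean1, mean2)]  # midpoint cutoff

def classify(x):
    """Allocate x to group 1 or 2 via Fisher's linear discriminant rule."""
    score = sum(wi * (xi - mi) for wi, xi, mi in zip(w, x, mid))
    return 1 if score > 0 else 2
```

Under compound symmetry the covariance matrix has only two free parameters, which is what makes this structure attractive for repeated measures.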
Error control schemes became a necessity in network-on-chip (NoC) designs to improve reliability, as on-chip interconnect errors increase with the continuous shrinking of geometry. Accordingly, many researchers are trying to present multi-bit error correction coding schemes that provide high error-correction capability with the simplest possible design, to minimize area and power consumption. A recent work, Multi-bit Error Correcting Coding with Reduced Link Bandwidth (MECCRLB), showed a large reduction in area and power consumption compared to a well-known scheme, namely Hamming product code (HPC) with Type-II HARQ. Moreover, the authors showed that the proposed scheme can correct 11 random errors, which is considered a high
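MECCRLB's internals are not given in this excerpt, but the HPC baseline it is compared against builds on Hamming codes. A minimal Hamming(7,4) single-error-correcting encode/decode sketch illustrates the syndrome mechanism that such multi-bit schemes extend:

```python
def hamming74_encode(d):
    """4 data bits -> 7-bit codeword; parity bits at positions 1, 2, 4 (1-indexed)."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4   # covers codeword positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4   # covers codeword positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4   # covers codeword positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """Correct up to one flipped bit, then return the 4 data bits."""
    c = c[:]
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3  # 1-indexed error position, 0 if none
    if syndrome:
        c[syndrome - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]

word = [1, 0, 1, 1]
code = hamming74_encode(word)
code[3] ^= 1                      # inject a single-bit error on the link
recovered = hamming74_decode(code)
```

Product constructions such as HPC arrange these codewords in a grid and decode rows and columns jointly, which is how they reach the multi-bit correction capability the abstract discusses.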
The research studied and analyzed hybrid parallel-series systems of asymmetrical components, applying different simulation experiments to estimate the reliability function of those systems using the maximum likelihood method as well as the standard Bayes method via both symmetrical and asymmetrical loss functions, under a Rayleigh distribution and an informative prior distribution. The simulation experiments included different sample sizes and default parameters, which were then compared with one another by mean squared error. This was followed by applying the standard Bayes method with the entropy loss function, which proved successful throughout the experimental side in finding the reliability function
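For Rayleigh-distributed lifetimes the maximum likelihood estimate of the scale is σ̂² = Σxᵢ²/(2n), and the component reliability function is R(t) = exp(−t²/(2σ²)). The sketch below checks this on simulated data with an assumed true σ; the sample size and parameter values are illustrative, not the paper's experimental design.

```python
import math
import random

random.seed(2)

sigma_true = 2.0
n = 500
# Rayleigh samples via inverse transform: x = sigma * sqrt(-2 ln U), U in (0, 1]
x = [sigma_true * math.sqrt(-2 * math.log(1.0 - random.random()))
     for _ in range(n)]

# Maximum likelihood estimate of the Rayleigh scale parameter
sigma_mle = math.sqrt(sum(xi * xi for xi in x) / (2 * n))

def reliability(t, sigma):
    """R(t) = P(X > t) for a Rayleigh(sigma) lifetime."""
    return math.exp(-t * t / (2 * sigma * sigma))

r_hat = reliability(3.0, sigma_mle)  # estimated reliability at t = 3
```

A simulation study like the one described would repeat this over many samples and compare the averaged squared error of the ML and Bayes estimates of R(t).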