This study discusses a biased estimator of the negative binomial regression model known as the Liu estimator, which is used to reduce variance and overcome the problem of multicollinearity among explanatory variables. Other estimators are also considered, namely ridge regression (RR) and maximum likelihood (ML). The research aims at a theoretical comparison between the new Liu estimator and the ML and RR estimators using the mean squared error (MSE) criterion, since the variance of the ML estimator inflates in the presence of multicollinearity among the explanatory variables. A Monte Carlo simulation was designed to evaluate the performance of the estimators under the MSE criterion. The simulation results showed that the Liu estimator is superior to the RR and ML estimators when the number of explanatory variables is p=5 and the sample size is n=100; when p=3 for all sample sizes, and when p=5 for all sizes except n=100, the RR method is the best.
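The ML/ridge/Liu comparison by MSE described above can be sketched in a small Monte Carlo experiment. This is a hedged illustration in a linear-model setting, not the paper's negative binomial design: the correlation level, sample size, shrinkage constants k and d, and number of replications are all assumed for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical Monte Carlo setup: collinear predictors via an AR(1)-style
# correlation matrix (rho close to 1 induces multicollinearity).
n, p, rho = 100, 3, 0.95
Sigma = rho ** np.abs(np.subtract.outer(np.arange(p), np.arange(p)))
beta_true = np.ones(p)

def estimates(X, y, k=0.5, d=0.5):
    """Return ML (OLS), ridge, and Liu estimates for one sample."""
    XtX = X.T @ X
    b_ml = np.linalg.solve(XtX, X.T @ y)                   # maximum likelihood
    b_rr = np.linalg.solve(XtX + k * np.eye(p), X.T @ y)   # ridge regression
    # Liu (1993): (X'X + I)^{-1} (X'y + d * b_ml)
    b_liu = np.linalg.solve(XtX + np.eye(p), X.T @ y + d * b_ml)
    return b_ml, b_rr, b_liu

reps = 500
mse = np.zeros(3)  # accumulated MSE for ML, ridge, Liu
for _ in range(reps):
    X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)
    y = X @ beta_true + rng.normal(0, 1, n)
    for j, b in enumerate(estimates(X, y)):
        mse[j] += np.sum((b - beta_true) ** 2) / reps

print(dict(zip(["ML", "Ridge", "Liu"], mse.round(4))))
```

Which estimator wins depends on the degree of collinearity and on the chosen k and d, which is exactly why the paper compares them across several (p, n) combinations.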
Cost is the essence of any production process, and controlling it is one of the requirements for the continuity of activities, for increasing the profitability of the economic unit, and for supporting its competitive position in the market. There should therefore be overall control to reduce cost without compromising product quality. To achieve this, management needs detailed, credible, and reliable cost information that can be measured, collected, and understood, in order to analyze the causes of deviations and the obstacles management faces, and to search for the factors that trigger the emergence of these deviations and obstacles.
This research deals with the use of a number of statistical methods, such as the kernel method, watershed, histogram, and cubic spline, to improve the contrast of digital images. The results, evaluated using the RMSE and NCC criteria, show that the spline method is the most accurate compared with the other statistical methods.
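As a minimal sketch of one of the contrast-enhancement methods named above, the following implements histogram equalization for an 8-bit grayscale image in pure NumPy. The synthetic low-contrast image is an assumption for illustration; the kernel, watershed, and spline variants from the abstract are not reproduced here.

```python
import numpy as np

def equalize_histogram(img: np.ndarray) -> np.ndarray:
    """Stretch contrast by mapping gray levels through the normalized CDF."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]  # first nonzero CDF value
    lut = np.clip(
        np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * 255), 0, 255
    ).astype(np.uint8)
    return lut[img]

# Hypothetical low-contrast image: gray values squeezed into [100, 150]
rng = np.random.default_rng(2)
img = rng.integers(100, 151, size=(64, 64), dtype=np.uint8)
out = equalize_histogram(img)
print(img.min(), img.max(), "->", out.min(), out.max())
```

After equalization the gray levels span the full 0–255 range, which is what "improving the contrast" means operationally; RMSE and NCC against a reference image would then score the restoration quality.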
Flow in manifolds is considered an advanced problem in hydraulic engineering applications. The objectives of this study are to determine the uniformity qn/q1 (the ratio of the discharge at the last outlet, qn, to the discharge at the first outlet, q1) and the total head losses of the flow along straight and rectangular-loop manifolds under different flow conditions. The straight pipes were 18 m and 19 m long, each with a diameter of 25.4 mm (1.0 in), while the rectangular closed-loop configuration was 19 m long, also with a diameter of 25.4 mm (1.0 in). A constant head of 2.10 m was maintained in the supply tank. It was found that outlet spacing and manifold configuration are the main factors aff…
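The total head loss along such a manifold can be sketched segment by segment with the Darcy–Weisbach equation, since the discharge drops past each outlet. This is a hedged illustration only: the number of outlets, friction factor, inlet discharge, and the uniform-outflow assumption are all hypothetical, not the study's measured values (only D = 25.4 mm and L = 18 m follow the abstract).

```python
import math

g = 9.81        # gravitational acceleration, m/s^2
D = 0.0254      # pipe diameter, m (25.4 mm, from the abstract)
L = 18.0        # manifold length, m (from the abstract)
n_out = 10      # number of outlets (assumed)
f = 0.03        # Darcy friction factor (assumed constant)
Q_in = 0.8e-3   # inlet discharge, m^3/s (assumed)

A = math.pi * D**2 / 4    # pipe cross-sectional area
dx = L / n_out            # segment length between outlets
q = Q_in / n_out          # discharge per outlet (uniform-outflow assumption)

h_total = 0.0
Q = Q_in
for _ in range(n_out):
    V = Q / A
    h_total += f * (dx / D) * V**2 / (2 * g)  # Darcy-Weisbach loss per segment
    Q -= q                                    # flow decreases past each outlet

print(f"total head loss = {h_total:.3f} m")
```

Because the velocity falls along the manifold, the total loss is well below the loss of a plain pipe carrying the full inlet discharge over the same length, which is why outlet spacing matters so much for uniformity.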
Libraries, information centers, and everything related to organizing and preparing information need to be re-evaluated periodically in order to assess their level of quality, which means improving the general state of these institutions to ensure sufficient satisfaction among beneficiaries of the services provided. This is what this research addresses: one of the most important quality standards in libraries and information centers, LibQUAL+®, was applied in one of the most important and oldest central university libraries, namely the Central Library of the University of Baghdad, at its two locations, Al-Jadriya and Al-Waziriya. The sample of beneficiaries to whom the questionnaire was distributed reached 75 beneficiaries distrib…
The concept of intertextuality was one of the problems that occupied the attention of critics in targeting the structure of textual intertextuality between texts and their overlap in the process of producing meaning, until intertextuality became an established term whose workings can be traced in the structure of the theatrical text, with the mechanisms of this intertextuality between texts determined through fields and classifications agreed upon by the most important critics who wrote on and theorized intertextuality. Perhaps our previous research (the approach of exposure in the epistemological hallway to intertextuality) was an attempt to confront a terminology, which the researcher intended to monitor through the mechanisms of inter…
Atenolol was used with ammonium molybdate to prove the efficiency, reliability, and repeatability of the long-distance chasing photometer (NAG-ADF-300-2) using continuous flow injection analysis. The method is based on the reaction between atenolol and ammonium molybdate in an aqueous medium to obtain a dark brown precipitate. Optimum parameters were studied to increase the sensitivity of the developed method. The linear range of the calibration graph was 0.1-3.5 mmol/L for cell A and 0.3-3.5 mmol/L for cell B, with an LOD of 133.1680 ng/100 µL and 532.6720 ng/100 µL for cell A and cell B respectively, a correlation coefficient (r) of 0.9910 for cell A and 0.9901 for cell B, and an RSD% lower than 1% (n=8) for the determination of atenolol.
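The calibration statistics quoted above (linear fit, correlation coefficient r, RSD% of replicates) can be computed as follows. The concentration and response values here are synthetic placeholders, not the paper's measurements; the linear range simply mirrors the 0.1–3.5 mmol/L span mentioned in the abstract.

```python
import numpy as np

# Synthetic calibration data over an assumed linear range (mmol/L)
conc = np.array([0.1, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5])
rng = np.random.default_rng(3)
resp = 120.0 * conc + 5.0 + rng.normal(0, 4.0, conc.size)  # detector response

# Least-squares calibration line and correlation coefficient
slope, intercept = np.polyfit(conc, resp, 1)
r = np.corrcoef(conc, resp)[0, 1]

# RSD% of n = 8 repeated measurements at one concentration (synthetic)
reps = 120.0 * 2.0 + 5.0 + rng.normal(0, 1.5, 8)
rsd = reps.std(ddof=1) / reps.mean() * 100

print(f"slope={slope:.2f} intercept={intercept:.2f} r={r:.4f} RSD%={rsd:.2f}")
```

A correlation coefficient near 1 and an RSD% below 1% are the acceptance criteria the abstract reports for both flow cells.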
Concentrations of 25, 50, and 100 mg of nano-encapsulated linolenic acid and non-encapsulated fatty acid per 1 kg of milk were used in yogurt manufacture. The results showed no significant differences in titratable acidity or pH values among all processed treatments at the beginning of, and during, the storage period. The treatments to which nano-coated omega-3 was added were the least exposed to oxidation compared with the non-encapsulated omega-3. The poly(lactic acid) shield had a significant role in protecting alpha-linolenic acid against lipolysis by forming a protective layer that shields the acid from the activity of lipase enzymes, and the addition of linolenic fatty acid to milk determined the gr…
Twitter data analysis is an emerging field of research that utilizes data collected from Twitter to address many issues such as disaster response, sentiment analysis, and demographic studies. The success of data analysis relies on collecting accurate and representative data about the studied group or phenomenon to get the best results. Various Twitter analysis applications rely on collecting the locations of the users sending the tweets, but this information is not always available. There have been several attempts at estimating location-based aspects of a tweet; however, there is a lack of attempts to investigate data collection methods that are focused on location. In this paper, we investigate the two methods for obtaining location-based data…
The study aims to provide a suggested model for the application of a Virtual Private Network, a tool used to protect data transmitted through a web-based information system. The research used a case study methodology to collect data about the research area (Al-Rasheed Bank), using Visio to design and draw the diagrams of the suggested models and relying on data collected through interviews with the bank's employees; this modeling of the data was then used to find solutions to the research problem.
The importance of the study lies in dealing with one of the vital topics of the moment, namely, how to make the information transmitted via…
Abstract:
This research aims to identify the impact of the layout of Ghazi Al-Hariri Hospital for Surgical Specialties on customer (patient) satisfaction using the Servicescape model. The research problem concerns the extent to which the hospital management designs the service and the hospital layout, in its aesthetic and functional aspects, to suit patients receiving therapeutic and nursing services. The scale developed by (Miles et al., 2012) was used for data collection; it includes the independent variable in (17) items distributed over three dimensions (facility aesthetics, hospital cleanliness, and layout accessibility). The dependent variable is the satisfaction of customers (patients).