In recent years, researchers have shown growing interest in determining the optimal sample size needed to obtain sufficiently accurate, high-precision parameter estimates when evaluating a large number of diagnostic tests simultaneously. In this research, two methods were used to determine the optimal sample size for estimating the parameters of high-dimensional data: the Bennett inequality method and the regression method. A nonlinear logistic regression model was estimated, at the sample size given by each method, using an artificial neural network (ANN), which yields high-precision estimates appropriate to the data type and the type of medical study. The probability values obtained from the neural network were then used to calculate the net reclassification index (NRI). A program was written for this purpose in the statistical programming language R, and the mean maximum absolute error (MME) criterion of the NRI was used to compare the sample size determination methods across different numbers of assumed parameters and values of a specified error margin (ε). Based on these comparison criteria, the main conclusion was that the Bennett inequality method is the best at determining the optimal sample size for each number of assumed parameters and each error margin value.
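The abstract does not state the exact form of the Bennett bound used, but the general idea can be sketched. Assuming i.i.d. observations bounded by `b` with variance `sigma2`, Bennett's inequality bounds the tail of the sample mean, and solving the bound for `n` gives a minimal sample size for a margin `eps` at confidence level `1 - alpha`. The function name and parameters below are illustrative, not taken from the paper:

```python
import math

def bennett_sample_size(eps, alpha, b, sigma2):
    """Smallest n for which Bennett's inequality guarantees
    P(|sample mean - true mean| >= eps) <= alpha, for i.i.d.
    variables bounded by b with variance sigma2.

    Bennett's tail bound for the mean:
        P(|Xbar - mu| >= eps) <= 2 * exp(-(n * sigma2 / b**2) * h(b*eps/sigma2))
    with h(u) = (1 + u) * log(1 + u) - u.  Setting the bound equal
    to alpha and solving for n gives the expression below.
    """
    u = b * eps / sigma2
    h = (1.0 + u) * math.log(1.0 + u) - u
    return math.ceil(b**2 * math.log(2.0 / alpha) / (sigma2 * h))

# Example: margin eps = 0.05 at 95% confidence, variables in [0, 1]
# (so b = 1) with variance at most 0.25.
n = bennett_sample_size(eps=0.05, alpha=0.05, b=1.0, sigma2=0.25)
```

Because `h(u) > u**2 / 2` for small `u`, this bound is never looser than the corresponding Bernstein-type bound, which is one reason Bennett-based sample sizes can be smaller for the same margin.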
Motivated by the vital role played by transition metal nitride (TMN) composites in various industrial applications, the current study reports the electronic properties, thermodynamic stability phase diagrams, and vacancy formation energies of the plausible surfaces of the NiAs- and WC-type structures of the δ3-MoN and δ-WN hexagonal phases, respectively. Low-Miller-index surface terminations of δ3-MoN and δ-WN, namely (100), (110), (111), and (001), have been considered. Initial cleaving of the δ3-MoN bulk unit cell offers separate Mo and N terminations, designated δ3-MoN(100):Mo, δ3-MoN(100):N, δ3-MoN(111):Mo, δ3-MoN(111):N, and δ3-MoN(001):Mo. The (110) plane, however, is mixed-terminated, exposing both molybdenum and nitrogen atoms
The research aimed to examine the nature of the correlation and influence between competitive pressures (the independent variable) and strategic renewal (the dependent variable). The research was applied in deluxe-class hotels in Baghdad. The sample consisted of (99) managers working in (6) deluxe-class hotels in Baghdad, namely (Al-Rasheed Hotel, Ishtar Hotel, Le Méridien Hotel, Al-Mansour Hotel, Babylon Hotel, and Baghdad Hotel). Statistical analysis was performed using the AMOS statistical package, and the results revealed the existence of a competitive
Organizations adopt a number of procedures and instructions in their fields of activity in order to develop their resources and direct their energies toward serving their entrepreneurial orientations. This calls for preparing a range of mechanisms to mitigate the strictness and complexity of those procedures. Ambiguity and severe procedural complexity amount to an acknowledged loss of energy; this impedes hopes, weakens enthusiasm within these organizations, and obstructs the possibility of achieving continuous innovation, causing opportunities to be lost to the point of surrendering to risks and treating them as insurmountable obstacles.
Entropy, defined as a measure of uncertainty, has been transformed by using the cumulative distribution function and the reliability function of the Burr Type-XII distribution. For data that suffer from volatility, a probability distribution model is built on each failure in a sample once the conditions of a probability distribution are satisfied. The formula of the new transformed entropy distribution applied to the continuous Burr Type-XII distribution has been derived, and the new function was tested and found to satisfy the conditions of a probability function. The mean and the cumulative distribution function were then derived so that they could be used to generate data for the purpose of implementing the simulation.
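The data generation step mentioned above can be illustrated with a minimal sketch. For the standard Burr Type-XII distribution, F(x) = 1 − (1 + x^c)^(−k) and the reliability function is R(x) = (1 + x^c)^(−k); inverting F gives an inverse-transform sampler. The function names and the parameterization (shape parameters `c` and `k`) are this sketch's assumptions, not necessarily the paper's notation:

```python
import random

def burr12_cdf(x, c, k):
    """CDF of the Burr Type-XII distribution: F(x) = 1 - (1 + x^c)^(-k)."""
    return 1.0 - (1.0 + x**c) ** (-k)

def burr12_rvs(n, c, k, seed=0):
    """Generate n Burr XII variates by inverse-transform sampling:
    solving u = F(x) gives x = ((1 - u)^(-1/k) - 1)^(1/c)
    for u ~ Uniform(0, 1)."""
    rng = random.Random(seed)
    return [((1.0 - rng.random()) ** (-1.0 / k) - 1.0) ** (1.0 / c)
            for _ in range(n)]

# Generate a sample with illustrative shape parameters c = 2, k = 3;
# by construction the empirical CDF should track burr12_cdf.
sample = burr12_rvs(10_000, c=2.0, k=3.0)
```

Inverse-transform sampling needs only the closed-form CDF, which is one reason the Burr XII family is convenient for simulation studies of this kind.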
This piece of research deals with assimilation as one of the phonological processes in language. It attempts to give more attention to this important process in English, with a detailed explanation of its counterpart in Arabic. In addition, the study sheds light on the points of similarity and difference concerning this process in the two languages. Assimilation in English means that two sounds are involved and one becomes more like the other.
The assimilating phoneme picks up one or more of the features of another nearby phoneme. The English phoneme /n/ has t
The present study attempts to give a detailed discussion and analysis of parenthetical constructions in English and Arabic, the aim being to pinpoint the points of similarity and difference between the two languages in this particular linguistic area. The study claims that various types of constructions in English and Arabic can be considered parenthetical; these include non-restrictive relative clauses, non-restrictive appositives, comment clauses, vocatives, and interjections, among others. These are identified, classified, and analyzed according to the Quirk grammar, the approach to grammatical description pioneered by Randolph Quirk and his associates and published in a series of reference grammars during the 1970s
The article aims to study the liquidity that a bank must provide at an optimal level and the profitability it must achieve, and the effect of both on the value of the bank. Hence the research problem, which concerns the extent to which liquidity and profitability affect the value of the bank. The importance of the research stems from the main role that commercial banks play in a country's economy. This requires identifying liquidity in a broad sense, along with its most important components, and how to
Background and Aim. Coronary artery disease (CAD) is a major risk factor for progression to heart failure (HF), which is associated with an increase in left ventricular volume (LVV). This study aims to measure ventricular function and myocardial perfusion imaging markers of the left side of the heart, acquired after injection of a 99mTc radiotracer at stress and at rest using single-photon emission computed tomography (SPECT). Subjects and methods. The study included 121 patients with CAD (53 females and 68 males, aged 25 to 88 years) and 265 healthy subjects (84 males and 181 females). All patients and healthy subjects volunteered to participate in this study. They were classified according to