In this paper, the parameters and the reliability function of the transmuted power function (TPF) distribution are estimated using several methods: a newly proposed technique for the White method, together with the percentile, least squares, weighted least squares, and modified moment methods. Simulation was used to generate random data following the TPF distribution in three experiments (E1, E2, E3) over real values of the parameters, with sample sizes (n = 10, 25, 50, 100), N = 1000 replications, and reliability times (0 < t < 0). The estimators were compared using the mean square error (MSE). The results showed the …
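The Monte Carlo comparison described above can be sketched in a few lines. This is a minimal illustration only: it samples from the plain power-function distribution (not the transmuted variant) and uses the maximum-likelihood estimator as a stand-in for the paper's methods; the function names and parameter values are assumptions, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)

def mse_of_estimator(estimator, true_theta, n=50, N=1000):
    """Monte Carlo MSE: draw N samples of size n from a power-function
    distribution with shape true_theta, apply the estimator to each sample,
    and average the squared estimation errors."""
    errors = []
    for _ in range(N):
        # Power-function variate on (0, 1) by inverse-CDF sampling: F(x) = x**theta
        u = rng.uniform(size=n)
        sample = u ** (1.0 / true_theta)
        errors.append((estimator(sample) - true_theta) ** 2)
    return float(np.mean(errors))

# Maximum-likelihood estimator of the power-function shape parameter
mle = lambda x: -len(x) / np.sum(np.log(x))

print(mse_of_estimator(mle, true_theta=2.0))
```

Swapping in a second estimator and comparing the two MSE values at each sample size reproduces the structure of the comparison reported in the abstract.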
This paper presents a new algorithm in an important research field, semantic word similarity estimation. A new feature-based algorithm is proposed for measuring word semantic similarity in the Arabic language, a highly systematic language whose words exhibit elegant and rigorous logic. The semantic similarity score between two Arabic words is calculated as a function of their common and total taxonomical features. An Arabic knowledge source is employed to extract the taxonomical features as the set of all concepts that subsume the concepts containing the compared words. Previously developed Arabic word benchmark datasets are used for optimizing and evaluating the proposed algorithm. In this paper, …
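The common-versus-total feature idea can be sketched as a set ratio over ancestor concepts. The toy taxonomy below is entirely hypothetical (English labels, invented concept sets) and stands in for the Arabic knowledge source the paper actually uses; the exact scoring function of the paper may differ.

```python
def similarity(features_a, features_b):
    """Feature-based score: |common features| / |total features| of two words,
    where each word is represented by the set of concepts subsuming it."""
    common = features_a & features_b
    total = features_a | features_b
    return len(common) / len(total) if total else 0.0

# Hypothetical ancestor-concept sets (illustrative, not from the paper's source)
cat = {"entity", "organism", "animal", "mammal", "feline"}
dog = {"entity", "organism", "animal", "mammal", "canine"}
car = {"entity", "artifact", "vehicle"}

print(similarity(cat, dog))  # shares four of six concepts with dog
print(similarity(cat, car))  # shares only the root concept with car
```

Words that share a deep taxonomical path score higher than words that meet only near the root, which is the intuition behind the feature-based measure.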
In this study, simple, low-cost, precise, and rapid spectrophotometric methods for the determination of sulfacetamide sodium are described. The primary approach converts sulfacetamide sodium to a diazonium salt, followed by reaction with p-cresol as a reagent in alkaline medium. The product is orange, with maximum absorbance at λmax 450 nm. Over the concentration range of 5.0-100 µg.mL-1, Beer's law is obeyed with a correlation coefficient of R2 = 0.9996, a limit of detection of 0.2142 µg.mL-1, a limit of quantification of 0.707 µg.mL-1, and a molar absorptivity of 1488.249 L.mol-1.cm-1. The other approach, cloud point extraction, w…
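The calibration figures quoted above (R2, LOD, LOQ) come from fitting Beer's law to absorbance measurements. A minimal sketch of that workflow is below; the concentration/absorbance pairs are invented for illustration and are not the paper's data, and the 3.3σ/slope and 10σ/slope limit formulas are the common ICH-style convention, which may differ from the authors' procedure.

```python
import numpy as np

# Hypothetical calibration data: concentration (ug/mL) vs. absorbance at 450 nm
conc = np.array([5.0, 10.0, 25.0, 50.0, 75.0, 100.0])
absorb = np.array([0.031, 0.060, 0.152, 0.305, 0.451, 0.602])

# Linear Beer's-law fit: absorbance = slope * concentration + intercept
slope, intercept = np.polyfit(conc, absorb, 1)
pred = slope * conc + intercept
r2 = 1 - np.sum((absorb - pred) ** 2) / np.sum((absorb - absorb.mean()) ** 2)

# Detection/quantification limits from the residual standard deviation of the fit
sd = np.std(absorb - pred, ddof=2)
lod = 3.3 * sd / slope   # limit of detection
loq = 10 * sd / slope    # limit of quantification
print(round(r2, 4), round(lod, 3), round(loq, 3))
```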
The research involved a rapid, automated, and highly accurate CFIA/MZ technique developed for the estimation of phenylephrine hydrochloride (PHE) in pure form, dosage forms, and a biological sample. The method is based on the oxidative coupling reaction of 2,4-dinitrophenylhydrazine (DNPH) with PHE in the presence of sodium periodate as an oxidizing agent in alkaline medium, forming a red product at λmax (520 nm). At a flow rate of 4.3 mL.min-1 with distilled water as the carrier, the FIA method proved to be a sensitive and economical analytical tool for the estimation of PHE.
Within the concentration range of 5-300 μg.mL-1, the calibration curve was rectilinear, and the detection limit was 3.252 μg.mL-1 …
The bi-level programming problem minimizes or maximizes an objective function while another objective function appears within the constraints. The problem has received a great deal of attention in the programming community owing to the proliferation of applications and the use of evolutionary algorithms for problems of this kind. Two non-linear bi-level programming methods are used in this paper. The goal is to reach the optimal solution by simulation, using the Monte Carlo method with different small and large sample sizes. The research concluded that the branch-and-bound algorithm was preferable for solving the non-linear bi-level programming problem, because it produced better results.
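The structure of a bi-level problem, and the Monte Carlo sampling idea mentioned above, can be illustrated on a toy instance. Everything here is hypothetical: the quadratic objectives, the interval [0, 2], and the closed-form follower response are chosen only so the example is self-contained, and the sketch does not represent the paper's actual problems or its branch-and-bound method.

```python
import random
random.seed(0)

def follower_best_response(x):
    """Lower level: minimise (y - 1)**2 + x*y over y in [0, 2].
    Setting the derivative 2*(y - 1) + x = 0 gives y = 1 - x/2, clipped to [0, 2]."""
    return min(max(1 - x / 2, 0.0), 2.0)

def leader_objective(x):
    """Upper level: evaluated at the follower's optimal reply."""
    y = follower_best_response(x)
    return (x - 2) ** 2 + y ** 2

# Monte Carlo search over the leader's variable: sample x, keep the best
best_x = min((random.uniform(0, 2) for _ in range(5000)), key=leader_objective)
print(round(best_x, 3), round(leader_objective(best_x), 4))
```

For this instance the leader's objective reduces to 1.25 * (x - 2)**2, so the sampled optimum concentrates near x = 2, where the follower replies y = 0.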
The importance of this research is that it examines the causes and sources of the security challenges in the internal security environment of the GCC countries. It aims to address the most important issues of great interest, namely the inter-GCC differences and the regional security of the Gulf region. The Gulf is one of the most dynamic and polarized areas for the emergence of threats and challenges, owing to the multiplicity and complexity of the sources of threat, the specificity of its strategic environment, and the negative repercussions these can have on the region, especially the regional security of the Gulf Cooperation Council, which has become a magnet for competing i…
The origin of this technique lies in the analysis of François Quesnay (1694-1774), the leader of the Physiocratic school, presented in the Tableau Économique. The method was developed by Karl Marx in his analysis of the relationships between the departments of production and the nature of these relations in his reproduction schemes. The current form of this type of economic analysis is credited to the Russian-American economist Wassily Leontief. This analytical model is commonly used in developing economic plans in developing countries (p. 1, p. 86). There are several types of input-output models, such as the static model, the dynamic model, and regional models. However, this research is confined to the open static model, which has found areas of practical application.
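The open Leontief model referred to above solves x = A x + d for the gross-output vector x, given a technical-coefficient matrix A and a final-demand vector d, i.e. x = (I - A)^(-1) d. The three-sector numbers below are invented for illustration and are not drawn from the research.

```python
import numpy as np

# Hypothetical technical-coefficient matrix: A[i, j] is the input from
# sector i required per unit of output of sector j (illustrative values).
A = np.array([[0.2, 0.3, 0.1],
              [0.1, 0.1, 0.3],
              [0.2, 0.2, 0.1]])
final_demand = np.array([100.0, 50.0, 80.0])

# Open Leontief model: gross output x satisfies x = A x + d,
# solved here as (I - A) x = d.
x = np.linalg.solve(np.eye(3) - A, final_demand)
print(np.round(x, 1))
```

Each sector's gross output exceeds its final demand, because the solution also covers the intermediate inputs the other sectors consume.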
The research aims to determine the effect of the innovative matrix strategy and the problem tree strategy, when used in teaching mathematics to intermediate-grade female students, on mathematical proficiency. To achieve the research objectives, an experimental approach with a quasi-experimental design for two equivalent experimental groups was used: the first group was taught according to the innovative matrix strategy, and the second according to the problem tree strategy. The research sample consisted of (32) female students of the first intermediate grade, chosen intentionally after ensuring their equivalence, taking into account several factors, most notably chronological age, previous achievement, and an intelligence test. The research tools con…
Image compression is a suitable technique to reduce the storage space of an image, increase the available storage in a device, and speed up transmission. In this paper, a new idea for image compression is proposed to improve the performance of the Absolute Moment Block Truncation Coding (AMBTC) method, using Weber's law to distinguish uniform blocks (i.e., blocks with low and constant detail) from non-uniform blocks in the original images. All elements in the bitmap of each uniform block are then represented by zero. After that, a lossless method, run-length encoding, is used to further compress the bits representing the bitmaps of these uniform blocks. With this simple idea, the result is improving …
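The pipeline sketched in the abstract, AMBTC per block, a Weber-style uniformity test, an all-zero bitmap for uniform blocks, and run-length coding of those bitmaps, can be illustrated as follows. The Weber threshold value and the 4x4 block size are assumptions for the sketch, not the paper's settings.

```python
import numpy as np

def ambtc_block(block, weber_k=0.02):
    """AMBTC for one block. The block is treated as uniform when its mean
    absolute deviation is below a Weber-law fraction of its mean brightness
    (weber_k is an assumed threshold); its bitmap is then coded as all zeros."""
    mean = block.mean()
    dev = np.abs(block - mean).mean()
    if mean > 0 and dev / mean < weber_k:              # uniform block
        return mean, mean, np.zeros(block.size, dtype=np.uint8)
    bitmap = (block >= mean).astype(np.uint8).ravel()  # non-uniform block
    high = block[block >= mean].mean()                 # level for the 1 bits
    low = block[block < mean].mean()                   # level for the 0 bits
    return high, low, bitmap

def run_length_encode(bits):
    """Lossless RLE over bitmap bits: a list of (value, run_length) pairs."""
    runs, i = [], 0
    while i < len(bits):
        j = i
        while j < len(bits) and bits[j] == bits[i]:
            j += 1
        runs.append((int(bits[i]), j - i))
        i = j
    return runs

flat = np.full((4, 4), 120.0)                          # nearly constant block
print(run_length_encode(ambtc_block(flat)[2]))         # one run of sixteen zeros
```

A uniform block collapses to a single (0, 16) run, which is where the extra compression over plain AMBTC comes from.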