The bi-level programming problem is to minimize or maximize an objective function subject to constraints that themselves contain another optimization problem. This problem has received a great deal of attention in the mathematical programming community due to the proliferation of applications and the use of evolutionary algorithms in addressing it. Two methods for non-linear bi-level programming are used in this paper. The goal is to reach the optimal solution through simulation, using the Monte Carlo method with different small and large sample sizes. The research found that the Branch and Bound algorithm was preferable for solving the non-linear bi-level programming problem because it produced better results.
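As an illustration of the simulation idea, the sketch below applies a Monte Carlo search to a made-up toy bi-level problem; the objective functions, bounds, and sample sizes are assumptions for demonstration only, not the models or algorithms compared in the paper.

```python
import random

# Hypothetical toy bi-level problem (not the paper's model):
#   upper level:  min_x  F(x, y*) = (x - 2)^2 + (y* - 1)^2
#   lower level:  y*(x) = argmin_y  f(x, y) = (y - x)^2,  y in [0, 4]
# Monte Carlo search: sample leader decisions x, approximate the
# follower's response y*(x) by sampling as well, keep the best F.

def lower_level(x, samples=200):
    """Approximate y*(x) by sampling the follower's feasible set."""
    best_y, best_f = None, float("inf")
    for _ in range(samples):
        y = random.uniform(0.0, 4.0)
        f = (y - x) ** 2
        if f < best_f:
            best_y, best_f = y, f
    return best_y

def monte_carlo_bilevel(samples=1000):
    best_x, best_F = None, float("inf")
    for _ in range(samples):
        x = random.uniform(0.0, 4.0)      # sample a leader decision
        y_star = lower_level(x)           # follower's approximate response
        F = (x - 2.0) ** 2 + (y_star - 1.0) ** 2
        if F < best_F:
            best_x, best_F = x, F
    return best_x, best_F

if __name__ == "__main__":
    x, F = monte_carlo_bilevel()
    print(f"approximate leader optimum x = {x:.3f}, F = {F:.3f}")
```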
A sensitive turbidimetric method at (0-180°) was used for the determination of mebeverine in drugs using two solar cells and six sources with C.F.I.A. The method was based on the formation of an ion pair, a pinkish banana-colored precipitate, by the reaction of mebeverine hydrochloride with phosphotungstic acid. Turbidity was measured via the reflection of incident light colliding with the surface of the precipitated particles at 0-180°. All variables were optimized. The linear range of mebeverine hydrochloride was 0.05-12.5 mmol.L-1, and the detection limit (S/N = 3, 3SB) was 521.92 ng/sample, depending on dilution of the minimum concentration, with correlation coefficient r = 0.9966, while the R.S.D.% was
A new design of a flow injection (FI) manifold coupled with a merging zone technique was studied for the spectrophotometric determination of sulfamethoxazole. The semi-automated FI method has many advantages, such as being fast, simple, highly accurate, and economical, with high throughput. The suggested method is based on the production of an orange-colored compound of SMZ with 1,2-naphthoquinone-4-sulphonic acid sodium salt (NQS) in alkaline NaOH media at λmax 496 nm. The linear range of sulfamethoxazole was 3-100 μg.mL-1, the LOD was 0.593 μg.mL-1, the RSD% was about 1.25, and the recovery was 100.73%. All the physical and chemical parameters that have an effect on the stability and development of
The aim of this research is to use a robust technique based on trimming, since maximum likelihood (ML) analysis often fails when the studied phenomenon contains outliers; the MLE loses its advantages because of the bad influence caused by the outliers. In order to address this problem, new statistical methods have been developed so as not to be affected by the outliers; these methods have robustness or resistance. The maximum trimmed likelihood (MTL) is therefore a good alternative for achieving more acceptable and comparable results, and weights can be used to increase the efficiency of the resulting estimators and the strength of the estimation using the maximum weighted trimmed likelihood (MWTL). In order to perform t
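The sketch below illustrates the trimming idea behind MTL on an assumed normal location-scale model with artificial outliers: concentration steps repeatedly refit on the h observations with the largest likelihood contributions. The model, the choice of h, and the simple random start are illustrative assumptions, not the paper's procedure.

```python
import math
import random

# Minimal maximum trimmed likelihood (MTL) sketch for a normal model:
# fit on a subset, keep the h points with the highest likelihood under
# that fit, refit, and repeat (simple concentration steps).

def normal_loglik(x, mu, sigma):
    return -0.5 * math.log(2 * math.pi * sigma ** 2) - (x - mu) ** 2 / (2 * sigma ** 2)

def mtl_normal(data, h, iters=20):
    subset = random.sample(data, h)            # random initial subset
    for _ in range(iters):
        mu = sum(subset) / h
        sigma = max(1e-8, math.sqrt(sum((x - mu) ** 2 for x in subset) / h))
        # keep the h observations that the current fit likes best
        subset = sorted(data, key=lambda x: normal_loglik(x, mu, sigma), reverse=True)[:h]
    return mu, sigma

if __name__ == "__main__":
    random.seed(0)
    clean = [random.gauss(5.0, 1.0) for _ in range(90)]
    outliers = [random.gauss(30.0, 1.0) for _ in range(10)]   # artificial contamination
    print("MTL estimate (mu, sigma):", mtl_normal(clean + outliers, h=80))
```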
Some cases of common fixed point theory for classes of generalized nonexpansive maps are studied. Also, we show that the Picard-Mann scheme can be employed to approximate the unique solution of a mixed-type Volterra-Fredholm functional nonlinear integral equation.
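A minimal sketch of the Picard-Mann hybrid iteration, applied here to a toy mixed Volterra-Fredholm integral equation discretized with the trapezoidal rule; the kernels, forcing term, nonlinearity, and step parameter are assumptions for illustration, not the equation treated in the paper.

```python
# Picard-Mann hybrid iteration:
#   y_n = (1 - a) * u_n + a * T(u_n),   u_{n+1} = T(y_n)
# applied to the toy equation
#   u(t) = g(t) + int_0^t k1(t,s) sin(u(s)) ds + int_0^1 k2(t,s) cos(u(s)) ds
import math

N = 101                      # grid points on [0, 1]
h = 1.0 / (N - 1)
t = [i * h for i in range(N)]

def g(ti):       return ti
def k1(ti, s):   return 0.1 * (ti - s)
def k2(ti, s):   return 0.1 * ti * s

def T(u):
    """Apply the integral operator to the grid values u (trapezoid rule)."""
    out = []
    for i, ti in enumerate(t):
        # Volterra part: integrate from 0 to t_i
        v = sum(0.5 * h * (k1(ti, t[j]) * math.sin(u[j]) + k1(ti, t[j + 1]) * math.sin(u[j + 1]))
                for j in range(i))
        # Fredholm part: integrate over the whole interval [0, 1]
        f = sum(0.5 * h * (k2(ti, t[j]) * math.cos(u[j]) + k2(ti, t[j + 1]) * math.cos(u[j + 1]))
                for j in range(N - 1))
        out.append(g(ti) + v + f)
    return out

def picard_mann(alpha=0.5, tol=1e-10, max_iter=200):
    u = [0.0] * N
    for _ in range(max_iter):
        Tu = T(u)
        y = [(1 - alpha) * ui + alpha * tui for ui, tui in zip(u, Tu)]
        u_next = T(y)
        if max(abs(a - b) for a, b in zip(u_next, u)) < tol:
            return u_next
        u = u_next
    return u

u = picard_mann()
print("u(1) ≈", u[-1])
```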
The aim of this study is to use goal programming and fuzzy goal programming techniques to assess the annual need accurately and correctly, depending on data and information about the quantities of medicines and medical supplies actually used in all hospitals and health institutions during a certain period; the public company for the marketing of medicines and medical supplies was taken as the research sample. A goal programming model was built for this problem, which included 15 decision variables, 19 constraints, and two objectives (a small illustrative sketch follows the list below):
1 - rational spending of the budget allocated for medicines and supplies.
2 - ensuring that patients' needs for medicines and supplies are met in order to improve
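A small goal-programming sketch with deviation variables, shrunk to two decision variables and the two stated goals; the costs, budget, and demand figures are made-up assumptions, and SciPy's linprog stands in for whatever solver the study used (the real model has 15 decision variables and 19 constraints).

```python
from scipy.optimize import linprog

# decision variables: x1, x2 = ordered quantities of two items
# deviation variables: d1p/d1m = budget over/under-run, d2p/d2m = demand over/short
# variable order: [x1, x2, d1p, d1m, d2p, d2m]
cost = [4.0, 7.0]          # unit costs (assumed)
budget = 100.0             # allocated budget (assumed)
demand = 20.0              # total patient demand in units (assumed)

# goal 1: 4*x1 + 7*x2 + d1m - d1p = budget   (rational use of the budget)
# goal 2:   x1 +   x2 + d2m - d2p = demand   (cover patients' needs)
A_eq = [
    [cost[0], cost[1], -1.0, 1.0,  0.0, 0.0],
    [1.0,     1.0,      0.0, 0.0, -1.0, 1.0],
]
b_eq = [budget, demand]

# penalize overspending (d1p) and unmet demand (d2m) equally
c = [0.0, 0.0, 1.0, 0.0, 0.0, 1.0]

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 6, method="highs")
x1, x2, d1p, d1m, d2p, d2m = res.x
print(f"order x1={x1:.1f}, x2={x2:.1f}; budget overrun={d1p:.1f}, shortfall={d2m:.1f}")
```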
The transmission and reception of data consume the most resources in Wireless Sensor Networks (WSNs). The energy supplied by the battery is the most important resource affecting the WSN's lifespan at the sensor node. Therefore, because sensor nodes run on limited batteries, energy saving is necessary. Data aggregation can be defined as a procedure applied to eliminate redundant transmissions; it provides fused information to the base stations, which in turn improves energy effectiveness and increases the lifespan of energy-constrained WSNs. In this paper, a Perceptually Important Points Based Data Aggregation (PIP-DA) method for Wireless Sensor Networks is suggested to reduce redundant data before sending them to the
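The sketch below shows the Perceptually Important Points idea that PIP-DA builds on: a sensor node keeps only the k readings that deviate most from the line joining already-selected points and transmits those instead of the raw series. The distance measure, the sample series, and k are illustrative assumptions, not the paper's exact aggregation rules.

```python
# Minimal Perceptually Important Points (PIP) selection sketch.

def vertical_distance(idx, series, left, right):
    """Vertical distance of point idx from the line joining points left and right."""
    x1, y1 = left, series[left]
    x2, y2 = right, series[right]
    interp = y1 + (y2 - y1) * (idx - x1) / (x2 - x1)
    return abs(series[idx] - interp)

def select_pips(series, k):
    """Return the indices of k perceptually important points."""
    pips = [0, len(series) - 1]            # always keep the endpoints
    while len(pips) < k:
        best_idx, best_d = None, -1.0
        for left, right in zip(pips, pips[1:]):
            for idx in range(left + 1, right):
                d = vertical_distance(idx, series, left, right)
                if d > best_d:
                    best_idx, best_d = idx, d
        if best_idx is None:               # nothing left to add
            break
        pips.append(best_idx)
        pips.sort()
    return pips

readings = [20.1, 20.2, 20.1, 25.7, 25.9, 26.0, 21.3, 20.9, 20.8, 20.7]
idx = select_pips(readings, k=5)
print("transmit only:", [(i, readings[i]) for i in idx])
```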
In this research, we deal with the study of the non-homogeneous Poisson process, one of the most important statistical topics with a role in scientific development, as it relates to accidents that occur in reality and are modeled as Poisson processes because their occurrence is tied to time, whether time changes or remains stable. Our research clarifies the non-homogeneous Poisson process and uses one of its models, the exponentiated-Weibull model with three parameters (α, β, σ), as a function to estimate the time rate of occurrence of earthquakes in Erbil Governorate, as the governorate is adjacent to two countr
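As a loosely related sketch, the code below simulates a non-homogeneous Poisson process by Lewis-Shedler thinning, taking the hazard rate of an exponentiated-Weibull distribution with parameters (α, β, σ) as the intensity; treating the hazard as the intensity, the parameter values, and the bounding rate are all assumptions for illustration, not the estimation carried out in the research.

```python
import math
import random

def exp_weibull_pdf(t, a, b, s):
    z = (t / s) ** b
    return a * (b / s) * (t / s) ** (b - 1) * math.exp(-z) * (1 - math.exp(-z)) ** (a - 1)

def exp_weibull_cdf(t, a, b, s):
    return (1 - math.exp(-((t / s) ** b))) ** a

def intensity(t, a, b, s):
    """Hazard rate, used here as the NHPP intensity (illustrative choice)."""
    return exp_weibull_pdf(t, a, b, s) / max(1e-12, 1 - exp_weibull_cdf(t, a, b, s))

def simulate_nhpp(T, a, b, s, lam_max):
    """Thinning: propose events at rate lam_max, accept with prob λ(t)/lam_max."""
    events, t = [], 0.0
    while True:
        t += random.expovariate(lam_max)
        if t > T:
            return events
        if random.random() < intensity(t, a, b, s) / lam_max:
            events.append(t)

random.seed(1)
times = simulate_nhpp(T=10.0, a=1.5, b=1.2, s=5.0, lam_max=5.0)
print(f"{len(times)} simulated event times, first few: {times[:3]}")
```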
The purpose of this paper is to use dynamic programming to solve a deterministic periodic-review inventory model and then to find the optimal policies that the company must use in purchasing or production (in the practical application example, the Al Aksa company purchases the generators from outside the country).
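A minimal sketch of the backward dynamic-programming recursion for a deterministic periodic-review inventory problem: in each period, the order quantity is chosen to minimize ordering plus holding cost plus the optimal cost-to-go. The demands, cost figures, and storage capacity below are assumed for illustration and are not the company's data.

```python
from functools import lru_cache

demand = [3, 2, 4, 1, 3]        # known demand per period (assumed)
K, c, h = 10.0, 2.0, 1.0        # fixed order cost, unit cost, holding cost per unit/period
MAX_INV = 10                    # storage capacity (assumed)

@lru_cache(maxsize=None)
def cost_to_go(period, inventory):
    """Minimal cost from `period` onward, starting with `inventory` units on hand."""
    if period == len(demand):
        return 0.0
    best = float("inf")
    for q in range(0, MAX_INV + demand[period] - inventory + 1):
        end_inv = inventory + q - demand[period]
        if end_inv < 0 or end_inv > MAX_INV:
            continue                                  # demand must be met, storage limited
        order_cost = (K + c * q) if q > 0 else 0.0
        best = min(best, order_cost + h * end_inv + cost_to_go(period + 1, end_inv))
    return best

print("optimal total cost:", cost_to_go(0, 0))
```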
In this paper, the effective computational method (ECM) based on standard monomial polynomials has been implemented to solve the nonlinear Jeffery-Hamel flow problem. Moreover, novel effective computational methods have been developed and suggested in this study using suitable basis functions, namely Chebyshev, Bernstein, Legendre, and Hermite polynomials. The use of these basis functions converts the nonlinear problem into a nonlinear algebraic system of equations, which is then solved using the Mathematica®12 program. The developed effective computational methods (D-ECM) have been applied to solve the nonlinear Jeffery-Hamel flow problem, and then a comparison between the methods has been shown. Furthermore, the maximum
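The sketch below illustrates the basis-function idea on a toy nonlinear boundary value problem: the unknown is expanded in a Chebyshev series, the residual is collocated at interior points, and the resulting nonlinear algebraic system is solved numerically. The toy BVP u'' = 1.5u², u(0) = 4, u(1) = 1 and SciPy's fsolve stand in for the Jeffery-Hamel equations and Mathematica.

```python
import numpy as np
from numpy.polynomial import chebyshev as C
from scipy.optimize import fsolve

N = 8                                       # number of Chebyshev coefficients
x_col = np.linspace(0.05, 0.95, N - 2)      # interior collocation points

def residuals(coeffs):
    d2 = C.chebder(coeffs, 2)               # coefficients of u''
    res_interior = C.chebval(x_col, d2) - 1.5 * C.chebval(x_col, coeffs) ** 2
    res_bc = [C.chebval(0.0, coeffs) - 4.0,  # u(0) = 4
              C.chebval(1.0, coeffs) - 1.0]  # u(1) = 1
    return np.concatenate([res_interior, res_bc])

# linear initial guess u ≈ 4 - 3x, then solve the nonlinear algebraic system
coeffs0 = np.array([4.0, -3.0] + [0.0] * (N - 2))
coeffs = fsolve(residuals, coeffs0)
print("u(0.5) ≈", C.chebval(0.5, coeffs))   # exact solution 4/(1+x)^2 gives ≈ 1.778
```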
The research involves preparing gold nanoparticles (AuNPs) and studying the factors that influence the shape, size, and distribution of the prepared particles according to the Turkevich method. These factors include reaction temperature, initial heating, concentration of gold ions, concentration and quantity of added citrate, reaction time, and order of reactant addition. The prepared gold nanoparticles were characterized by UV-Visible spectroscopy, X-ray diffraction, and scanning electron microscopy. The average size of the gold nanoparticles formed was in the range of (20-35) nm. The amount of added citrate was varied and studied. In addition, the concentration of added gold ions was varied and the calibration cur