Interval methods for verified integration of initial value problems (IVPs) for ODEs have been used for more than 40 years. For many classes of IVPs, these methods can compute guaranteed error bounds for the flow of an ODE, where traditional methods provide only approximations to a solution. Overestimation, however, is a potential drawback of verified methods. For some problems, the computed error bounds become overly pessimistic, or the integration even breaks down. The dependency problem and the wrapping effect are particular sources of overestimation in interval computations. Berz (see [1]) and his co-workers have developed Taylor model methods, which extend interval arithmetic with symbolic computations. The latter is an effective tool for reducing both the dependency problem and the wrapping effect. By construction, Taylor model methods appear particularly suitable for integrating nonlinear ODEs. In this paper, we analyze Taylor model based integration of ODEs and compare Taylor model methods with traditional enclosure methods for IVPs for ODEs. More advanced Taylor model integration methods are also discussed; for clarity, we summarize the major steps of the naive Taylor model method as Algorithm 1.
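To make the data structure behind this approach concrete, the following minimal Python sketch carries a univariate polynomial part symbolically together with an interval remainder bound, with addition and truncated multiplication. The fixed order, normalized domain, class layout, and the absence of directed (outward) rounding are simplifying assumptions; this does not reproduce the implementation of [1] or the integration algorithm itself.

```python
# Minimal sketch of a univariate Taylor model: a polynomial part carried
# symbolically plus an interval remainder, on the normalized domain [-1, 1].
# Illustrative only: real implementations use outward rounding and n variables.

class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def __add__(self, o):
        return Interval(self.lo + o.lo, self.hi + o.hi)
    def __mul__(self, o):
        p = (self.lo * o.lo, self.lo * o.hi, self.hi * o.lo, self.hi * o.hi)
        return Interval(min(p), max(p))

def x_pow(k):
    # range of x**k over [-1, 1]
    if k == 0:
        return Interval(1.0, 1.0)
    return Interval(-1.0, 1.0) if k % 2 else Interval(0.0, 1.0)

ORDER = 5  # truncation order of the polynomial part

class TaylorModel:
    def __init__(self, coeffs, remainder):
        self.p, self.I = list(coeffs), remainder   # enclosure: f(x) in p(x) + I

    def bound(self):
        # interval bound of the polynomial part over the domain
        b = Interval(0.0, 0.0)
        for k, c in enumerate(self.p):
            b = b + Interval(c, c) * x_pow(k)
        return b

    def __add__(self, o):
        n = max(len(self.p), len(o.p))
        p = [(self.p[k] if k < len(self.p) else 0.0) +
             (o.p[k] if k < len(o.p) else 0.0) for k in range(n)]
        return TaylorModel(p, self.I + o.I)

    def __mul__(self, o):
        # polynomial part handled symbolically; terms of degree > ORDER and
        # all remainder cross terms are enclosed in the interval part
        p, excess = [0.0] * (ORDER + 1), Interval(0.0, 0.0)
        for i, a in enumerate(self.p):
            for j, b in enumerate(o.p):
                if i + j <= ORDER:
                    p[i + j] += a * b
                else:
                    excess = excess + Interval(a * b, a * b) * x_pow(i + j)
        rem = self.bound() * o.I + o.bound() * self.I + self.I * o.I + excess
        return TaylorModel(p, rem)

# Example: enclose (1 + x) * (1 - x) = 1 - x^2 for x in [-1, 1]
x = TaylorModel([0.0, 1.0], Interval(0.0, 0.0))
one = TaylorModel([1.0], Interval(0.0, 0.0))
prod = (one + x) * (one + TaylorModel([0.0, -1.0], Interval(0.0, 0.0)))
print(prod.p, (prod.I.lo, prod.I.hi))   # [1, 0, -1, 0, 0, 0], zero remainder
```

Because the polynomial is manipulated symbolically, the cancellation in (1 + x)(1 - x) is exact; a naive interval evaluation of the same expression would suffer from the dependency problem.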
In this paper, the modified trapezoidal rule is presented for solving linear Volterra integral equations (VIEs) of the second kind; the procedure proves effective in solving these equations. Two examples with comparison tables are given to demonstrate the validity of the procedure.
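The modified rule itself is not reproduced in the abstract. As a baseline, the sketch below applies the standard (unmodified) composite trapezoidal discretization to u(x) = f(x) + ∫_a^x K(x,t) u(t) dt, solving for u at successive grid points; the kernel, forcing term, and grid are hypothetical test data.

```python
import numpy as np

def volterra2_trapezoid(f, K, a, b, n):
    """Solve u(x) = f(x) + int_a^x K(x,t) u(t) dt on [a, b]
    with the composite trapezoidal rule on n+1 equally spaced nodes."""
    h = (b - a) / n
    x = a + h * np.arange(n + 1)
    u = np.empty(n + 1)
    u[0] = f(x[0])                              # the integral vanishes at x = a
    for i in range(1, n + 1):
        # trapezoid weights: h/2 at the endpoints, h at interior nodes
        s = 0.5 * K(x[i], x[0]) * u[0]
        s += sum(K(x[i], x[j]) * u[j] for j in range(1, i))
        diag = 0.5 * h * K(x[i], x[i])          # weight of the unknown u[i]
        u[i] = (f(x[i]) + h * s) / (1.0 - diag)
    return x, u

# Hypothetical test: u(x) = exp(x) solves u(x) = 1 + int_0^x u(t) dt
x, u = volterra2_trapezoid(lambda x: 1.0, lambda x, t: 1.0, 0.0, 1.0, 100)
print(np.max(np.abs(u - np.exp(x))))            # small O(h^2) discretization error
```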
The assignment model is a mathematical model that expresses a real problem faced by factories and companies: allocating machines, jobs, or workers to machines so as to raise efficiency or profit as far as possible, or to reduce cost or time, in order to support appropriate decision making. In this research, the labeling method is used to solve a fuzzy assignment problem with real data approved by the Diwaniya tire factory. The data comprise two factors, efficiency and cost, and the problem was solved manually through a number of iterations until the optimal solution was reached.
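The labeling method used in the research is not reproduced here. As an illustration of the underlying crisp assignment problem, the sketch below defuzzifies hypothetical triangular fuzzy costs by their centroid and then solves the resulting assignment problem with the Hungarian-method routine in SciPy; all data are made up.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Hypothetical triangular fuzzy costs (low, mode, high) for 3 workers x 3 machines.
fuzzy_cost = [
    [(2, 4, 6), (5, 7, 9), (1, 3, 5)],
    [(3, 5, 7), (2, 3, 4), (6, 8, 10)],
    [(4, 6, 8), (1, 2, 3), (3, 5, 7)],
]

# Defuzzify each triangular fuzzy number by its centroid (l + m + u) / 3.
crisp = np.array([[(l + m + u) / 3.0 for (l, m, u) in row] for row in fuzzy_cost])

# Hungarian method: minimize total crisp cost over one-to-one assignments.
rows, cols = linear_sum_assignment(crisp)
for r, c in zip(rows, cols):
    print(f"worker {r} -> machine {c}, cost {crisp[r, c]:.2f}")
print("total cost:", crisp[rows, cols].sum())
```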
The primary objective of this paper is to improve a biometric authentication and classification model that uses the ear, a distinctive part of the face that does not change with time and is unaffected by facial expressions. The proposed model is a new scenario for enhancing ear recognition accuracy by modifying the AdaBoost algorithm to optimize adaptive learning. To overcome the limitations of image illumination, occlusion, and image registration, the scale-invariant feature transform (SIFT) technique was used to extract features. Several consecutive phases were used to improve classification accuracy: image acquisition, preprocessing, filtering, smoothing, and feature extraction. To assess the proposed …
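The paper's modified AdaBoost variant is not available from the abstract. Purely as a conventional baseline of the same pipeline, the sketch below extracts SIFT descriptors with OpenCV, reduces each image to a crude fixed-length summary, and trains scikit-learn's standard AdaBoostClassifier; the file paths, summary choice, and parameters are assumptions.

```python
import cv2                     # requires OpenCV >= 4.4 (SIFT included)
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

sift = cv2.SIFT_create()

def ear_descriptor(path, n_keypoints=32):
    """Grayscale load -> SIFT -> fixed-length vector (mean of the strongest
    keypoints' 128-D descriptors), so a standard classifier can be applied."""
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    keypoints, desc = sift.detectAndCompute(img, None)
    if desc is None:
        return np.zeros(128)
    order = np.argsort([-kp.response for kp in keypoints])[:n_keypoints]
    return desc[order].mean(axis=0)

# Hypothetical file lists; labels identify the subject each ear belongs to.
train_paths, train_labels = ["ear_001.png", "ear_002.png"], [0, 1]
X = np.vstack([ear_descriptor(p) for p in train_paths])

clf = AdaBoostClassifier(n_estimators=100)   # standard AdaBoost, not the modified variant
clf.fit(X, train_labels)
print(clf.predict(ear_descriptor("ear_query.png").reshape(1, -1)))
```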
Background: Cochlear implants are electronic devices that convert sound energy into electrical signals to stimulate the ganglion cells and cochlear nerve fibers. These devices are indicated for patients with severe to profound sensorineural hearing loss who receive little or no benefit from hearing aids; the implant essentially takes over the function of the cochlear hair cells. The implant consists of external components (microphone, speech processor, and transmitting coil) and internal components (receiver-stimulator and electrode array), and is inserted via a transmastoid facial recess approach to the round window and scala tympani. Objectives: To determine the effectiveness and safety of the non-fixation method in cochlear implantation.
This work was conducted to study the extraction of eucalyptus oil from natural plants (Eucalyptus camaldulensis leaves) using a water distillation method with a Clevenger apparatus. The effects of the main operating parameters were studied: time to reach equilibrium, temperature (70 to 100 °C), solvent-to-solid ratio (4:1 to 8:1 (v/w)), agitation speed (0 to 900 rpm), and particle size (0.5 to 2.5 cm) of the fresh leaves, in order to find the processing conditions giving the maximum oil yield. The results showed that an agitation speed of 900 rpm and a temperature of 100 °C, with a solvent-to-solid ratio of 5:1 (v/w) and a particle size of 0.5 cm, for 160 minutes gave the highest oil yield (46.25 wt.%). The extracted oil was examined by HPLC.
A simple, rapid and sensitive spectrophotometric method has been developed for the determination of captopril in aqueous solution. The method is based on the reaction of captopril with 2,3-dichloro-1,4-naphthoquinone (Dichlone) in neutral medium to form a stable yellow product with maximum absorption at 347 nm and a molar absorptivity of 5.6 × 10^3 L·mol^-1·cm^-1. The proposed method is applied successfully to the determination of captopril in commercial pharmaceutical tablets.
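Given the reported molar absorptivity, an unknown captopril concentration follows directly from the Beer–Lambert law A = ε·c·l. The short sketch below assumes a 1 cm cell and a hypothetical measured absorbance; only ε and the wavelength come from the abstract.

```python
# Beer-Lambert law: A = epsilon * c * l  ->  c = A / (epsilon * l)
epsilon = 5.6e3        # L mol^-1 cm^-1 at 347 nm (from the abstract)
path_length = 1.0      # cm, assumed cell
absorbance = 0.42      # hypothetical measured value

conc_mol = absorbance / (epsilon * path_length)   # mol/L
conc_ug_ml = conc_mol * 217.29 * 1e3              # captopril M = 217.29 g/mol
print(f"{conc_mol:.2e} mol/L  ({conc_ug_ml:.1f} ug/mL)")
```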
This paper presents a simulation of the Linguistic Fuzzy Trust Model (LFTM) over oscillating Wireless Sensor Networks (WSNs), in which the goodness of the servers belonging to the network can change over time. The outcomes achieved with the LFTM over oscillating WSNs are compared with those obtained by applying the model over static WSNs, where the servers always maintain the same goodness, in terms of the selection percentage of trustworthy servers (the accuracy of the model) and the average path length. The paper also compares the LFTM with the Bio-inspired Trust and Reputation Model for Wireless Sensor Networks.
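The LFTM itself is not specified in the abstract. Purely to illustrate the two evaluation metrics mentioned (selection percentage of trustworthy servers and average path length), the toy simulation below lets server goodness oscillate over rounds and records how often a naive trust-driven client picks a currently good server and how long the chosen path is; nothing here reproduces the fuzzy trust computations of the LFTM.

```python
import random

random.seed(0)

N_SERVERS, N_ROUNDS = 20, 500
hops = [random.randint(1, 6) for _ in range(N_SERVERS)]      # hypothetical hop counts
period = [random.randint(20, 80) for _ in range(N_SERVERS)]  # oscillation periods

def is_good(server, t):
    # server goodness flips between good and bad over time (oscillating WSN)
    return (t // period[server]) % 2 == 0

trust = [0.5] * N_SERVERS          # naive running trust estimate per server
good_selections, total_hops = 0, 0

for t in range(N_ROUNDS):
    # pick the most trusted server (ties broken by fewer hops)
    s = max(range(N_SERVERS), key=lambda i: (trust[i], -hops[i]))
    outcome = is_good(s, t)
    good_selections += outcome
    total_hops += hops[s]
    # simple exponential update of the trust estimate after each transaction
    trust[s] = 0.9 * trust[s] + 0.1 * (1.0 if outcome else 0.0)

print("selection percentage of trustworthy servers:",
      100.0 * good_selections / N_ROUNDS)
print("average path length:", total_hops / N_ROUNDS)
```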
The theory of probabilistic programming may be conceived in several different ways. As a method of programming, it analyzes the implications of probabilistic variations in the parameter space of a linear or nonlinear programming model. Such probabilistic variations in economic models may arise from incomplete information about changes in demand, production, and technology; from specification errors in the econometric relations presumed for different economic agents; from uncertainty of various sorts; and from the consequences of imperfect aggregation or disaggregation of economic variables. In this research, we discuss the probabilistic programming problem in which the coefficient b_i is a random variable.
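A standard way to handle a random right-hand-side coefficient b_i is chance-constrained programming: if b_i ~ N(mu_i, sigma_i^2) and the constraint a_i^T x <= b_i must hold with probability at least alpha, the deterministic equivalent is a_i^T x <= mu_i + sigma_i * Phi^{-1}(1 - alpha). The sketch below applies this to a hypothetical two-variable LP; the data and the normality assumption are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import linprog

alpha = 0.95                      # required probability of satisfying each constraint

# Hypothetical problem: maximize 3x1 + 2x2  subject to  A x <= b,  x >= 0,
# where each b_i is normal with mean mu_i and standard deviation sigma_i.
c = [-3.0, -2.0]                  # linprog minimizes, so negate the objective
A = [[1.0, 1.0],
     [2.0, 1.0]]
mu = np.array([10.0, 15.0])
sigma = np.array([1.0, 2.0])

# Deterministic equivalent right-hand side: mu_i + sigma_i * Phi^{-1}(1 - alpha)
b_det = mu + sigma * norm.ppf(1.0 - alpha)

res = linprog(c, A_ub=A, b_ub=b_det, bounds=[(0, None), (0, None)])
print("optimal x:", res.x, " optimal profit:", -res.fun)
```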