Interval methods for verified integration of initial value problems (IVPs) for ODEs have been used for more than 40 years. For many classes of IVPs, these methods can compute guaranteed error bounds for the flow of an ODE, where traditional methods provide only approximations to a solution. Overestimation, however, is a potential drawback of verified methods: for some problems the computed error bounds become overly pessimistic, or the integration even breaks down. The dependency problem and the wrapping effect are particular sources of overestimation in interval computations. Berz and his co-workers (see [1]) have developed Taylor model methods, which extend interval arithmetic with symbolic computations; this combination is an effective tool for reducing both the dependency problem and the wrapping effect. By construction, Taylor model methods appear particularly suitable for integrating nonlinear ODEs. In this paper, we analyze Taylor model based integration of ODEs and compare Taylor model methods with traditional enclosure methods for IVPs for ODEs. For clarity, we summarize the major steps of the naive Taylor model method as Algorithm 1; more advanced Taylor model integration methods are also discussed.
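As a rough, hypothetical illustration of the dependency problem mentioned above (a minimal Python sketch, not taken from the paper or from [1]), plain interval arithmetic treats repeated occurrences of the same variable as independent, which inflates the computed enclosures; Taylor model methods curb this by propagating a symbolic polynomial part and bounding only the remainder term with an interval:

class Interval:
    """A bare-bones interval type; just enough to show the dependency problem."""

    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __sub__(self, other):
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def __mul__(self, other):
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

x = Interval(0.0, 1.0)
one = Interval(1.0, 1.0)

# x - x should be exactly 0, but the two occurrences of x are treated as
# independent intervals, so the enclosure blows up to [-1, 1].
print(x - x)          # [-1.0, 1.0]

# f(x) = x * (1 - x) has true range [0, 0.25] over [0, 1], yet naive
# interval evaluation returns the four-times-wider enclosure [0, 1].
print(x * (one - x))  # [0.0, 1.0]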
In this paper, a subspace identification method for bilinear systems is used, in which "three-block" and "four-block" subspace algorithms are applied. In these algorithms, the input signal to the system does not have to be white. Simulations show that the "four-block" algorithm converges faster, and the dimensions of the matrices involved are significantly smaller, so its computational complexity is lower than that of the "three-block" algorithm.
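As generic background on why the data-matrix dimensions matter (a hypothetical Python sketch of the standard block-Hankel matrix used throughout subspace identification, not the paper's specific "three-block" or "four-block" bilinear constructions):

import numpy as np

def block_hankel(signal, num_block_rows):
    """Stack a (samples x channels) signal into a block-Hankel matrix with
    num_block_rows block rows; each column contains num_block_rows
    consecutive samples."""
    samples, channels = signal.shape
    cols = samples - num_block_rows + 1
    H = np.empty((num_block_rows * channels, cols))
    for i in range(num_block_rows):
        H[i * channels:(i + 1) * channels, :] = signal[i:i + cols, :].T
    return H

# Example: 200 samples of a 2-channel input. More block rows give a taller
# matrix, and therefore more expensive QR/SVD factorizations downstream,
# which is why constructions with fewer/smaller blocks cost less.
u = np.random.randn(200, 2)
print(block_hankel(u, 5).shape)   # (10, 196)
print(block_hankel(u, 10).shape)  # (20, 191)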
The aim of this research is to estimate the parameters of the linear regression model with errors following an ARFIMA model, using a wavelet-based method relying on maximum likelihood, as well as generalized least squares and ordinary least squares. The estimators are applied to real data: monthly inflation and dollar exchange rate series obtained from the Central Statistical Organization (CSO) for the period 1/2005 to 12/2015. The results show that the wavelet maximum likelihood (WML) estimator is the most reliable and efficient of the estimators considered, and that changing the fractional difference parameter (d) does not affect the results.
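To make the role of the fractional difference parameter d concrete, here is an illustrative Python sketch of the ARFIMA fractional differencing operator (1 - B)^d (not the wavelet-based estimator used in the paper; the toy series and the value d = 0.3 are hypothetical):

import numpy as np

def frac_diff(series, d, max_lag=100):
    """Apply the fractional differencing operator (1 - B)^d to a series,
    truncating the infinite expansion after max_lag terms. The weights
    follow the standard recursion pi_0 = 1, pi_k = pi_{k-1} * (k - 1 - d) / k."""
    weights = [1.0]
    for k in range(1, max_lag):
        weights.append(weights[-1] * (k - 1 - d) / k)
    weights = np.array(weights)
    out = np.empty(len(series))
    for t in range(len(series)):
        window = series[max(0, t - max_lag + 1):t + 1][::-1]  # x_t, x_{t-1}, ...
        out[t] = np.dot(weights[:len(window)], window)
    return out

# 0 < d < 0.5 corresponds to a stationary long-memory process.
x = np.cumsum(np.random.randn(500))   # toy series
y = frac_diff(x, d=0.3)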
Abstract
The non-homogeneous Poisson process is one of the statistical subjects of importance to other sciences, with wide application in areas such as queueing systems, repairable systems, computer and communication systems, and reliability theory, among many others. It is also used to model phenomena that evolve in an irregular way over time (events whose rate changes with time).
This research deals with some of the basic concepts related to the non-homogeneous Poisson process. It considers two models of the non-homogeneous Poisson process, the power law model and the Musa-Okumoto model, to estimate th
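For reference, the standard textbook parameterizations of these two models are given below (the paper's own notation may differ; theta, beta, and lambda_0 are generic parameter names), in terms of the mean value function m(t) = E[N(t)] and the intensity lambda(t) = m'(t):

\[
m_{\mathrm{PL}}(t) = \left(\frac{t}{\theta}\right)^{\beta}, \qquad
\lambda_{\mathrm{PL}}(t) = \frac{\beta}{\theta}\left(\frac{t}{\theta}\right)^{\beta-1}
\qquad \text{(power law model)},
\]
\[
m_{\mathrm{MO}}(t) = \frac{1}{\theta}\ln\bigl(1 + \lambda_0 \theta t\bigr), \qquad
\lambda_{\mathrm{MO}}(t) = \frac{\lambda_0}{1 + \lambda_0 \theta t}
\qquad \text{(Musa-Okumoto model)}.
\]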
An easy, simple, and accurate method for the determination of ciprofloxacin in the presence of cephalexin, or vice versa, in a mixture of the two. The proposed method was applied successfully, using point standard addition, to the determination of ciprofloxacin in the presence of cephalexin as an interferent at wavelengths of 240-272.3 nm and at different ciprofloxacin concentrations of 4-18 µg·mL⁻¹, as well as to the determination of cephalexin in the presence of interfering ciprofloxacin at wavelengths of 262-285.7 nm and at different concentrations
This paper aims to propose a hybrid approach combining two powerful methods, the differential transform and finite difference methods, to obtain the solution of the coupled Whitham-Broer-Kaup-like equations arising in shallow-water wave theory. The capability of the method for such problems is verified by taking different parameters and initial conditions. The numerical simulations are depicted in 2D and 3D graphs. It is shown that the approach returns accurate solutions for this type of problem in comparison with the analytic ones.
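For context, the commonly quoted form of the classical Whitham-Broer-Kaup system is reproduced below; the "WBK-like" equations treated in the paper may contain modified or additional terms:

\[
u_t + u u_x + v_x + \beta u_{xx} = 0, \qquad
v_t + (u v)_x + \alpha u_{xxx} - \beta v_{xx} = 0,
\]

where u is the horizontal velocity, v is the height deviation from the equilibrium water surface, and alpha, beta are constants representing different dispersion and diffusion effects.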
In this study, the modified size-strain plot (SSP) method was used to analyze the X-ray diffraction pattern of the (1 0 1), (1 2 1), (2 0 2), (0 4 2), and (2 4 2) lines of calcium titanate (CaTiO3) nanoparticles, and to calculate the lattice strain, crystallite size, stress, and energy density using three models, among them the uniform stress deformation model (USDM). One model gave a lattice strain of 2.147201889, a stress of 0.267452615×10, an energy density of 2.900651×10⁻³ kJ/m³, and a crystallite size of 32.29477611 nm, while the Scherrer-based analysis gave a lattice strain of 4.1644598×10⁻³, an energy density of 1.509066023×10⁻⁶ kJ/m³, a stress of 6.403949183×10⁻⁴ MPa, and a crystallite size of 26.019894 nm.
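For reference, the commonly quoted relations behind these quantities are reproduced below (generic forms from the X-ray peak-profile literature, not necessarily the exact expressions used in this study; K is the shape factor, lambda the X-ray wavelength, beta_hkl the peak broadening, d_hkl the interplanar spacing, D the crystallite size, epsilon the lattice strain, and Y an assumed Young's modulus):

\[
D = \frac{K\lambda}{\beta_{hkl}\cos\theta} \quad \text{(Scherrer)}, \qquad
\bigl(d_{hkl}\,\beta_{hkl}\cos\theta\bigr)^{2}
  = \frac{K\lambda}{D}\,\bigl(d_{hkl}^{2}\,\beta_{hkl}\cos\theta\bigr)
  + \Bigl(\frac{\varepsilon}{2}\Bigr)^{2} \quad \text{(SSP)},
\]

with the stress and energy density then obtained from Hooke's law as sigma = Y·epsilon and u = Y·epsilon²/2.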
The size and the concentration of gold nanoparticles (GNPs) synthesized in double distilled deionized water (DDDW) were found to be affected by the laser energy and the number of pulses. The absorption spectra of the nanoparticles in DDDW were measured with a UV-Vis spectrophotometer, and the surface plasmon resonance (SPR) peaks were found to lie between 509 and 524 nm. SPR calculations, transmission electron microscope images, and the dynamic light scattering (DLS) method were used to determine the size of the GNPs, which was found to range between 3.5 and 27 nm. The concentrations of the GNPs in the colloidal solutions were found to range between 37 and 142 ppm, as measured by atomic absorption.
The splicing design of the existing road and the new road in an expansion project is an important part of the design work. Based on an analysis of the characteristics of the splice and the load effect of the pavement structure on it, this paper points out that tensile cracking or shear failure may occur at the splice under the repeated action of traffic loads on the new and old pavement. According to the current asphalt pavement structure design code in China, it is proposed that the horizontal tensile stress at the bottom of the splicing layer and the vertical shear stress at the other layers along the splicing line should be controlled by adjusting the position and size of the excavated steps, in addition to the conventional design ind
In recent decades, the identification of faces with and without masks from visual data, such as video and still images, has become a captivating research subject. This is primarily due to the global spread of the Corona pandemic, which has altered the appearance of the world and necessitated the use of masks as a vital measure for epidemic prevention. Intelligent systems based on artificial intelligence and computers play a decisive role in epidemic safety, and facial recognition, together with identifying whether or not individuals wear masks, has become a prominent topic in deep learning research. This research proposes the creation of an advanced system capable of accurately identifying faces, both with and