The main purpose of this work is to apply a new method, the so-called LTAM, which couples the Tamimi and Ansari iterative method (TAM) with the Laplace transform (LT). The method is applied to a model of non-fatal disease spread in a society assumed to have a fixed size during the epidemic period, and it yields an approximate analytic solution of the nonlinear system of the intended model. Moreover, the absolute errors between the numerical solutions and ten iterations of the LTAM approximation of the epidemic model, along with the maximum error remainder, were calculated using the MATHEMATICA® 11.3 program to illustrate the effectiveness of the method.
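As a minimal sketch, not the authors' exact formulation: TAM treats a problem written in operator form as L(u(t)) + N(u(t)) + g(t) = 0, with L linear and N nonlinear, through the iteration

L(u_0(t)) + g(t) = 0, \qquad L(u_{n+1}(t)) + N(u_n(t)) + g(t) = 0, \quad n = 0, 1, 2, \dots,

with each subproblem subject to the original initial conditions. In LTAM the Laplace transform and its inverse are used to solve these linear subproblems, and an accuracy measure such as the maximum error remainder MER_n = \max_t |L(u_n(t)) + N(u_n(t)) + g(t)| can be monitored over the iterations.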
This research uses an artificial-intelligence algorithm drawn from biological systems, namely gene regulatory networks (GRNs), a dynamical system for a group of variables evolving in time. The biological system is constructed with ordinary differential equations (ODEs), and the stationarity of the model is analysed with Euler's method. Transcription factors (TFs) are used to represent the factors that affect the process of gene expression through inhibition and activation of transcription on DNA. The current research aims to use the latest methods of the artificial-intelligence algorithm; to apply gene regulatory networks (GRNs), we used a program …
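A minimal sketch of this kind of construction, assuming a hypothetical single-gene network (all names and values are illustrative, not the paper's): the gene product x is produced at a rate shaped by an activating TF and a repressing TF through Hill terms, decays linearly, and the ODE is stepped with the forward Euler method until a stationary level is reached.

import numpy as np

# Hypothetical single-gene GRN: an activating TF (A) and a repressing TF (R)
# shape production through Hill terms; the gene product decays linearly.
# Parameter names and values are illustrative, not taken from the paper.
def dxdt(x, A=1.0, R=0.5, k=2.0, K_a=0.8, K_r=0.6, n=2, gamma=0.3):
    activation = (A / K_a) ** n / (1.0 + (A / K_a) ** n)  # TF activation of transcription
    repression = 1.0 / (1.0 + (R / K_r) ** n)             # TF inhibition of transcription
    return k * activation * repression - gamma * x        # production minus decay

# Forward Euler stepping, as the abstract mentions Euler's method.
def euler(x0=0.0, dt=0.01, t_end=50.0):
    t = np.arange(0.0, t_end, dt)
    x = np.empty_like(t)
    x[0] = x0
    for i in range(1, len(t)):
        x[i] = x[i - 1] + dt * dxdt(x[i - 1])
    return t, x

t, x = euler()
print(f"approximate stationary level: {x[-1]:.4f}")

The stationary level found this way can be cross-checked against the analytical fixed point obtained by setting dxdt(x) = 0.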
In the current research, the multipole mixing ratios of γ-transitions from energy levels of the 142–150Nd isotopes populated in the 142–150Nd(n, n′) reactions are calculated using the constant statistical tensor (CST) method. The results obtained are, in general, in good agreement, or consistent within the experimental error, with the results published in previous research. The existing discrepancies result from inaccuracies in the experimental results of the previous works. The current results confirm the validity of the constant statistical tensor method for calculating the values of mixing ratios and its ability to predict errors in experimental results.
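For context, and not as a formula quoted from the paper: for a γ-transition J_i → J_f proceeding through two competing multipoles of order L and L+1, the mixing ratio is conventionally defined, up to a phase convention, as

\delta = \frac{\langle J_f \,\|\, \pi'(L+1) \,\|\, J_i \rangle}{\langle J_f \,\|\, \pi(L) \,\|\, J_i \rangle},

so that \delta^2 gives the intensity ratio of the (L+1) component to the L component; in (n, n′γ) studies δ is typically extracted from the measured angular-distribution coefficients.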
Now that most conventional reservoirs are being depleted at a rapid pace, the focus has shifted to unconventional reservoirs such as tight gas reservoirs. Due to their heterogeneous nature and low permeability, unconventional reservoirs require a large number of wells to reach all the isolated hydrocarbon zones. Infill drilling is one of the most common and effective methods of increasing recovery, by reducing well spacing and increasing sweep efficiency. However, the problem with drilling such a large number of wells is determining the optimum location for each well so that interference between wells is minimized and recovery from the field is accelerated. Detail …
Regression models are among the most important models used in modern studies, especially research and health studies, because of the important results they achieve. Two regression models were used: the Poisson regression model and the Conway–Maxwell–Poisson regression model. This study aimed to compare the two models and choose the better of them using the simulation method, at different sample sizes (n = 25, 50, 100) and with r = 1000 replications. The MATLAB program was adopted to conduct the simulation experiment. The results showed the superiority of the Poisson model according to the mean squared error (MSE) criterion and also the Akaike information criterion (AIC) for the same distribution.
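A minimal sketch of such a simulation comparison, under hypothetical true coefficients and a single covariate (none of these values come from the paper): Poisson counts are generated from a log-linear model, a Poisson regression is fitted, and one possible MSE (of the coefficient estimates) and the AIC are averaged over the replications; a Conway–Maxwell–Poisson fit from a specialised package would slot into the same loop and be compared on the same criteria.

import numpy as np
import statsmodels.api as sm

# Illustrative simulation skeleton: simulate Poisson counts, fit a Poisson
# regression, and average MSE and AIC over r = 1000 replications per sample
# size n. The true coefficients below are placeholders.
rng = np.random.default_rng(seed=1)
beta_true = np.array([0.5, 0.8])  # hypothetical intercept and slope

def one_replication(n):
    x = rng.normal(size=n)
    X = sm.add_constant(x)
    y = rng.poisson(np.exp(X @ beta_true))
    fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
    mse = np.mean((fit.params - beta_true) ** 2)  # MSE of the coefficient estimates
    return mse, fit.aic

for n in (25, 50, 100):
    res = np.array([one_replication(n) for _ in range(1000)])
    print(f"n={n:4d}  mean MSE={res[:, 0].mean():.4f}  mean AIC={res[:, 1].mean():.2f}")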
The ground-state densities of the unstable neutron-rich 8He and 17B exotic nuclei are studied within the framework of the two-frequency shell model (TFSM) and the binary cluster model (BCM). In the TFSM, single-particle harmonic-oscillator wave functions are used with two different oscillator size parameters, βc and βv, where the former is for the core (inner) orbits and the latter is for the valence (halo) orbits. In the BCM, the internal densities of the clusters are described by single-particle Gaussian wave functions. Shell-model calculations for the two valence neutrons in 8He and 17B are performed with the computer code OXBASH. The long-tail behaviour is clearly noticeable in the calculated neutron and matter density distributions of these nuclei.
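Schematically, and using the general notation of these models rather than results from this work, both descriptions build the matter density from a core part and a valence (halo) part,

\rho_m(r) = \rho_{core}(r) + \rho_{valence}(r),

where in the TFSM the two parts are harmonic-oscillator densities with size parameters βc and βv respectively (typically βv > βc for halo orbits), while in the BCM each cluster contributes a Gaussian single-particle density of the form \rho(r) \propto e^{-r^2/b^2}; the more extended valence part is what generates the long density tail.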
A theoretical model is developed to determine the time evolution of the temperature at the surface of an opaque target placed in air, for cases characterized by the formation of laser-supported absorption wave (LSAW) plasmas. The model takes into account the temporal variation of the power throughout an incident laser pulse (i.e., the pulse shape, or simply the pulse profile). Three proposed profiles are employed, and the results are compared with the square-pulse approximation of constant power.
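A minimal numerical sketch of this kind of calculation, assuming a one-dimensional semi-infinite opaque solid heated by an absorbed flux q(t) and ignoring plasma shielding by the LSAW; the thermal properties and the three pulse profiles below are placeholders, not the paper's. The surface temperature rise then follows the standard Duhamel integral

T_s(t) = \frac{1}{k}\sqrt{\frac{\alpha}{\pi}} \int_0^t \frac{q(\tau)}{\sqrt{t-\tau}}\, d\tau .

import numpy as np

# Surface temperature rise of a semi-infinite solid under a time-dependent
# absorbed flux q(t), via Duhamel's integral. Plasma (LSAW) shielding is
# ignored; all properties and pulse profiles are illustrative placeholders.
k_th  = 50.0    # thermal conductivity, W/(m K)
alpha = 1.5e-5  # thermal diffusivity, m^2/s
q_pk  = 1.0e9   # peak absorbed flux, W/m^2
t_p   = 1.0e-6  # pulse duration, s

def square(t):     return np.where(t <= t_p, q_pk, 0.0)
def triangular(t): return np.where(t <= t_p, q_pk * (1.0 - np.abs(2.0 * t / t_p - 1.0)), 0.0)
def gaussian(t):   return q_pk * np.exp(-((t - 0.5 * t_p) / (0.2 * t_p)) ** 2)

def surface_temperature(q, t, n=2000):
    # Substitution u = sqrt(t - tau) removes the integrable singularity:
    # T_s(t) = 2 sqrt(alpha/pi) / k * int_0^sqrt(t) q(t - u^2) du
    u = np.linspace(0.0, np.sqrt(t), n)
    f = q(t - u ** 2)
    integral = (u[1] - u[0]) * (f.sum() - 0.5 * (f[0] + f[-1]))  # trapezoid rule
    return 2.0 * np.sqrt(alpha / np.pi) / k_th * integral

for name, q in (("square", square), ("triangular", triangular), ("gaussian", gaussian)):
    print(f"{name:10s}  T_s(t_p) = {surface_temperature(q, t_p):7.1f} K above ambient")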
The objective of this research is to test the impact of internal corporate governance instruments on working capital management and the reflection of each of them on firm performance. For this purpose, four main hypotheses were formulated. The first pointed, in its results, to a significant effect of both major shareholders' ownership and board of directors' size on net working capital, with a positive association between them. The second showed a significant effect of net working capital on economic value added, with an inverse relationship between them, while the third explored a significant effect of the corporate major shareholders' ownership …
This work compares double informative priors assumed for the reliability function of the Pareto type I distribution. To estimate the reliability function of the Pareto type I distribution by Bayes estimation, two different kinds of information are used; two different priors are selected for the parameter of the Pareto type I distribution. Three double priors are assumed: the chi-squared–gamma, the gamma–Erlang, and the Erlang–exponential distributions. The estimators are derived under the squared error loss function with the different double priors. The simulation technique is then used to compare the performance of these estimators.
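For orientation, using the standard parameterisation (the paper's notation may differ): if X follows a Pareto type I distribution with scale \alpha and unknown shape \theta, so that f(x) = \theta \alpha^{\theta} / x^{\theta+1} for x \ge \alpha, the reliability function is

R(t) = \Pr(X > t) = \left(\frac{\alpha}{t}\right)^{\theta}, \qquad t \ge \alpha,

and under the squared error loss the Bayes estimator is the posterior mean, \hat{R}(t) = E\!\left[(\alpha/t)^{\theta} \mid \text{data}\right] = \int_0^{\infty} (\alpha/t)^{\theta}\, \pi(\theta \mid \text{data})\, d\theta, evaluated with each of the assumed double priors in turn.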
We present a study of the estimation of the reliability of the Exponential distribution based on the Bayesian approach. In the Bayesian approach, the parameter of the Exponential distribution is assumed to be a random variable. We derive Bayes estimators of the reliability under four types of prior distribution for the scale parameter of the Exponential distribution: the Inverse Chi-square …
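As a brief sketch in standard notation (the paper's parameterisation may differ): for an Exponential distribution with scale parameter \theta, the reliability function is

R(t) = e^{-t/\theta},

and under the squared error loss, for example, the Bayes estimator is the posterior mean, \hat{R}(t) = E\!\left[e^{-t/\theta} \mid \text{data}\right] = \int_0^{\infty} e^{-t/\theta}\, \pi(\theta \mid \text{data})\, d\theta, with the posterior \pi(\theta \mid \text{data}) obtained from the chosen prior, such as the Inverse Chi-square prior mentioned above.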