This paper introduces a non-conventional, multi-dimensional random-sampling approach to solving a cocaine abuse model with statistical probability. The mean Latin hypercube finite difference (MLHFD) method is proposed for the first time as a hybrid of the classical numerical finite difference (FD) formula and the Latin hypercube sampling (LHS) technique, which generates a random distribution for the model parameters that depend on time t. LHS gives the MLHFD method the advantage of varying the parameter values rapidly across a number of multidimensional simulations (100, 1000 and 5000). The generated Latin hypercube sample, which is random (non-deterministic) in nature, is then fed into the FD method to complete one LHS-FD simulation cycle. This process is repeated until N final LHS-FD iterations are obtained. The means of these N final solutions (the MLHFD solutions) are tabulated, graphed and analyzed. The MLHFD simulation results for the SEIR model are presented side by side with the deterministic solutions obtained from the classical FD scheme and from the homotopy analysis method with Padé approximation (HAM-Padé). The MLHFD results are also compared with previous non-deterministic statistical estimations from 1995 to 2015; good agreement between the two is observed, with small errors. The MLHFD method can be used to predict the future behavior, range and prediction interval of the epidemic model solutions. The expected profiles of the cocaine abuse subpopulations are projected up to the year 2045. Both the statistical estimations and the deterministic FD and HAM-Padé results are found to lie within the MLHFD prediction intervals for all the years and all the subpopulations considered.
Sampling is the selection of a representative portion of a material, and it is as important as testing. The minimum weight of a gravel field or laboratory sample depends on the nominal maximum particle size. The weight of the sample will always be greater than the portion required for testing, and the approximate precision desired for the testing controls the weight of the gravel sample. In this study, a gravel sample has been simulated using a multilinear approximation of Fuller's curve on the logarithmic scale. Gravel particles are divided into classes according to their medium diameter, and each class is simulated separately. A stochastic analysis, by using 100 realizations in s
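The multilinear approximation can be sketched as piecewise-linear interpolation of percent passing against log-diameter. The exponent n = 0.5 is the classical Fuller–Thompson value, assumed here; the node diameters are illustrative, not the study's actual sieve set.

```python
import math

def fuller_percent_passing(d, d_max, n=0.5):
    # Fuller-Thompson ideal grading curve: P = 100 * (d / d_max)^n.
    return 100.0 * (d / d_max) ** n

def multilinear_log_approx(d, nodes):
    # Piecewise-linear interpolation of percent passing versus log10(diameter).
    # nodes = [(diameter, percent_passing), ...], sorted by diameter (assumed).
    x = math.log10(d)
    for (d0, p0), (d1, p1) in zip(nodes, nodes[1:]):
        x0, x1 = math.log10(d0), math.log10(d1)
        if x0 <= x <= x1:
            return p0 + (p1 - p0) * (x - x0) / (x1 - x0)
    raise ValueError("diameter outside tabulated range")

# Illustrative nodes: approximate Fuller's curve for d_max = 16 mm.
nodes = [(d, fuller_percent_passing(d, 16.0)) for d in (1, 2, 4, 8, 16)]
```

Because the interpolation is exact at the nodes, refining the node set drives the multilinear curve toward Fuller's curve.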
Many of the key stream generators used in practice are LFSR-based in the sense that they produce the key stream according to a rule y = C(L(x)), where L(x) denotes an internal linear bit stream produced by a small number of parallel linear feedback shift registers (LFSRs), and C denotes some nonlinear compression function. In this paper we combine the output sequences from the linear feedback shift registers with the sequence from a nonlinear key generator to obtain the final, very strong key sequence.
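The y = C(L(x)) structure can be sketched as follows. The taps, seeds, and the particular Boolean combining function below are hypothetical stand-ins for illustration; the paper's actual C and register configuration are not specified here.

```python
def lfsr(state, taps, nbits):
    # Fibonacci-style LFSR: output the low bit, feed back the XOR of tap bits.
    while True:
        out = state & 1
        fb = 0
        for t in taps:
            fb ^= (state >> t) & 1
        state = (state >> 1) | (fb << (nbits - 1))
        yield out

def combined_keystream(n, seeds=(0b1011, 0b11001, 0b100111)):
    # Hypothetical nonlinear compression C(x1, x2, x3) = x1*x2 XOR x2*x3 XOR x3
    # applied to three parallel LFSRs (register lengths and taps illustrative).
    a = lfsr(seeds[0], (0, 1), 4)
    b = lfsr(seeds[1], (0, 2), 5)
    c = lfsr(seeds[2], (0, 1), 6)
    ks = []
    for _ in range(n):
        x1, x2, x3 = next(a), next(b), next(c)
        ks.append((x1 & x2) ^ (x2 & x3) ^ x3)
    return ks
```

A real design would use maximal-length feedback polynomials and a combining function chosen for high nonlinearity and correlation immunity, which is exactly what the nonlinear stage is meant to provide.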
Abstract
The problem of missing data represents a major obstacle for researchers in the process of data analysis, since it is a recurrent problem in all fields of study, including social, medical, astronomical and clinical experiments.
The presence of such a problem in the data under study may affect the analysis negatively and lead to misleading conclusions, since those conclusions can carry a great bias caused by the missing values. Despite the efficiency of wavelet methods, they too are affected by missing data, in addition to the resulting loss of estimation accuracy.
In this work, a novel technique for obtaining accurate solutions to nonlinear problems, a multi-step combination with the Laplace-variational approach (MSLVIM), is introduced. Compared with the traditional variational iteration approach, it overcomes the usual difficulties and provides more accurate solutions with an extended convergence region covering larger intervals, giving a continuous representation of the approximate analytic solution and better information about the solution over the whole time interval. This technique also makes it easier to obtain the general Lagrange multiplier, reducing the time and calculations required. It converges rapidly to the exact solution with simply computable terms wit
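For context, the generic variational iteration correction functional (a standard sketch, not necessarily the authors' exact MSLVIM formulation) reads:

```latex
% Generic VIM correction functional (sketch; symbols are assumptions):
% \mathcal{L} = linear operator, \mathcal{N} = nonlinear operator,
% g = source term, \lambda = general Lagrange multiplier,
% \tilde{u}_n = restricted variation.
u_{n+1}(t) = u_n(t) + \int_{0}^{t} \lambda(s)\,
    \bigl[\mathcal{L}u_n(s) + \mathcal{N}\tilde{u}_n(s) - g(s)\bigr]\,\mathrm{d}s
```

In a multi-step variant, the interval $[0, T]$ is partitioned into subintervals $[t_{i-1}, t_i]$ and the iteration is restarted on each subinterval with the endpoint value of the previous one as its initial condition, which is what extends the convergence region to larger intervals.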
The problem of poverty and deprivation constitutes a humanitarian tragedy, and its continuation may threaten the political achievements reached by the state. In Iraq in particular, although it is one of the very rich countries owing to the availability of huge economic wealth, poverty indicators are still high. In addition, a main factor in the decline of the standard of living is the weakness of the government's performance in delivering the public services of water, electricity and sanitation. Thus, the human development index has been addressed, which expresses the achievements that the state can attain both at the physical level and at the human level, in order to put in place appropriate strategies and policies aimed at elimin
This paper proposes two hybrid feature subset selection approaches based on the combination (union or intersection) of supervised and unsupervised filter approaches before using a wrapper, aiming to obtain low-dimensional feature sets with high accuracy and interpretability and low time consumption. Experiments with the proposed hybrid approaches have been conducted on seven high-dimensional feature datasets. The classifiers adopted are support vector machine (SVM), linear discriminant analysis (LDA), and K-nearest neighbour (KNN). Experimental results have demonstrated the advantages and usefulness of the proposed methods for feature subset selection in high-dimensional spaces in terms of the number of selected features and time spe
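The union/intersection stage can be sketched with two toy filters. The specific filters below (a correlation-style supervised score and a variance-based unsupervised score) are illustrative assumptions; the paper's actual filter criteria may differ, and the wrapper stage (SVM/LDA/KNN search over the reduced set) is only indicated in a comment.

```python
def filter_scores_supervised(X, y):
    # Toy supervised filter: absolute Pearson-style correlation with the target.
    n = len(y)
    my = sum(y) / n
    scores = []
    for j in range(len(X[0])):
        col = [row[j] for row in X]
        mx = sum(col) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(col, y))
        vx = sum((a - mx) ** 2 for a in col) or 1e-12  # guard constant features
        vy = sum((b - my) ** 2 for b in y) or 1e-12
        scores.append(abs(cov) / (vx * vy) ** 0.5)
    return scores

def filter_scores_unsupervised(X):
    # Toy unsupervised filter: per-feature variance.
    n = len(X)
    scores = []
    for j in range(len(X[0])):
        col = [row[j] for row in X]
        m = sum(col) / n
        scores.append(sum((a - m) ** 2 for a in col) / n)
    return scores

def hybrid_select(X, y, k=2, mode="union"):
    # Union or intersection of the top-k features from each filter; a wrapper
    # (e.g. cross-validated SVM/LDA/KNN) would then search this reduced set.
    sup = filter_scores_supervised(X, y)
    uns = filter_scores_unsupervised(X)
    top = lambda s: set(sorted(range(len(s)), key=lambda j: -s[j])[:k])
    return (top(sup) | top(uns)) if mode == "union" else (top(sup) & top(uns))
```

The intersection variant trades recall for a smaller, more consensual feature set, which is the interpretability/time argument made above.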
The unconventional techniques called "quick look techniques" have been developed to present well log data calculations so that they may be scanned easily to identify the zones that warrant a more detailed analysis. These techniques, generated by service companies at the well site, are among the most useful, since they provide the elements of information needed for making decisions quickly when time is of the essence. The techniques used in this paper are:
- Apparent water resistivity (Rwa)
- Rxo/Rt
The above two methods have been used to evaluate the Nasiriyah oil field formations (well NS-3) to discover the hydrocarbon-bearing formations. A compu
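The Rwa quick-look can be sketched from Archie's relations: with formation factor F = a / phi^m, the apparent water resistivity is Rwa = Rt / F, and zones where Rwa greatly exceeds the known Rw are flagged for detailed analysis. The constants a = 1, m = 2 and the flagging ratio of 3 below are common rule-of-thumb assumptions, not values taken from the paper.

```python
def apparent_water_resistivity(rt, porosity, a=1.0, m=2.0):
    # Rwa = Rt / F, with Archie formation factor F = a / phi^m.
    # a and m are assumed Archie constants, not from the study.
    f = a / porosity ** m
    return rt / f

def flag_hydrocarbon_zones(zones, rw, ratio=3.0):
    # Quick-look rule of thumb: Rwa much greater than Rw suggests hydrocarbons;
    # Rwa close to Rw suggests a water-bearing zone.
    flagged = []
    for name, rt, phi in zones:  # zones = [(name, deep resistivity, porosity)]
        rwa = apparent_water_resistivity(rt, phi)
        flagged.append((name, rwa, rwa / rw >= ratio))
    return flagged
```

The Rxo/Rt method works analogously at the wellsite, comparing the flushed-zone to deep resistivity ratio against Rmf/Rw.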
In this paper, we studied the scheduling of jobs on a single machine. Each of n jobs is to be processed without interruption and becomes available for processing at time zero. The objective is to find a processing order of the jobs that minimizes the sum of maximum earliness and maximum tardiness. Because this problem minimizes both earliness and tardiness values, the model corresponds to a just-in-time production system. Our lower bound depends on the decomposition of the problem into two subproblems. We present a novel heuristic approach to find a near-optimal solution for the problem. This approach depends on finding efficient solutions for two problems. The first problem is minimizing total completi
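The objective function above is straightforward to evaluate for a given sequence; a minimal sketch, with a brute-force baseline for tiny instances (the paper's heuristic and lower bound target instances where enumeration is infeasible):

```python
from itertools import permutations

def max_earliness_tardiness(sequence, p, d):
    # Jobs run without interruption from time zero in the given order.
    # Earliness E_j = max(0, d_j - C_j); tardiness T_j = max(0, C_j - d_j).
    # Objective: max_j E_j + max_j T_j.
    t, e_max, t_max = 0, 0, 0
    for j in sequence:
        t += p[j]                      # completion time C_j
        e_max = max(e_max, d[j] - t)
        t_max = max(t_max, t - d[j])
    return e_max + t_max

def best_order_brute_force(p, d):
    # Exhaustive search baseline, only viable for very small n.
    return min(permutations(range(len(p))),
               key=lambda s: max_earliness_tardiness(s, p, d))
```

With p = [2, 3] and d = [2, 5], the order (0, 1) completes each job exactly at its due date, so the objective value is 0.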