This paper introduces a non-conventional approach that uses multi-dimensional random sampling to solve a cocaine abuse model with statistical probability. The mean Latin hypercube finite difference (MLHFD) method is proposed for the first time as a hybrid of the classical numerical finite difference (FD) formula and the Latin hypercube sampling (LHS) technique, which creates a random distribution for the model parameters that depend on time [Formula: see text]. The LHS technique gives the MLHFD method the advantage of varying the parameter values rapidly across a number of multidimensional simulations (100, 1000 and 5000). The generated Latin hypercube sample, which is random (non-deterministic) in nature, is then fed into the FD method to complete one LHS-FD simulation iteration. This process is repeated until [Formula: see text] final LHS-FD iterations are obtained. The means of these [Formula: see text] final solutions (the MLHFD solutions) are tabulated, graphed and analyzed. The numerical simulation results of MLHFD for the SEIR model are presented side by side with deterministic solutions obtained from the classical FD scheme and from the homotopy analysis method with Padé approximation (HAM-Padé). The MLHFD results are also compared with previous non-deterministic statistical estimations from 1995 to 2015; good agreement between the two is observed, with small errors. The MLHFD method can be used to predict the future behavior, range and prediction interval of the epidemic model solutions. The expected profiles of the cocaine abuse subpopulations are projected until the year 2045. Both the statistical estimations and the deterministic results of FD and HAM-Padé are found to lie within the MLHFD prediction intervals for all years and all subpopulations considered.
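The MLHFD cycle described above (sample the parameters with LHS, solve the model with FD, repeat, then average) can be sketched in miniature. This is an illustrative sketch rather than the paper's actual model: the two-compartment dynamics, the parameter bounds and the 100-sample design below are assumptions chosen purely for demonstration.

```python
import random
import statistics

def latin_hypercube(n_samples, bounds, seed=0):
    """One LHS design: for each parameter, draw one value from each of
    n_samples equal-probability strata, then shuffle the strata order
    so the columns pair up randomly across dimensions."""
    rng = random.Random(seed)
    columns = []
    for lo_b, hi_b in bounds:
        width = (hi_b - lo_b) / n_samples
        col = [lo_b + (i + rng.random()) * width for i in range(n_samples)]
        rng.shuffle(col)
        columns.append(col)
    return list(zip(*columns))  # n_samples tuples of parameter values

def fd_solve(beta, gamma, steps=200, dt=0.05, s0=0.99, i0=0.01):
    """Explicit finite-difference (forward Euler) march of a minimal
    two-compartment model: ds/dt = -beta*s*i, di/dt = beta*s*i - gamma*i."""
    s, i = s0, i0
    for _ in range(steps):
        s, i = s - dt * beta * s * i, i + dt * (beta * s * i - gamma * i)
    return i  # final size of the second compartment

# Mean-LHS-FD: one FD solve per LHS sample, then take the mean and range.
bounds = [(0.8, 1.2), (0.1, 0.3)]  # assumed ranges for beta and gamma
samples = latin_hypercube(100, bounds)
finals = [fd_solve(b, g) for b, g in samples]
mean_i = statistics.mean(finals)
low, high = min(finals), max(finals)  # crude prediction range
```

The mean over the ensemble is the "M" in MLHFD; the spread of the ensemble is what yields the prediction intervals discussed in the abstract.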
Many of the key-stream generators used in practice are LFSR-based in the sense that they produce the key stream according to a rule y = C(L(x)), where L(x) denotes an internal linear bit stream produced by a small number of parallel linear feedback shift registers (LFSRs), and C denotes some nonlinear compression function. In this paper we combine the output sequences of the linear feedback shift registers with the sequence from a nonlinear key generator to obtain a final, very strong key sequence.
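The rule y = C(L(x)) can be illustrated with a minimal sketch: a few Fibonacci LFSRs run in parallel and feed a nonlinear combiner. The register lengths, tap positions and the majority-vote choice of C below are illustrative assumptions, not the construction analyzed in the paper.

```python
def lfsr(state, taps, n):
    """Fibonacci LFSR: emit n bits from an initial state (list of bits),
    feeding back the XOR of the tapped positions each clock."""
    state = list(state)
    out = []
    for _ in range(n):
        out.append(state[-1])          # output the last stage
        fb = 0
        for t in taps:
            fb ^= state[t]
        state = [fb] + state[:-1]      # shift, insert feedback at the front
    return out

def combine(streams):
    """Hypothetical nonlinear compression C: bitwise majority vote over
    the parallel LFSR outputs, a classic nonlinear combiner."""
    return [1 if sum(bits) > len(bits) // 2 else 0 for bits in zip(*streams)]

# Three short LFSRs of pairwise-coprime length feed the combiner (toy sizes).
s1 = lfsr([1, 0, 0], taps=[0, 1], n=16)
s2 = lfsr([1, 0, 0, 1], taps=[0, 3], n=16)
s3 = lfsr([1, 0, 1, 0, 0], taps=[0, 2], n=16)
key_stream = combine([s1, s2, s3])
```

A real design would use much longer registers with primitive feedback polynomials and a combiner chosen for high nonlinearity and correlation immunity.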
Abstract
The problem of missing data represents a major obstacle for researchers during data analysis, since it recurs in all fields of study, including social, medical, astronomical and clinical experiments.
The presence of missing data may negatively influence the analysis and lead to misleading conclusions, driven by the large bias the problem introduces. Although wavelet methods are efficient, they too are affected by missing data, in addition to the resulting loss of estimation accuracy.
This paper proposes two hybrid feature subset selection approaches based on the combination (union or intersection) of both supervised and unsupervised filter approaches before using a wrapper, aiming to obtain low-dimensional features with high accuracy and interpretability and low time consumption. Experiments with the proposed hybrid approaches have been conducted on seven high-dimensional feature datasets. The classifiers adopted are support vector machine (SVM), linear discriminant analysis (LDA) and K-nearest neighbour (KNN). Experimental results have demonstrated the advantages and usefulness of the proposed methods for feature subset selection in high-dimensional space in terms of the number of selected features and time spent.
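The filter-combination stage can be sketched as follows, with hypothetical filter scores standing in for real supervised (label-relevance) and unsupervised (e.g. variance) filters: the union keeps any feature ranked highly by either filter, while the intersection keeps only features both agree on, and a wrapper would then search the reduced set.

```python
def top_k(scores, k):
    """Indices of the k highest-scoring features under one filter."""
    ranked = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    return set(ranked[:k])

# Hypothetical filter scores for 8 features (not from the paper's datasets):
supervised = [0.9, 0.1, 0.8, 0.2, 0.7, 0.05, 0.6, 0.3]    # relevance to labels
unsupervised = [0.2, 0.9, 0.7, 0.1, 0.8, 0.6, 0.05, 0.4]  # e.g. variance

k = 4
union_set = top_k(supervised, k) | top_k(unsupervised, k)
intersection_set = top_k(supervised, k) & top_k(unsupervised, k)
# A wrapper stage (e.g. forward selection scored by SVM/LDA/KNN accuracy)
# would then search only within union_set or intersection_set.
```

The trade-off the paper exploits is visible even here: the intersection is smaller and cheaper for the wrapper to search, while the union is safer against either filter discarding a relevant feature.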
Sampling is the selection of a representative portion of a material, and it is as important as testing. The minimum weight of a gravel field or laboratory sample depends on the nominal maximum particle size, and the sample weight will always be greater than the portion required for testing. The approximate precision desired for the testing controls the weight of the gravel sample. In this study, a gravel sample has been simulated using a multilinear approximation of Fuller's curve on the logarithmic scale. Gravel particles are divided into classes according to their medium diameter, and each class was simulated separately. A stochastic analysis, using 100 realizations in s
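A multilinear approximation of Fuller's maximum-density curve on a logarithmic diameter scale can be sketched as below. The sieve sizes, the maximum diameter of 37.5 mm and the exponent n = 0.5 are illustrative assumptions, not the values used in the study.

```python
import math

def fuller_passing(d, d_max, n=0.5):
    """Fuller's ideal gradation: percent passing sieve size d (mm)."""
    return 100.0 * (d / d_max) ** n

# Nodes of a piecewise-linear (multilinear) approximation of the curve,
# one node per sieve size, on a log10(diameter) scale.
sieves = [0.6, 1.18, 2.36, 4.75, 9.5, 19.0, 37.5]  # mm, illustrative series
nodes = [(math.log10(d), fuller_passing(d, d_max=37.5)) for d in sieves]

def passing_interp(d):
    """Percent passing from linear interpolation between the log-scale nodes."""
    x = math.log10(d)
    for (x0, y0), (x1, y1) in zip(nodes, nodes[1:]):
        if x0 <= x <= x1:
            return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
    raise ValueError("diameter outside the approximated range")
```

Each straight segment between adjacent sieve sizes defines one particle class, which is what allows each class to be simulated separately.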
Fine aggregate (sand) is a necessary material in concrete construction; it is naturally available and widely used around the world for different parts of construction in any building, mainly for filling the voids between gravel. Sand gradation is important for different composite materials, and fine sand gives good cohesion compared with coarse sand, which provides strength for the building. Therefore, sand must be tested before it is used and mixed with other building materials in construction, and the specimen must be selected carefully to represent the real material in the field. The specimen weight must be larger than the weight required for the test. When t
The unconventional techniques called "quick-look techniques" have been developed to present well-log data calculations so that they may be scanned easily to identify the zones that warrant a more detailed analysis. Generated by service companies at the well site, these techniques are among the most useful, providing the elements of information needed to make decisions quickly when time is of the essence. The techniques used in this paper are:
- Apparent resistivity Rwa
- Rxo/Rt
The above two methods have been used to evaluate the Nasiriyah oil field formations (well NS-3) to discover the hydrocarbon-bearing formations. A compu
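The apparent water resistivity quick-look can be sketched with Archie's relation, where Rwa = Rt/F and the formation factor F = a/phi^m. The zone readings and the 3×Rw flagging rule of thumb below are illustrative assumptions, not data or criteria from well NS-3.

```python
def apparent_water_resistivity(rt, porosity, a=1.0, m=2.0):
    """Quick-look Rwa = Rt / F, with Archie formation factor F = a / phi^m."""
    f = a / porosity ** m
    return rt / f

# Hypothetical zone readings: (depth in m, deep resistivity Rt in ohm.m, porosity)
zones = [(2100, 2.0, 0.25), (2110, 30.0, 0.20), (2120, 1.5, 0.22)]
rwa = [(depth, apparent_water_resistivity(rt, phi)) for depth, rt, phi in zones]

# Rule of thumb: the minimum Rwa in the interval approximates Rw (a water
# zone); zones with Rwa well above that minimum (often > ~3x) warrant the
# more detailed analysis the quick-look is meant to prioritize.
rw_est = min(value for _, value in rwa)
flagged = [depth for depth, value in rwa if value > 3 * rw_est]
```

In this toy interval only the high-resistivity zone stands out against the estimated Rw, which is exactly the kind of fast screening the abstract describes.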
The aim of this study is to develop a novel framework for managing risks in smart supply chains by enhancing business continuity and resilience against potential disruptions. This research addresses the growing uncertainty in supply chain environments, driven both by natural phenomena, such as pandemics and earthquakes, and by human-induced events, including wars, political upheavals and societal transformations. Recognizing that traditional risk management approaches are insufficient in such dynamic contexts, the study proposes an adaptive framework that integrates proactive and remedial measures for effective risk mitigation. A fuzzy risk matrix is employed to assess and analyze uncertainties, facilitating the identification of disruptions.
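A fuzzy risk matrix of the kind mentioned can be sketched with triangular membership functions. The 0-10 scales, the three levels and the max-label rule base below are illustrative assumptions, not the framework's actual design.

```python
def triangular(x, a, b, c):
    """Triangular membership function with feet a, c and peak b (a <= b <= c)."""
    if x < a or x > c:
        return 0.0
    if x == b:
        return 1.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Assumed fuzzy levels for likelihood and impact, both on a 0-10 scale.
LEVELS = {"low": (0, 0, 5), "medium": (2, 5, 8), "high": (5, 10, 10)}
ORDER = ["low", "medium", "high"]

def fuzzy_risk(likelihood, impact):
    """Min-rule fuzzy risk matrix: fire every (likelihood, impact) rule pair,
    keep the pair with the strongest joint membership, and map it to a risk
    level with a simple rule base that takes the higher of the two labels."""
    best_level, best_degree = None, -1.0
    for l_name, l_mf in LEVELS.items():
        for i_name, i_mf in LEVELS.items():
            degree = min(triangular(likelihood, *l_mf), triangular(impact, *i_mf))
            level = ORDER[max(ORDER.index(l_name), ORDER.index(i_name))]
            if degree > best_degree:
                best_level, best_degree = level, degree
    return best_level
```

The fuzzy sets let an assessed event belong partially to adjacent likelihood and impact categories, which is what distinguishes this matrix from a crisp lookup table.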
Objective: The present study aims to assess the stressful life events of patients with substance abuse in Baghdad City.
Methodology: A descriptive study was carried out at Baghdad Teaching Hospital and Ibn-Rushed Psychiatric Hospital.
Starting from the 1st of December 2012 to the 3rd of July 2013, a non-probability (purposive) sample of 64 patients diagnosed with substance abuse was studied. The data were collected through semi-structured interviews using a questionnaire consisting of three parts: sociodemographic data, medical information, and a life-events scale of 49 items distributed across six domains, including the family and social domain, the health domain, the security, legal and criminal domain, the work and school domain