This paper introduces a non-conventional approach with multi-dimensional random sampling to solve a cocaine abuse model with statistical probability. The mean Latin hypercube finite difference (MLHFD) method is proposed for the first time as a hybrid of the classical numerical finite difference (FD) formula and the Latin hypercube sampling (LHS) technique, creating a random distribution for the model parameters, which depend on the time t. The LHS technique enables the MLHFD method to vary the parameter values rapidly across a number of multidimensional simulations (100, 1000 and 5000). The generated Latin hypercube sample, which is random (non-deterministic) in nature, is then integrated with the FD method to complete one cycle of the LHS-FD simulation iteration. This process is repeated until n final LHS-FD iterations are obtained. The means of these n final solutions (the MLHFD solutions) are tabulated, graphed and analyzed. The numerical simulation results of MLHFD for the SEIR model are presented side by side with deterministic solutions obtained from the classical FD scheme and from the homotopy analysis method with Pade approximation (HAM-Pade). The present MLHFD results are also compared with previous non-deterministic statistical estimations from 1995 to 2015. Good agreement between the two is observed, with small errors. The MLHFD method can be used to predict the future behavior, range and prediction interval of the epidemic model solutions. The expected profiles of the cocaine abuse subpopulations are projected until the year 2045. Both the statistical estimations and the deterministic results of FD and HAM-Pade are found to lie within the MLHFD prediction intervals for all the years and all the subpopulations considered.
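For a concrete picture of the MLHFD loop, the following is a minimal sketch, assuming a toy SEIR-type system (the removed compartment is omitted from the state), hypothetical parameter ranges, and a forward-Euler finite-difference step; none of these values are taken from the paper.

```python
import numpy as np
from scipy.stats import qmc

def fd_solve(beta, gamma, t_end=50.0, dt=0.01):
    """Forward-Euler FD solution of a toy SEIR-type system (placeholder model)."""
    n = int(t_end / dt)
    S, E, I = 0.9, 0.05, 0.05          # assumed initial fractions
    out = np.empty(n)
    for k in range(n):
        dS = -beta * S * I
        dE = beta * S * I - gamma * E
        dI = gamma * E - 0.1 * I       # assumed removal rate 0.1
        S, E, I = S + dt * dS, E + dt * dE, I + dt * dI
        out[k] = I                     # track the abuser-type compartment
    return out

def mlhfd(n_sim=1000):
    # One Latin hypercube sample per simulation; each sample drives one FD run.
    sampler = qmc.LatinHypercube(d=2, seed=0)
    u = sampler.random(n_sim)
    # Scale the unit-cube samples to assumed parameter ranges.
    params = qmc.scale(u, l_bounds=[0.2, 0.05], u_bounds=[0.6, 0.3])
    runs = np.array([fd_solve(b, g) for b, g in params])
    # Mean solution plus a 95% prediction interval across the runs.
    return runs.mean(axis=0), np.percentile(runs, [2.5, 97.5], axis=0)

mean_I, pred_interval = mlhfd(100)     # repeat with 1000 and 5000 simulations
```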
Computer vision seeks to mimic the human visual system and plays an essential role in artificial intelligence. It is based on different signal preprocessing techniques; therefore, developing efficient techniques becomes essential to achieving fast and reliable processing. Various signal preprocessing operations have been used for computer vision, including smoothing, signal analysis, resizing, sharpening, and enhancement, to reduce unwanted distortions and to support segmentation and image feature improvement. For example, to reduce the noise in a disturbed signal, smoothing kernels can be used effectively. This is achieved by convolving the disturbed signal with smoothing kernels. In addition, orthogonal moments (OMs) are a crucial …
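As an illustration of the smoothing-by-convolution step described above, here is a minimal sketch; the Gaussian kernel width and the synthetic noisy signal are assumptions, not details from the paper.

```python
import numpy as np

def gaussian_kernel(size=9, sigma=1.5):
    """Discrete Gaussian smoothing kernel, normalized to preserve the signal mean."""
    x = np.arange(size) - size // 2
    k = np.exp(-x**2 / (2 * sigma**2))
    return k / k.sum()

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 500)
noisy = np.sin(2 * np.pi * 5 * t) + 0.3 * rng.standard_normal(t.size)

# Convolving the disturbed signal with the smoothing kernel suppresses noise.
smoothed = np.convolve(noisy, gaussian_kernel(), mode="same")
```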
In this paper, the experimentally obtained conditions for fusion splicing of photonic crystal fibers (PCFs) having large mode areas are reported. The physical mechanism of the splice loss and the microhole collapse property of the PCF were studied. By controlling the arc power and the arc time of a conventional electric-arc fusion splicer (FSM-60S), the minimum splice loss for fusing two conventional single-mode fibers (SMF-28), which have similar mode field diameters, was 0.00 dB. For splicing a PCF (LMA-10) to a conventional single-mode fiber (SMF-28), the loss increased due to the mode field mismatch.
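For intuition about the mode-field-mismatch contribution to the splice loss, a back-of-the-envelope sketch using the standard Gaussian-mode overlap formula follows; the example mode field diameters are illustrative assumptions, not measured values from the paper.

```python
import math

def mismatch_loss_db(mfd1_um, mfd2_um):
    """Splice loss (dB) between two Gaussian modes with the given mode field diameters."""
    w1, w2 = mfd1_um / 2, mfd2_um / 2
    return -20 * math.log10(2 * w1 * w2 / (w1**2 + w2**2))

print(mismatch_loss_db(10.4, 10.4))  # identical MFDs -> 0.00 dB, as in the SMF-SMF case
print(mismatch_loss_db(10.4, 8.0))   # mismatched MFDs -> nonzero loss, as in the PCF-SMF case
```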
In recent years, Bitcoin has become the most widely used blockchain platform in business and finance. The goal of this work is to find a viable prediction model that incorporates, and perhaps improves on, a combination of available models. Among the techniques utilized in this paper are exponential smoothing, ARIMA, artificial neural network (ANN) models, and prediction combination models. The study's most obvious discovery is that artificial intelligence models improve the results of compound prediction models. The second key discovery is that a robust combination forecasting model should be used, one that responds to the multiple fluctuations in the Bitcoin time series and improves the error. Based on the results, the prediction accuracy …
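A minimal sketch of the forecast-combination idea, assuming statsmodels for the exponential-smoothing and ARIMA components and a simple inverse-error-variance weighting; the ANN component is omitted for brevity, and the series below is synthetic, not the paper's Bitcoin data.

```python
import numpy as np
from statsmodels.tsa.holtwinters import ExponentialSmoothing
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(1)
price = np.cumsum(rng.standard_normal(300)) + 100   # synthetic stand-in series
train, test = price[:-10], price[-10:]

es = ExponentialSmoothing(train, trend="add").fit()
ar = ARIMA(train, order=(1, 1, 1)).fit()
f_es, f_ar = es.forecast(10), ar.forecast(10)

# Weight each model inversely to its in-sample residual variance.
w = np.array([1 / np.var(train - es.fittedvalues), 1 / np.var(ar.resid)])
w /= w.sum()
combined = w[0] * f_es + w[1] * f_ar
```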
In this paper, the use of a circular array antenna with an adaptive system, in conjunction with a modified Linearly Constrained Minimum Variance Beamforming (LCMVB) algorithm, is proposed to meet the requirement of angle-of-arrival (AOA) estimation in 2-D as well as the signal-to-noise ratio (SNR) of the estimated sources (three-dimensional, 3-D, estimation), rather than the interference cancellation the algorithm is usually used for. The proposed system was simulated, tested and compared with the modified Multiple Signal Classification (MUSIC) technique for 2-D estimation. The results show that the system exhibits astonishing performance, simultaneously estimating the 3-D parameters with accuracy approximately equivalent to the MUSIC technique (for estimating elevation and azimuth) …
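For context on the comparison baseline, here is a simplified sketch of MUSIC azimuth estimation with a uniform circular array; the array size, radius, source angles and noise level are assumed values, elevation is fixed so the scan stays one-dimensional, and the paper's modified LCMVB with full 3-D estimation is not reproduced here.

```python
import numpy as np
from scipy.signal import find_peaks

M, radius_wl = 8, 0.5                  # 8 elements, radius of half a wavelength
gamma = 2 * np.pi * np.arange(M) / M   # element angular positions on the circle

def steer(az_rad):
    """UCA steering vector at the given azimuth (elevation fixed at 90 degrees)."""
    return np.exp(1j * 2 * np.pi * radius_wl * np.cos(az_rad - gamma))

rng = np.random.default_rng(0)
true_az = np.deg2rad([40.0, 110.0])
snapshots = 200
S = rng.standard_normal((2, snapshots)) + 1j * rng.standard_normal((2, snapshots))
A = np.column_stack([steer(a) for a in true_az])
X = A @ S + 0.1 * (rng.standard_normal((M, snapshots))
                   + 1j * rng.standard_normal((M, snapshots)))

R = X @ X.conj().T / snapshots                 # sample covariance matrix
eigval, eigvec = np.linalg.eigh(R)
En = eigvec[:, :M - 2]                         # noise subspace (smallest eigenvalues)

az_grid = np.deg2rad(np.arange(0.0, 180.0, 0.5))
spectrum = np.array([1 / np.linalg.norm(En.conj().T @ steer(a))**2 for a in az_grid])
idx, _ = find_peaks(spectrum)
est_az = np.rad2deg(az_grid[idx[np.argsort(spectrum[idx])[-2:]]])  # ~40 and ~110 deg
```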
In this study, we investigate the run-length properties of EWMA charts with time-varying control limits and fast initial response (FIR) for monitoring the mean of a normal process with a known standard deviation, using a non-homogeneous Markov chain approach.
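A minimal sketch of the classical Brook-Evans Markov-chain approximation to the EWMA run length, using fixed asymptotic limits for brevity; the time-varying-limit and FIR variants studied here refine the same construction, and lam, L and the state count are assumed example values.

```python
import numpy as np
from scipy.stats import norm

lam, L, m = 0.2, 2.8, 101               # smoothing constant, limit width, states
sigma_z = np.sqrt(lam / (2 - lam))      # asymptotic EWMA standard deviation
h = L * sigma_z                         # control limits at +/- h
width = 2 * h / m
centers = -h + width * (np.arange(m) + 0.5)

def arl(mu=0.0):
    """ARL of the EWMA chart for X ~ N(mu, 1), starting from Z0 = 0."""
    Q = np.empty((m, m))
    for i, c in enumerate(centers):
        # Z_t lands in state j iff X falls in the corresponding interval.
        upper = (centers + width / 2 - (1 - lam) * c) / lam
        lower = (centers - width / 2 - (1 - lam) * c) / lam
        Q[i] = norm.cdf(upper, loc=mu) - norm.cdf(lower, loc=mu)
    # The ARL vector solves (I - Q) ARL = 1; read off the center state.
    arl_vec = np.linalg.solve(np.eye(m) - Q, np.ones(m))
    return arl_vec[m // 2]

print(arl(0.0))   # in-control ARL
print(arl(1.0))   # out-of-control ARL for a one-sigma mean shift
```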
Compression is the reduction in size of data in order to save space or transmission time. For data transmission, compression can be performed on just the data content or on the entire transmission unit (including header data), depending on a number of factors. In this study, we considered the application of an audio compression method using text coding, in which the audio file is converted to a text file to reduce the time needed to transfer the data over a communication channel. Approach: we proposed two coding methods, applied to optimize the solution by using CFG. Results: we tested our application using a 4-bit coding algorithm; the results of this method were not satisfactory, so we proposed a new approach to compress audio files …
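A minimal sketch of the nibble-to-character mapping suggested by the 4-bit coding step: each audio byte is split into two 4-bit values, each mapped to a printable symbol. The alphabet is an assumption, and the paper's CFG-based optimization stage is not reproduced here.

```python
ALPHABET = "ABCDEFGHIJKLMNOP"          # assumed alphabet: 16 symbols, one per 4-bit value

def audio_to_text(raw: bytes) -> str:
    """Map every audio byte to two text symbols (high nibble, then low nibble)."""
    return "".join(ALPHABET[b >> 4] + ALPHABET[b & 0x0F] for b in raw)

def text_to_audio(text: str) -> bytes:
    """Invert the mapping, recombining symbol pairs into bytes."""
    vals = [ALPHABET.index(c) for c in text]
    return bytes((vals[i] << 4) | vals[i + 1] for i in range(0, len(vals), 2))

sample = bytes([0x3F, 0xA0, 0x07])
assert text_to_audio(audio_to_text(sample)) == sample   # lossless round trip
```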
The purpose of this paper is to develop a hybrid conceptual model for building information modelling (BIM) adoption in facilities management (FM) through the integration of the task-technology fit (TTF) and unified theory of acceptance and use of technology (UTAUT) theories. The study also aims to identify the factors influencing BIM adoption and usage in FM, to identify gaps in the existing literature, and to provide a holistic picture of recent research on technology acceptance and adoption in the construction industry and the FM sector.
In earthquake engineering problems, uncertainty exists not only in the seismic excitations but also in the structure's parameters. This study investigates the influence of uncertainty in structural geometry, elastic modulus, mass density, and section dimensions on the stochastic earthquake response of a multi-story moment-resisting frame subjected to random ground motion. The north-south component of the 2012 Ali Gharbi earthquake, Iraq, is selected as the ground excitation. Using the power spectral density (PSD) function, the base motion of the two-dimensional finite element model of the moment-resisting frame is modified to account for random ground motion. The probabilistic study of the moment-resisting frame structure using stochastic finite element …
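A minimal sketch of generating random ground-motion realizations from a target PSD via the spectral-representation method; the Kanai-Tajimi parameters below are generic assumed values, not ones fitted to the Ali Gharbi record.

```python
import numpy as np

def kanai_tajimi(omega, s0=0.01, wg=15.0, zg=0.6):
    """Kanai-Tajimi PSD of ground acceleration (assumed example parameters)."""
    r2 = (omega / wg) ** 2
    return s0 * (1 + 4 * zg**2 * r2) / ((1 - r2) ** 2 + 4 * zg**2 * r2)

def ground_motion_sample(rng, duration=20.0, dt=0.01, n_freq=512, w_max=100.0):
    """One acceleration realization as a sum of cosines with random phases."""
    t = np.arange(0, duration, dt)
    dw = w_max / n_freq
    w = (np.arange(n_freq) + 0.5) * dw
    amp = np.sqrt(2 * kanai_tajimi(w) * dw)
    phase = rng.uniform(0, 2 * np.pi, n_freq)
    return t, (amp[:, None] * np.cos(w[:, None] * t + phase[:, None])).sum(axis=0)

rng = np.random.default_rng(0)
t, accel = ground_motion_sample(rng)    # one realization; repeat for statistics
```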
In the current study, a three-dimensional cavity filled with a porous medium saturated with fluid was investigated numerically. The problem configuration consists of insulated bottom and right walls and a left vertical wall maintained at constant temperatures at variable locations, using two discretized heaters. The fluid motion in the porous cavity was represented by the generalized model of the momentum equation. The thermophysical parameters of the present investigation include the local thermal equilibrium condition. Isotherms and streamlines were used to examine energy and momentum transport. The effects of the varying parameters on the resulting average Nusselt number and on the temperature and velocity distributions are highlighted and discussed.
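As a small illustration of how such results are post-processed, the following sketch computes an average Nusselt number on the heated left wall from the wall temperature gradient; the grid, the placeholder temperature field, and the one-sided difference are illustrative assumptions, not the paper's solver.

```python
import numpy as np

ny = 41                                  # assumed number of wall nodes
y = np.linspace(0.0, 1.0, ny)
dx = 1.0 / 40                            # assumed uniform grid spacing

# Placeholder nondimensional temperatures: wall nodes and first interior nodes.
theta_wall = np.ones(ny)
theta_next = 1.0 - 2.0 * dx * (1.0 + 0.5 * np.sin(np.pi * y))

local_nu = -(theta_next - theta_wall) / dx   # local Nu = -d(theta)/dx at the wall
avg_nu = local_nu.mean()                     # average Nu over the heated wall height
```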