The current research presents a comparative analysis of the estimation of Meixner process parameters via the wavelet packet transform. In particular, it compares the method of moments with the wavelet packet estimator for the four parameters of the Meixner process. The research focuses on finding the best threshold value, using the square-root-log and modified square-root-log methods with wavelet packets in the presence of noise, to improve the efficiency and effectiveness of denoising for financial asset market signals. A simulation study compares the performance of moment estimation and wavelet packets across different sample sizes. The results show that wavelet packets perform best when paired with a robust thresholding strategy. The applicability of the proposed method is illustrated with a real-world application from the foreign exchange market, emphasizing the use of wavelet packets for parameter estimation and the potential for improvement in stochastic modeling and analysis.
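As an illustrative sketch of the thresholding idea (not the paper's exact estimator), the following pure-NumPy code runs a Haar wavelet packet decomposition and applies the "square-root-log" (universal) threshold, sigma * sqrt(2 log n), with soft thresholding. The Haar wavelet, the decomposition level, and all function names here are assumptions chosen for illustration.

```python
import numpy as np

def haar_step(x):
    # one level of the orthonormal Haar transform: approximation and detail
    a = (x[0::2] + x[1::2]) / np.sqrt(2)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)
    return a, d

def haar_packet(x, level):
    # wavelet *packet* tree: every band is split again at each level
    bands = [np.asarray(x, dtype=float)]
    for _ in range(level):
        nxt = []
        for b in bands:
            a, d = haar_step(b)
            nxt.extend([a, d])
        bands = nxt
    return bands

def inverse_haar_step(a, d):
    x = np.empty(a.size * 2)
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

def inverse_haar_packet(bands):
    while len(bands) > 1:
        bands = [inverse_haar_step(bands[i], bands[i + 1])
                 for i in range(0, len(bands), 2)]
    return bands[0]

def soft_threshold(c, t):
    return np.sign(c) * np.maximum(np.abs(c) - t, 0.0)

def denoise(signal, level=3):
    bands = haar_packet(signal, level)
    # noise scale estimated from the finest band via the MAD rule
    sigma = np.median(np.abs(bands[-1])) / 0.6745
    # universal ("square-root-log") threshold: sigma * sqrt(2 log n)
    t = sigma * np.sqrt(2 * np.log(len(signal)))
    # keep the lowest-frequency band, soft-threshold the rest
    bands = [bands[0]] + [soft_threshold(b, t) for b in bands[1:]]
    return inverse_haar_packet(bands)
```

The signal length must be divisible by 2**level; the forward/inverse pair is exact, so all error introduced comes from the thresholding itself.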
Researchers employ behavior-based malware detection models that track and analyze API-call features to identify suspicious PE applications. These behavioral models are more effective than signature-based malware detection systems at detecting unknown malware, because even simple polymorphic or metamorphic malware can easily defeat signature-based detection. The growing number of computer malware samples, and their detection, have concerned security researchers for a long time. The use of logic formulae to model malware behavior is one of the most promising recent developments in malware research, offering an alternative to classic virus-detection methods. To address the l
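To make the API-tracking idea concrete, here is a minimal sketch (not the paper's model) of matching a behavioral signature as an ordered subsequence of a recorded API-call trace; the API names and the example pattern are hypothetical illustrations of a common process-injection sequence.

```python
def matches_behavior(trace, pattern):
    """Return True if every call in `pattern` occurs in `trace`
    in the same order, with arbitrary other calls in between."""
    it = iter(trace)
    # `call in it` advances the iterator, enforcing the ordering
    return all(call in it for call in pattern)

# hypothetical behavioral signature for remote code injection
INJECTION_PATTERN = ["OpenProcess", "VirtualAllocEx",
                     "WriteProcessMemory", "CreateRemoteThread"]
```

A real detector would layer logic formulae over many such predicates; this only shows the trace-matching primitive.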
With the escalation of cybercriminal activities, the demand for forensic investigations into these crimes has grown significantly. However, the concept of systematic pre-preparation for potential forensic examinations during the software design phase, known as forensic readiness, has only recently gained attention. Against the backdrop of surging urban crime rates, this study aims to conduct a rigorous and precise analysis and forecast of crime rates in Los Angeles, employing advanced Artificial Intelligence (AI) technologies. This research amalgamates diverse datasets encompassing crime history, various socio-economic indicators, and geographical locations to attain a comprehensive understanding of how crimes manifest within the city. Lev
Biomass has been extensively investigated because of its potential as a clean energy source. Tar and particulate formation remain the major challenges in development, especially in integrating gasification technologies into today's energy-supply systems. The Laser-Induced Fluorescence (LIF) spectroscopy method is used to determine the aromatic and Polycyclic Aromatic Hydrocarbon (PAH) species produced in high-temperature gasification. The effect of tar deposition when the gases are cooled has been greatly reduced by introducing a new measurement-cell concept. Samples of the PAH components were prepared to the standard concentrations of the measured PAHs using a gas chromatograph (GC). An OPO laser with tun
This research reviews the least-absolute-deviations method, based on linear programming, for estimating the parameters of the simple linear regression model, and gives an overview of this model. We model the absolute deviations using a proposed measure of dispersion and construct a simple linear regression model based on the proposed measure. The aim of the work is to obtain estimates that are not affected by abnormal (outlying) values, using a numerical method with as few iterations as possible.
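The outlier-resistance of least absolute deviations can be illustrated with a short numerical sketch. Note the caveat: the abstract's method uses linear programming, while this illustration substitutes iteratively reweighted least squares (IRLS), a common numerical surrogate for the same L1 objective; all names here are hypothetical.

```python
import numpy as np

def lad_fit(x, y, iters=100, eps=1e-8):
    """Fit y ≈ b0 + b1*x by minimizing the sum of absolute residuals,
    via IRLS: weighted least squares with weights 1/|residual|."""
    X = np.column_stack([np.ones_like(x), x])
    beta = np.linalg.lstsq(X, y, rcond=None)[0]  # OLS starting point
    for _ in range(iters):
        r = y - X @ beta
        w = 1.0 / np.maximum(np.abs(r), eps)     # downweight large residuals
        beta = np.linalg.solve(X.T @ (X * w[:, None]), X.T @ (w * y))
    return beta
```

Because a gross outlier gets weight 1/|r|, it barely influences the fit, unlike ordinary least squares where its influence grows with |r|.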
The author's keenness to produce this book stemmed from a firm conviction that the field of evaluation and measurement needs a modern scientific book that presents the tools of testing and measurement, namely validity and reliability, and that is characterized by clarity in expressing the concepts, terminology, and types of each, so that it serves as a simplified resource in the hands of professors, researchers, and graduate (master's and doctoral) students for deriving the validity and reliability of tests and scales using advanced statistical methods through the use of the progra
In this study, we use the Bayesian method to estimate the scale parameter of the normal distribution, considering three different prior distributions: the square-root inverted-gamma (SRIG) distribution, a non-informative prior, and the natural conjugate family of priors. The Bayesian estimation is based on the squared-error loss function, and we compare it with classical methods for estimating the scale parameter of the normal distribution, such as maximum likelihood estimation and th
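As a minimal sketch of the comparison (not the paper's SRIG prior), the conjugate inverted-gamma prior for the normal variance with known mean gives a closed-form posterior, and under squared-error loss the Bayes estimate is the posterior mean; the function name and hyperparameter defaults below are assumptions for illustration.

```python
import numpy as np

def scale_estimates(x, mu=0.0, alpha=2.0, beta=1.0):
    """MLE of the normal variance (known mean mu) versus the Bayes
    estimate under an inverted-gamma IG(alpha, beta) prior.
    Posterior: IG(alpha + n/2, beta + S/2), and under squared-error
    loss the Bayes estimate is the posterior mean."""
    n = x.size
    s = np.sum((x - mu) ** 2)
    mle = s / n
    bayes = (beta + s / 2) / (alpha + n / 2 - 1)
    return mle, bayes
```

For large n the two estimates converge, since the likelihood swamps the prior; they differ mainly in small samples.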
In this paper, we use the maximum likelihood method and the Bayesian method to estimate the shape parameter (θ) and the reliability function (R(t)) of the Kumaraswamy distribution with two parameters λ, θ (assuming the exponential distribution, the chi-squared distribution, and the Erlang-2 distribution as prior distributions). In addition, we use the method of moments to estimate the parameters of the prior distributions. Bayes
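When λ is treated as known, the MLE of the Kumaraswamy shape θ has a closed form: since -log(1 - X^λ) is exponential with rate θ, setting the score to zero gives θ̂ = -n / Σ log(1 - x_i^λ). The sketch below (function names are illustrative, not the paper's code) samples by inverse CDF and recovers θ.

```python
import numpy as np

def kumaraswamy_sample(n, lam, theta, rng):
    # inverse-CDF sampling from F(x) = 1 - (1 - x**lam)**theta on (0, 1)
    u = rng.uniform(size=n)
    return (1 - (1 - u) ** (1 / theta)) ** (1 / lam)

def theta_mle(x, lam):
    # closed-form MLE for theta with lam known:
    # d/dθ log L = n/θ + Σ log(1 - x**lam) = 0
    return -x.size / np.sum(np.log1p(-x ** lam))
```

`log1p(-x**lam)` computes log(1 - x^λ) stably for observations near 0.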
Most available methods for synthetic unit hydrograph (SUH) derivation involve manual, subjective fitting of a hydrograph through a few data points. The use of probability distributions for the derivation of synthetic hydrographs has received much attention because of their similarity to unit hydrograph properties. In this paper, the use of two flexible probability distributions is presented. For each distribution, the unknown parameters were derived in terms of the time to peak (tp) and the peak discharge (Qp). A simple Matlab program was prepared to calculate these parameters, and their validity was checked by comparison with field data. Application to field data shows that the gamma and lognormal distributions fit well.
The increasing availability of computing power in the past two decades has been used to develop new techniques for optimizing the solution of estimation problems. Today's computational capacity and the widespread availability of computers have enabled the development of a new generation of intelligent computing techniques, such as the algorithm of interest here. This paper presents one of a new class of stochastic search algorithms, known as the Canonical Genetic Algorithm (CGA), for optimizing the maximum likelihood function. The strategy is composed of three main steps: recombination, mutation, and selection. The experimental design is based on simulating the CGA with different values, and the results are compared with those of the moment method. Based on the MSE value obtained from bot
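The three steps named above (selection, recombination, mutation) can be sketched as a minimal real-coded GA maximizing a log-likelihood. This is a generic illustration, not the paper's CGA: tournament selection, arithmetic (blend) crossover, Gaussian mutation, and all parameter defaults are assumptions.

```python
import numpy as np

def cga_maximize(loglik, bounds, pop=60, gens=120, pm=0.2, seed=0):
    """Maximize loglik over a scalar parameter with a simple real-coded GA."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, size=pop)           # initial population
    for _ in range(gens):
        fit = np.array([loglik(v) for v in x])
        # selection: binary tournaments keep the fitter of random pairs
        i, j = rng.integers(0, pop, size=(2, pop))
        parents = np.where(fit[i] > fit[j], x[i], x[j])
        # recombination: arithmetic blend of shuffled parent pairs
        mates = rng.permutation(parents)
        w = rng.uniform(0, 1, size=pop)
        x = w * parents + (1 - w) * mates
        # mutation: Gaussian perturbation with probability pm
        mut = rng.random(pop) < pm
        x = np.clip(x + mut * rng.normal(0, 0.05 * (hi - lo), pop), lo, hi)
    fit = np.array([loglik(v) for v in x])
    return x[np.argmax(fit)]
```

For a normal-mean likelihood the optimum is the sample mean, which makes a convenient correctness check; an MSE comparison against the moment estimator would repeat this over many simulated samples.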