A stochastic process {X_k, k = 1, 2, ...} is a doubly geometric stochastic process if there exist a ratio a > 0 and a positive function h(k) > 0 such that {a^k h(k) X_k; k = 1, 2, ...} forms a renewal process; it is a generalization of the geometric stochastic process. This process is stochastically monotone and can be used to model a point process with multiple trends. In this paper, we use nonparametric methods to investigate statistical inference for doubly geometric stochastic processes. A graphical technique for determining whether a process is in agreement with a doubly geometric stochastic process is proposed. Further, we can estimate the parameters a, b, μ and σ2 of the doubly geometric stochastic process by using least squares estimates for X_k and ln X_k, as well as the linear regression method, where μ and σ2 are the mean and variance of X_1, respectively. A real-world example is used to demonstrate the method and to evaluate the performance of the estimators. © 2021 DAV College. All rights reserved.
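As a rough illustration of the regression-based estimation this abstract describes, the sketch below simulates a doubly geometric process under the common choice h(k) = b^(k(k-1)/2) (the abstract's h(k) may differ) and recovers a and b by least squares on ln X_k. All numeric values and the lognormal choice for the renewal variables are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
a, b, mu = 1.05, 1.01, 10.0  # hypothetical true parameters

# If {a^k * h(k) * X_k} is a renewal process with i.i.d. terms Y_k, then
# X_k = Y_k / (a^k * h(k)); here h(k) = b^(k(k-1)/2) is assumed.
k = np.arange(1, 201)
Y = rng.lognormal(mean=np.log(mu), sigma=0.1, size=k.size)
X = Y / (a**k * b**(k * (k - 1) / 2))

# ln X_k = ln Y_k - k ln a - (k(k-1)/2) ln b, so regress ln X_k on the
# regressors -k and -k(k-1)/2; the slopes recover ln a and ln b.
A = np.column_stack([np.ones_like(k, dtype=float), -k, -k * (k - 1) / 2])
coef, *_ = np.linalg.lstsq(A, np.log(X), rcond=None)
a_hat, b_hat = np.exp(coef[1]), np.exp(coef[2])
```

With 200 simulated epochs the log-linear fit pins down both ratios quite sharply, since the k and k(k-1)/2 regressors grow much faster than the noise.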
Kidney tumors are of various types with differing characteristics, and their diagnosis remains challenging in biomedicine. It is very important to detect and classify a tumor at an early stage so that appropriate treatment can be planned. Accurate estimation of kidney tumor volume is essential for clinical diagnosis and therapeutic decisions in renal disease. The main objective of this research is to use Computer-Aided Diagnosis (CAD) algorithms to support early detection of kidney tumors, addressing the challenges of accurate tumor volume estimation caused by extensive variation in kidney shape, size, and orientation across subjects.
In this paper, we have tried to implement an automated segmentation
The non-homogeneous Poisson process is a statistical subject of importance to other sciences, with wide application in areas such as queueing systems, repairable systems, computer and communication systems, and reliability theory, among others. It is also used to model phenomena that do not occur at a fixed rate over time (events whose rate changes with time).
This research covers some basic concepts related to the non-homogeneous Poisson process and applies two of its models, the power-law model and the Musa-Okumoto model, to estimate the
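As a minimal sketch of fitting the power-law model the abstract mentions, the code below computes the standard closed-form maximum-likelihood estimates for a time-truncated power-law (Crow-AMSAA) process with mean function m(t) = (t/θ)^β. The failure times are hypothetical and stand in for whatever data the study used.

```python
import math

# Hypothetical failure times (hours) observed up to T = 100 (time-truncated)
times = [5.2, 12.1, 25.7, 41.3, 48.9, 60.0, 73.4, 88.1]
T = 100.0
n = len(times)

# Closed-form MLEs for the power-law NHPP with m(t) = (t/theta)^beta
beta_hat = n / sum(math.log(T / t) for t in times)
theta_hat = T / n ** (1.0 / beta_hat)

# Estimated intensity (rate of occurrence of failures) at time T
lam_T = (beta_hat / theta_hat) * (T / theta_hat) ** (beta_hat - 1)
```

By construction the fitted mean function reproduces the observed count at T, i.e. (T/θ̂)^β̂ = n, which is a quick sanity check on the estimates.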
Low grade crude palm oil (LGCPO) is an attractive feedstock for biodiesel production due to its low cost and non-competition with food resources. Typically, LGCPO contains a high content of free fatty acids (FFA), making direct transesterification infeasible due to the saponification reaction. Esterification is the typical pre-treatment process to reduce the FFA content and to produce fatty acid methyl ester (FAME). The pre-treatment of LGCPO using two different acid catalysts, titanium oxysulphate sulphuric acid complex hydrate (TiOSH) and 5-sulfosalicylic acid dihydrate (5-SOCAH), was investigated for the first time in this study. The optimum conditions for the homogeneous catalyst (5-SOCAH) were
This study focused on the treatment of real wastewater rejected from the leather industry in Al-Nahrawan city, Iraq, by an Electrocoagulation (EC) process followed by a Reverse Osmosis (RO) process. The successive treatment was applied because of the high concentration of Cr3+ ions (about 1600 ppm) in the wastewater of this industry, and to run EC with moderate power consumption while obtaining better product-water quality. In the EC process, the effects of NaCl concentration (1.5 and 3 g/l), current density (C.D.) (15-25 mA/cm2), electrolysis time (1-2 h), and distance between electrodes (E.D.) (1-2 cm) were examined in a batch cell by implementing a Taguchi experimental design. According to the results obtained from multiple regression and signal-to-noise
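Taguchi designs are typically analyzed via a signal-to-noise (S/N) ratio per trial; since higher chromium removal is better, the "larger-the-better" form would apply. The sketch below computes that standard S/N ratio for one trial; the removal-efficiency values are hypothetical, not data from the study.

```python
import math

# Hypothetical Cr3+ removal efficiencies (%) from three repeats of one
# Taguchi trial; "larger-the-better" applies since higher removal is better.
runs = [92.5, 94.1, 91.8]

# Larger-the-better S/N ratio: S/N = -10 * log10( mean(1 / y^2) )
sn = -10 * math.log10(sum(1 / y**2 for y in runs) / len(runs))
```

Comparing this ratio across trials identifies the factor levels (NaCl, current density, time, electrode distance) that maximize removal while penalizing run-to-run variability.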
In some cases, researchers need to know the causal effect of a treatment, that is, the extent of its effect on the sample, in order to decide whether to continue the treatment or to stop it because it is of no use. The locally weighted least squares method was used to estimate the parameters of the fuzzy regression discontinuity model, and the local polynomial method was used to estimate the bandwidth. Data were generated with sample sizes (75, 100, 125, 150) and 1000 replications. An experiment was conducted at the Innovation Institute for remedial lessons in 2021 on 72 students participating in the institute, and data were collected. Those who received the treatment showed an increase in their scores after
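A standard sharp regression-discontinuity estimate with locally weighted least squares, in the spirit of the method the abstract names (though without the fuzzy component), can be sketched as follows. The cutoff, true effect, noise level, and triangular-kernel bandwidth are all hypothetical choices for this synthetic example.

```python
import numpy as np

rng = np.random.default_rng(1)
c = 0.0          # cutoff of the running variable
tau = 2.0        # true treatment effect (assumed, for the synthetic data)
x = rng.uniform(-1, 1, 500)
y = 1.0 + 0.5 * x + tau * (x >= c) + rng.normal(0, 0.2, x.size)

def local_linear_at_cutoff(x, y, side, h=0.3):
    """Weighted least squares fit of y on (x - c) on one side of the cutoff."""
    mask = (x >= c) if side == "right" else (x < c)
    xs, ys = x[mask], y[mask]
    w = np.clip(1 - np.abs(xs - c) / h, 0, None)  # triangular kernel weights
    W = np.diag(w)
    A = np.column_stack([np.ones_like(xs), xs - c])
    beta = np.linalg.solve(A.T @ W @ A, A.T @ W @ ys)
    return beta[0]  # intercept = fitted value at the cutoff

# Treatment effect = jump in the fitted regression at the cutoff
tau_hat = local_linear_at_cutoff(x, y, "right") - local_linear_at_cutoff(x, y, "left")
```

The effect is read off as the difference between the two one-sided local fits evaluated at the cutoff; the bandwidth h plays the role the abstract assigns to the local polynomial bandwidth estimator.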
Porosity and permeability are the most difficult properties to determine in subsurface reservoir characterization. The difficulty of estimating them arises, first, from the fact that porosity and permeability may vary significantly over the reservoir volume and can only be sampled at well locations. Secondly, porosity values are commonly evaluated from well log data, which are usually available for most wells in a reservoir, whereas permeability values, which are generally determined from core analysis, are not usually available. The aim of this study is, first, to develop correlations between the core and well log data that can be used to estimate permeability in uncored wells; these correlations enable estimation of reservoir permeability
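A common form for such a core-to-log correlation is a semi-log fit of permeability against porosity, which could be sketched as below. The core porosity and permeability values are hypothetical, and the abstract's actual correlation form may differ.

```python
import numpy as np

# Hypothetical core measurements: porosity (fraction) and permeability (mD)
phi_core = np.array([0.08, 0.12, 0.15, 0.18, 0.21, 0.25])
k_core   = np.array([0.5, 3.2, 12.0, 40.0, 150.0, 600.0])

# Classic poro-perm transform: fit log10(k) = a + b * phi on the core data
b, a = np.polyfit(phi_core, np.log10(k_core), 1)

def perm_from_log_porosity(phi):
    """Estimate permeability in uncored intervals from log-derived porosity."""
    return 10 ** (a + b * phi)
```

Once calibrated on cored intervals, the transform is applied to log-derived porosity in uncored wells to populate permeability across the reservoir.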
In this paper, we propose a method using continuous wavelets to study multivariate fractional Brownian motion through the deviations of the transformed random process, in order to find an efficient estimate of the Hurst exponent using eigenvalue regression of the covariance matrix. The simulation results show that the proposed estimator performs efficiently in terms of bias, but the variance increases as the signal changes from short to long memory, and the MASE increases accordingly. The estimation was carried out by computing the eigenvalues of the variance-covariance matrix of the Meyer continuous wavelet detail coefficients.
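The core scaling idea behind wavelet Hurst estimation can be illustrated with a much simpler stand-in than the paper's eigenvalue regression of Meyer-wavelet covariances: for fBm, the variance of wavelet detail coefficients grows as 2^(j(2H+1)) with level j, so a log-log regression across levels recovers H. The sketch below uses Haar details and standard Brownian motion (fBm with known H = 0.5) so the answer can be checked.

```python
import numpy as np

rng = np.random.default_rng(2)
# Standard Brownian motion is fBm with H = 0.5, so the true H is known here
B = np.cumsum(rng.normal(0, 1, 2**14))

# Haar details: Var(d_j) ~ 2^(j(2H+1)) for fBm, so the slope of
# log2 Var(d_j) against level j estimates 2H + 1
approx, log_var, levels = B, [], range(1, 9)
for j in levels:
    d = (approx[0::2] - approx[1::2]) / np.sqrt(2)    # detail coefficients
    approx = (approx[0::2] + approx[1::2]) / np.sqrt(2)
    log_var.append(np.log2(np.var(d)))

slope, _ = np.polyfit(list(levels), log_var, 1)
H_hat = (slope - 1) / 2
```

The estimate is slightly biased low at the coarsest levels (the scaling law is asymptotic in j), which echoes the abstract's point that estimator quality depends on the memory regime of the signal.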
A new approach is presented in this study to determine the optimal edge-detection threshold value. This approach is based on extracting small homogeneous blocks from targets with unequal means. From these blocks, a small image with known edges is generated (the edges being the lines between adjacent blocks), so these simulated edges can be taken as true edges. The true simulated edges are compared with the edges detected in the small generated image using different threshold values. The comparison is based on computing the mean square error between the simulated edge image and the edge image produced by the edge-detector methods. The mean square error is computed for the total edge image (Er), for the edge region
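The threshold-selection loop this abstract describes could be sketched as below: build a synthetic image from homogeneous blocks so the edge map is known, then pick the threshold minimizing the MSE between detected and true edges. The block means, noise level, and simple gradient detector are all illustrative assumptions, not the study's actual detector.

```python
import numpy as np

rng = np.random.default_rng(3)

# Small synthetic image from two homogeneous blocks with different means;
# the boundary between the blocks is then a known "true" edge map.
img = np.hstack([rng.normal(50, 2, (32, 16)), rng.normal(120, 2, (32, 16))])
true_edges = np.zeros_like(img, dtype=bool)
true_edges[:, 16] = True  # the known block boundary

def detect_edges(img, threshold):
    """Mark pixels whose horizontal gradient magnitude exceeds the threshold."""
    grad = np.abs(np.diff(img, axis=1, prepend=img[:, :1]))
    return grad > threshold

# Choose the threshold minimizing the MSE against the true edge map
candidates = np.arange(5, 100, 5)
errors = [np.mean((detect_edges(img, t).astype(float) - true_edges) ** 2)
          for t in candidates]
best_t = candidates[int(np.argmin(errors))]
```

Thresholds that are too low flag noise pixels as edges, and thresholds that are too high miss boundary pixels; the MSE criterion balances the two, as in the abstract's Er measure over the whole edge image.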