Exploring the B-Spline Transform for Estimating Lévy Process Parameters: Applications in Finance and Biomodeling
Letters in Biomathematics · Jul 7, 2025
This paper presents the B-spline transform as an effective and precise technique for estimating the key parameters of Lévy processes: drift, volatility, and jump intensity. Lévy processes are powerful tools for representing phenomena that combine continuous trends with abrupt changes. The proposed approach is validated through a simulated biological case study of animal migration, in which movements are modeled as Lévy flights with long-range jumps and directionally biased drift, a scenario that captures real-world stochastic behavior in the spatial dynamics of a species. The results demonstrate the B-spline method's capability to accommodate complex stochastic behavior with low mean squared error (MSE). To demonstrate its relevance in an actual financial context, the model is applied to forecast trends in Iraqi ATM usage based on data collected between 2008 and 2021. The results indicate uniform growth in demand, supported by forecasts for 2022 and 2023 that confirm the model's predictive accuracy. Overall, the research identifies the B-spline transform as a robust method for parameter estimation in Lévy-based models, with potential applications in finance, ecology, and biomathematics.
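The parameter-estimation setting of the abstract can be made concrete with a small simulation. The sketch below is *not* the paper's B-spline transform estimator; it simulates a jump-diffusion Lévy path and recovers drift, volatility, and jump intensity with naive moment/threshold estimators, purely to illustrate which quantities are being estimated. The function names and the 4-sigma jump-detection threshold are illustrative assumptions.

```python
import math
import random

def simulate_jump_diffusion(mu, sigma, lam, jump_sd, n_steps, dt, seed=1):
    """Simulate increments of a jump-diffusion Levy process:
    dX = mu*dt + sigma*dW + compound-Poisson jumps with N(0, jump_sd^2) sizes."""
    rng = random.Random(seed)
    increments = []
    for _ in range(n_steps):
        dx = mu * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        # Draw the Poisson(lam*dt) number of jumps in this step (inversion method)
        k, p, u = 0, math.exp(-lam * dt), rng.random()
        cum = p
        while u > cum:
            k += 1
            p *= lam * dt / k
            cum += p
        for _ in range(k):
            dx += rng.gauss(0.0, jump_sd)
        increments.append(dx)
    return increments

def estimate_parameters(increments, dt, sigma_guess):
    """Naive threshold estimators: increments beyond 4*sigma*sqrt(dt) are
    flagged as jumps; volatility comes from the remaining 'diffusive' ones."""
    n = len(increments)
    thr = 4.0 * sigma_guess * math.sqrt(dt)
    small = [dx for dx in increments if abs(dx) <= thr]
    mu_hat = sum(increments) / (n * dt)        # drift (jumps have zero mean here)
    m = sum(small) / len(small)
    var = sum((dx - m) ** 2 for dx in small) / len(small)
    sigma_hat = math.sqrt(var / dt)
    lam_hat = (n - len(small)) / (n * dt)      # jump intensity from flagged steps
    return mu_hat, sigma_hat, lam_hat
```

On a simulated path with drift 0.5, volatility 0.2, and jump intensity 0.1, these crude estimators already recover the parameters to within a few percent; the paper's contribution is a more precise B-spline-based alternative.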
Audio Compression Using Transform Coding with LZW and Double Shift Coding
Zainab J. Ahmed & Loay E. George · Conference paper, New Trends in Information and Communications Technology Applications · First Online: 11 January 2022 · Communications in Computer and Information Science, volume 1511
Abstract: The need for audio compression remains a vital issue because of its significance in reducing the data size of one of the most commonly exchanged digital media between distant parties. In this paper, the efficiencies of two audio compression modules were investigated; the first module is based on the discrete cosine transform and the second module is based on the discrete wavelet transform …
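The transform-coding stage of such a module can be sketched as follows. This is a hypothetical, minimal illustration of the discrete cosine transform step only: an orthonormal DCT-II of a small block, with lossy "compression" by keeping the largest coefficients. The paper's entropy stages (LZW and double shift coding) are not reproduced here.

```python
import math

def dct2(x):
    """Orthonormal DCT-II of a block (naive O(N^2); fine for small blocks)."""
    n = len(x)
    out = []
    for k in range(n):
        s = sum(x[i] * math.cos(math.pi * (2 * i + 1) * k / (2 * n))
                for i in range(n))
        scale = math.sqrt(1.0 / n) if k == 0 else math.sqrt(2.0 / n)
        out.append(scale * s)
    return out

def idct2(c):
    """Inverse of the orthonormal DCT-II above (i.e., a DCT-III)."""
    n = len(c)
    out = []
    for i in range(n):
        s = 0.0
        for k in range(n):
            scale = math.sqrt(1.0 / n) if k == 0 else math.sqrt(2.0 / n)
            s += scale * c[k] * math.cos(math.pi * (2 * i + 1) * k / (2 * n))
        out.append(s)
    return out

def compress_block(x, keep):
    """Lossy step: zero all but the `keep` largest-magnitude DCT coefficients."""
    c = dct2(x)
    order = sorted(range(len(c)), key=lambda k: -abs(c[k]))
    kept = set(order[:keep])
    return [ci if k in kept else 0.0 for k, ci in enumerate(c)]
```

Because the transform is orthonormal, `idct2(dct2(x))` reconstructs `x` exactly (up to floating point); the compression comes from quantizing or discarding coefficients before the entropy coder.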
The denoising of a natural image corrupted by Gaussian noise is a classical problem in signal and image processing. Much work has been done in the field of wavelet thresholding, but most of it has focused on statistical modeling of wavelet coefficients and the optimal choice of thresholds. This paper describes a new method for suppressing noise in images by fusing the stationary wavelet denoising technique with an adaptive Wiener filter. The Wiener filter is applied to the approximation coefficients of the reconstructed image only, while the thresholding technique is applied to the detail coefficients of the transform; the final denoised image is obtained by combining the two results. The proposed method was applied using …
Image quality plays a vital role in improving and assessing image compression performance. Image compression maps big image data to a new image with a smaller size, suitable for storage and transmission. This paper aims to evaluate the implementation of hybrid techniques based on the tensor-product mixed transform. Compression and quality metrics such as compression ratio (CR), rate distortion (RD), peak signal-to-noise ratio (PSNR), and structural content (SC) are used to evaluate the hybrid techniques; the techniques are then compared on these metrics to determine the best one. The main contribution is to improve the hybrid techniques. The proposed hybrid techniques consist of discrete wavelet …
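The evaluation metrics named above have standard definitions that are easy to state in code. This is a minimal sketch of three of them (the rate-distortion curve is omitted); it shows only the metric definitions, not the paper's hybrid tensor-product transform.

```python
import math

def mse(orig, recon):
    """Mean squared error between two equal-length pixel sequences."""
    return sum((a - b) ** 2 for a, b in zip(orig, recon)) / len(orig)

def psnr(orig, recon, peak=255.0):
    """Peak signal-to-noise ratio in dB for 8-bit data (higher is better)."""
    e = mse(orig, recon)
    return float("inf") if e == 0 else 10.0 * math.log10(peak * peak / e)

def compression_ratio(original_bytes, compressed_bytes):
    """CR = original size / compressed size (higher means more compression)."""
    return original_bytes / compressed_bytes

def structural_content(orig, recon):
    """SC = sum(orig^2) / sum(recon^2); values near 1 mean preserved energy."""
    return sum(a * a for a in orig) / sum(b * b for b in recon)
```

For example, a reconstruction that is off by exactly one gray level at every pixel has MSE 1 and therefore PSNR of about 48.13 dB at an 8-bit peak of 255.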
Today, large amounts of geospatial data are available on the web from services such as Google Map (GM), OpenStreetMap (OSM), the Flickr service, Wikimapia, and others; all of these services are called open-source geospatial data. Geospatial data from different sources often have variable accuracy due to differing data-collection methods, so the accuracy may not meet the user requirements of every organization. This paper aims to develop a tool to assess the quality of GM data by comparing it with formal data, such as spatial data from the Mayoralty of Baghdad (MB). The tool was developed in the Visual Basic language and validated on two different study areas in Baghdad, Iraq (Al-Karada and Al-Kadhumiyah). The positional accuracy was assessed …
The study aims to measure and evaluate the return and risk of Islamic financing formulas in Jordan during the period 2000–2009, given the increasing importance of these banks in recent and coming years as they face the challenge of maximizing returns and minimizing risks through Islamic financing formulas, and to investigate whether a statistically significant relationship exists between the returns and risks of Islamic banks, using financial and other statistical measures. Measuring the return and risk of Islamic banks has not been widely considered, except in a few descriptive studies. The controversy among academics and professionals about how to measure and evaluate a comprehensive …
One of the most important problems in statistical inference is the estimation of parameters, including the reliability parameter, together with interval estimation and hypothesis testing. This work concerns estimating the two parameters of the exponential distribution, as well as the reliability parameter in a stress-strength model.
The paper deals with estimating the scale parameter and the location parameter µ of two exponential distributions using the method-of-moments and maximum-likelihood estimators; we also estimate the parameter R = Pr(X > Y), where X and Y are independent two-parameter exponential random variables.
Statistical properties of this distribution and its properties …
The estimation of linear regression parameters is usually based on the least squares method, which rests on several basic assumptions, so the accuracy of the parameter estimates depends on the validity of these assumptions; among them are homogeneity of variance and normally distributed errors. These assumptions are often not achievable when the problem under study involves complex data drawn from more than one model, and the least squares model then becomes unrealistic. The most successful alternative is the robust estimation method of minimizing the maximum likelihood estimator (the MM-estimator), which has proved efficient for this purpose. …
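The robustness idea can be illustrated with iteratively reweighted least squares. This is an M-estimation sketch with Tukey's bisquare weights and a MAD residual scale, not the full MM procedure (which first obtains a high-breakdown S-estimate of scale); all names and the simple-regression restriction are illustrative.

```python
import statistics

def wls_line(x, y, w):
    """Weighted least squares fit of y = b0 + b1*x."""
    sw = sum(w)
    xm = sum(wi * xi for wi, xi in zip(w, x)) / sw
    ym = sum(wi * yi for wi, yi in zip(w, y)) / sw
    sxy = sum(wi * (xi - xm) * (yi - ym) for wi, xi, yi in zip(w, x, y))
    sxx = sum(wi * (xi - xm) ** 2 for wi, xi in zip(w, x))
    b1 = sxy / sxx
    return ym - b1 * xm, b1

def robust_line(x, y, c=4.685, iters=30):
    """IRLS with Tukey's bisquare weights; residual scale from the MAD.
    Points with residuals beyond c*scale get zero weight (rejected)."""
    w = [1.0] * len(x)                 # start from ordinary least squares
    b0 = b1 = 0.0
    for _ in range(iters):
        b0, b1 = wls_line(x, y, w)
        r = [yi - b0 - b1 * xi for xi, yi in zip(x, y)]
        med = statistics.median(r)
        s = statistics.median(abs(ri - med) for ri in r) / 0.6745
        if s == 0:
            break
        w = [(1 - (ri / (c * s)) ** 2) ** 2 if abs(ri) < c * s else 0.0
             for ri in r]
    return b0, b1
```

On data with a few large outliers at one end, ordinary least squares tilts toward them while the reweighted fit rejects them and recovers the true slope.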
Abstract
The non-homogeneous Poisson process is a statistical subject of importance in other sciences and has wide application in areas such as queueing systems, repairable systems, computer and communication systems, and reliability theory, among many others; it is also used to model phenomena whose occurrence rate changes over time (events that vary with time).
This research covers some of the basic concepts related to the non-homogeneous Poisson process and applies two of its models, the power law model and the Musa–Okumoto model, to estimate the …
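The power-law model mentioned above admits closed-form maximum-likelihood estimates, which the following sketch checks against a simulated process. This illustrates only the power-law (Crow/AMSAA) model with mean function m(t) = (t/θ)^β observed on (0, T]; the Musa–Okumoto model is not shown, and the function names are illustrative.

```python
import math
import random

def simulate_power_law_nhpp(theta, beta, horizon, seed=11):
    """Simulate event times of an NHPP with mean function m(t) = (t/theta)**beta
    via the time change of a unit-rate homogeneous Poisson process:
    t_i = theta * s_i**(1/beta), where s_i are unit-rate arrival times."""
    rng = random.Random(seed)
    times, s = [], 0.0
    m_end = (horizon / theta) ** beta
    while True:
        s += rng.expovariate(1.0)
        if s > m_end:
            break
        times.append(theta * s ** (1.0 / beta))
    return times

def fit_power_law(times, horizon):
    """Closed-form MLEs for the power-law model, time-truncated at T:
    beta_hat = n / sum(ln(T / t_i)),  theta_hat = T / n**(1 / beta_hat)."""
    n = len(times)
    beta_hat = n / sum(math.log(horizon / t) for t in times)
    theta_hat = horizon / n ** (1.0 / beta_hat)
    return theta_hat, beta_hat
```

With θ = 1 and β = 2 over a horizon of T = 100 (about 10,000 events in expectation), the estimators land close to the true parameters.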
This work determines the cyclic decomposition (c.d.) of the given groups, where the prime involved is greater than 9. We also compute the Artin characters (A.ch.) and the Artin indicator (A.ind.) for the same groups; these are obtained after computing the conjugacy classes, the cyclic subgroups, the ordinary character table (o.ch.ta.), and the rational valued character table of each group.