A stochastic process {X_k, k = 1, 2, ...} is a doubly geometric stochastic process if there exist a ratio a > 0 and a positive function h(k) > 0 such that {a^{k-1} h(k) X_k, k = 1, 2, ...} forms a renewal process; it is therefore a generalization of the geometric stochastic process. This process is stochastically monotone and can be used to model a point process with multiple trends. In this paper, we use nonparametric methods to investigate statistical inference for doubly geometric stochastic processes. A graphical technique for determining whether a process agrees with a doubly geometric stochastic process is proposed. Further, the parameters a, b, μ and σ² of the doubly geometric stochastic process can be estimated by applying least squares to X_k and ln X_k together with the linear regression method, where μ and σ² are the mean and variance of X_1, respectively. A real-world example is used to demonstrate the procedure and to evaluate the performance of the estimators.
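The abstract does not reproduce the estimation equations, but a minimal sketch of the regression idea is possible under the common assumption h(k) = k^b: then ln X_k ≈ c − (k−1) ln a − b ln k plus error, which is linear in (k−1) and ln k, and a, b, μ, σ² follow from a least-squares fit. The data and parameter values below are hypothetical, not the paper's example.

```python
import numpy as np

# Sketch only (not the authors' exact procedure), assuming h(k) = k**b:
# regress ln X_k on (k-1) and ln k, then recover the (approximately i.i.d.)
# terms Y_k = a**(k-1) * k**b * X_k to estimate mu and sigma^2.
def fit_dgp(x):
    x = np.asarray(x, dtype=float)
    k = np.arange(1, len(x) + 1)
    A = np.column_stack([np.ones(len(x)), k - 1, np.log(k)])
    coef, *_ = np.linalg.lstsq(A, np.log(x), rcond=None)
    a_hat = np.exp(-coef[1])                 # slope on (k-1) is -ln a
    b_hat = -coef[2]                         # slope on ln k is -b
    y = a_hat ** (k - 1) * k ** b_hat * x    # recovered renewal-like terms
    return a_hat, b_hat, y.mean(), y.var(ddof=1)

# Hypothetical usage with simulated data (a = 1.02, b = 0.1)
rng = np.random.default_rng(0)
k = np.arange(1, 101)
x = rng.lognormal(mean=1.0, sigma=0.2, size=100) / (1.02 ** (k - 1) * k ** 0.1)
print(fit_dgp(x))
```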
In earthquake engineering problems, uncertainty exists not only in the seismic excitation but also in the structure's parameters. This study investigates the influence of uncertainty in structural geometry, elastic modulus, mass density, and section dimensions on the stochastic earthquake response of a multi-story moment resisting frame subjected to random ground motion. The north-south component of the 2012 Ali Gharbi earthquake, Iraq, is selected as the ground excitation. The base motion of the two-dimensional finite element model of the moment resisting frame is represented as random ground motion through its power spectral density (PSD) function. The probabilistic study of the moment resisting frame structure using stochastic fin…
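The abstract does not give the PSD estimation details, so the following is only a minimal sketch of estimating a one-sided PSD from a ground-acceleration record with Welch's method; the record, sampling rate, and amplitudes are hypothetical placeholders, not the Ali Gharbi data.

```python
import numpy as np
from scipy.signal import welch

# Estimate the one-sided power spectral density of a ground-acceleration record.
fs = 100.0                                    # assumed sampling frequency [Hz]
t = np.arange(0, 40, 1 / fs)
accel = np.random.default_rng(1).normal(scale=0.05, size=t.size)  # placeholder a_g(t)

f, psd = welch(accel, fs=fs, nperseg=1024)    # Welch-averaged PSD estimate
print(f[:5], psd[:5])
```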
This work presents a design for a pressure swing adsorption (PSA) process to separate oxygen from air at approximately 95% purity, suitable for different numbers of columns and arrangements. The product-refill PSA process was found to perform 33% better than the pressure-equalization process in terms of the weight of zeolite required, i.e. productivity. The design is based on the adsorption equilibrium of a binary mixture of O2 and N2 for two of the most commonly used adsorbents, 5A and 13X, and on extension from a single-column approach. Zeolite 13X was found to perform 6% better than zeolite 5A. The most effective variables were determined to be the adsorption step time and the operational pressure. Increasing the adsorption step…
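As a hedged illustration of the kind of binary O2/N2 equilibrium relation an equilibrium-based PSA design builds on, the sketch below evaluates an extended Langmuir model; all isotherm parameters are hypothetical placeholders, not the values used in the paper for zeolite 5A or 13X.

```python
# Extended Langmuir loading for a binary O2/N2 mixture (illustrative only).
def extended_langmuir(p_o2, p_n2, qm=(2.0, 3.0), b=(0.05, 0.15)):
    """Return (q_O2, q_N2) loadings [mol/kg] at partial pressures [bar]."""
    denom = 1.0 + b[0] * p_o2 + b[1] * p_n2
    return qm[0] * b[0] * p_o2 / denom, qm[1] * b[1] * p_n2 / denom

# Air at 3 bar adsorption pressure: roughly 0.63 bar O2 and 2.37 bar N2
print(extended_langmuir(0.63, 2.37))
```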
Quality and innovation are a necessity for the progress of modern societies, because the scientific and objective characteristics that characterize modern societies and distinguish them from traditional ones are represented by the extent of their innovative achievements in the theoretical, applied, material and spiritual fields. It should be noted that quality and innovation in modern societies rest on two main pillars: standard measures for measuring and evaluating innovations to ensure their high quality, and the dissemination of a culture of innovation to spread awareness of its importance and the conditions for its success, as is done in the advanced industrial countries. However, despite the great disparity between developed industri…
The assessment of data quality from different sources can be considered a key challenge in supporting effective geospatial data integration and promoting collaboration in mapping projects. This paper presents a methodology for assessing positional and shape quality for authoritative large-scale data, such as Ordnance Survey (OS) UK data and General Directorate for Survey (GDS) Iraq data, and Volunteered Geographic Information (VGI), such as OpenStreetMap (OSM) data, with the intention of assessing possible integration. It is based on the measurement of discrepancies among the datasets, addressing positional accuracy and shape fidelity, using standard procedures as well as directional statistics. Line feature comparison has been und…
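As a minimal sketch of one directional-statistics measure that can be used when comparing matched line features from two datasets (e.g. OSM against an authoritative map), the code below computes the circular mean and resultant length of bearing differences; the segment bearings are hypothetical, not taken from the paper.

```python
import numpy as np

# Circular summary of per-segment bearing differences between two datasets.
def circular_summary(angles_deg):
    a = np.radians(angles_deg)
    s, c = np.sin(a).sum(), np.cos(a).sum()
    mean_dir = np.degrees(np.arctan2(s, c))   # mean directional discrepancy [deg]
    resultant = np.hypot(s, c) / len(a)       # concentration in [0, 1]
    return mean_dir, resultant

bearing_diffs = [2.0, -1.5, 3.2, 0.4, -2.1]   # degrees, per matched segment
print(circular_summary(bearing_diffs))
```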
This paper aims to study the second-order geometric nonlinearity (P-Delta) effects on the dynamic response of tall reinforced concrete buildings under a wide range of earthquake ground motions, from minor up to moderate and strong earthquakes. The frequency-domain dynamic analysis procedure was used for response assessment. Reinforced concrete building models with different heights of up to 50 stories were analyzed using the finite element software ETABS (version 16.0.3).
The study reveals that the percentage increase in the buildings' sway and drift due to P-Delta effects is nearly constant for a given building height, irrespective of the seism…
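The abstract does not reproduce the frequency-domain formulation, but as a generic illustration of how P-Delta amplification is often gauged at story level, the sketch below evaluates the standard stability coefficient θ = PΔ/(Vh) and the associated amplification factor 1/(1 − θ); this is a code-style check, not the paper's procedure, and all inputs are hypothetical.

```python
# Story-level P-Delta stability coefficient and drift amplification (illustrative).
def p_delta_amplification(P, delta, V, h):
    """P: story gravity load, delta: story drift, V: story shear, h: story height."""
    theta = P * delta / (V * h)            # stability coefficient
    return theta, 1.0 / (1.0 - theta)      # amplification of drifts/moments

print(p_delta_amplification(P=12_000.0, delta=0.015, V=900.0, h=3.5))
```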
Longitudinal data are becoming increasingly common, especially in the medical and economic fields, and various methods have been developed to analyze this type of data.
In this research, the focus was on compiling and analyzing these data: cluster analysis plays an important role in identifying and grouping co-expressed profiles over time, which are then fitted with the nonparametric smoothing cubic B-spline model. This model provides continuous first and second derivatives, resulting in a smoother curve with fewer abrupt changes in slope, and it is flexible enough to capture more complex patterns and fluctuations in the data.
The longitudinal balanced data profiles were compiled into subgroup…
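A minimal sketch of fitting a smoothing cubic spline (degree 3, hence continuous first and second derivatives, the smoothness property the abstract refers to) to one longitudinal profile is shown below; the profile data and smoothing level are hypothetical.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# Fit a smoothing cubic spline to one (hypothetical) longitudinal profile.
t = np.linspace(0, 12, 25)                          # measurement times
y = np.sin(t / 2) + np.random.default_rng(2).normal(scale=0.1, size=t.size)

spline = UnivariateSpline(t, y, k=3, s=0.5)         # s controls the smoothing level
fitted = spline(t)
slope = spline.derivative(1)(t)                     # continuous first derivative
print(fitted[:3], slope[:3])
```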
The present study deals with successive stages of productive operations, where a product is processed within each stage before it moves to the next one. This study can be regarded as an extension of what has been presented in (1). In (1), the optimum distribution of different jobs among workers and machines in the productive operations was studied, whereas the present study involves the optimum schedule for the succession of these operations, presuming that they have already been distributed among machines and workers (2). A mathematical form has been put forward in this study to define the "Objective Function"…
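The paper's own objective function is not reproduced in this excerpt, so the sketch below uses Johnson's rule, a classical method for sequencing jobs through two successive stages to minimize total completion time, purely as a generic illustration of scheduling successive operations; the processing times are hypothetical.

```python
# Johnson's rule for a two-stage flow shop (illustrative, not the paper's model).
def johnsons_rule(times):
    """times: list of (stage1_time, stage2_time) per job; returns a job order."""
    front, back = [], []
    for j in sorted(range(len(times)), key=lambda j: min(times[j])):
        if times[j][0] <= times[j][1]:
            front.append(j)        # shorter first-stage time: schedule early
        else:
            back.insert(0, j)      # shorter second-stage time: schedule late
    return front + back

# Hypothetical processing times (stage 1, stage 2) for five jobs
print(johnsons_rule([(3, 6), (5, 2), (1, 2), (6, 6), (7, 5)]))
```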
The wavelet shrinkage estimator is an attractive technique for estimating nonparametric regression functions, but it is very sensitive to correlation in the errors. In this research, a low-degree polynomial model was used to address the boundary problem in wavelet shrinkage, in addition to using flexible (level-dependent) threshold values in the case of correlated errors, since these treat the coefficients at each level separately, unlike global threshold values that treat all levels simultaneously, such as the VisuShrink, False Discovery Rate, Improvement Thresholding and SureShrink methods. The study was conducted on real monthly data representing the rates of theft crimes f…
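A minimal sketch of level-by-level (level-dependent) soft thresholding with PyWavelets is given below to illustrate the general idea of treating each resolution level separately rather than applying one global threshold; the signal, wavelet choice, and threshold rule are illustrative assumptions, not the study's exact settings.

```python
import numpy as np
import pywt

# Level-dependent soft thresholding of wavelet detail coefficients.
rng = np.random.default_rng(3)
signal = np.sin(np.linspace(0, 8 * np.pi, 256)) + rng.normal(scale=0.3, size=256)

coeffs = pywt.wavedec(signal, 'db4', level=4)
denoised = [coeffs[0]]                               # keep approximation coefficients
for d in coeffs[1:]:
    sigma = np.median(np.abs(d)) / 0.6745            # robust noise estimate at this level
    thr = sigma * np.sqrt(2 * np.log(len(d)))        # universal threshold, per level
    denoised.append(pywt.threshold(d, thr, mode='soft'))

smoothed = pywt.waverec(denoised, 'db4')
print(smoothed[:5])
```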
The aim of the research was to investigate the use of non-parametric tests in questionnaire analysis and how to choose the appropriate test for the hypotheses of a study of crime motives in Khartoum State. The data were collected from primary sources by designing a questionnaire distributed to a sample of inmates in Khartoum State, and were analyzed with the SPSS program by applying the suitable non-parametric test for each case. The most important result of the research was a significant relationship between the type of crime and the age group: the age group (20-29) committed crimes most frequently, particularly the fi…
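As a minimal sketch of the kind of non-parametric test such a questionnaire analysis relies on, the code below runs a chi-square test of independence between crime type and age group; the contingency table is entirely hypothetical, not the study's data.

```python
from scipy.stats import chi2_contingency

# Chi-square test of independence on a hypothetical crime-type x age-group table.
table = [
    [25, 14, 6],   # e.g. theft counts by age group 20-29, 30-39, 40+
    [10, 18, 9],   # e.g. assault
    [5,  7, 12],   # e.g. fraud
]
chi2, p_value, dof, expected = chi2_contingency(table)
print(chi2, p_value, dof)
```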