Circular data (circular observations) are periodic data measured on the unit circle in radians or degrees. Because of their cyclical nature, they are fundamentally different from linear data and are not compatible with the mathematical representation of the usual linear regression model. Circular data arise in a wide variety of scientific, medical, economic and social fields. Several parametric and nonparametric methods exist for estimating circular (angular) regression, so this paper employed three circular regression models, two parametric and one nonparametric. The parametric models are the (DM) model, estimated by maximum likelihood (MLE), and the Circular Shrinkage model (SH), a method proposed by the researcher; the nonparametric model is the Local Linear circular regression model (LL). The Mean Circular Error (MCE) criterion was used to compare the three models. On the experimental (simulation) side, data were generated by the inverse method using the R language in nine simulation experiments, and for all the default values the results showed no preference for the parametric models over the nonparametric model.
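For reference, the sketch below evaluates the Mean Circular Error criterion used for the comparison, assuming the common definition MCE = (1/n) Σ (1 − cos(θᵢ − θ̂ᵢ)); the function name, data and noise level are illustrative only and do not reproduce the paper's simulation design.

```python
import numpy as np

def mean_circular_error(theta, theta_hat):
    """Mean Circular Error between observed and fitted angles (radians).

    Assumes MCE = mean(1 - cos(residual)): 0 for a perfect fit, approaching 2
    when predictions point in the opposite direction on the circle.
    """
    return np.mean(1.0 - np.cos(theta - theta_hat))

# toy check: predictions off by a small random angle
rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, size=100)
theta_hat = theta + rng.normal(0, 0.1, size=100)
print(mean_circular_error(theta, theta_hat))   # close to 0
```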
In this paper, a mathematical model for the oxidative desulfurization of kerosene has been developed. Mathematical modelling and simulation are important because they provide a better understanding of the real process. The model in this study was based on experimental results taken from the literature; the optimal kinetic parameters were calculated, and the simulation and optimization were carried out with the gPROMS software. The optimal kinetic parameters were an activation energy of 18.63958 kJ/mol, a pre-exponential factor of 2201.34 (wt)^(-0.76636)·min^(-1), and a reaction order of 1.76636. These optimal kinetic parameters were then used to find the optimal reaction conditions, which …
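As an illustration of how such kinetic parameters can be used, the sketch below integrates an assumed power-law rate, -dC/dt = k(T)·C^n with an Arrhenius rate constant, using the values quoted above; the temperature, initial sulfur content and the SciPy integration are illustrative assumptions, not part of the original study.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Kinetic parameters reported in the abstract
A  = 2201.34      # pre-exponential factor, (wt)^(-0.76636) * min^-1
Ea = 18639.58     # activation energy, J/mol
n  = 1.76636      # reaction order
R  = 8.314        # J/(mol*K)

def sulfur_decay(t, C, T):
    """Assumed nth-order power-law rate: -dC/dt = k(T) * C**n."""
    k = A * np.exp(-Ea / (R * T))
    return [-k * C[0] ** n]

C0, T = [1.0], 403.15   # illustrative initial sulfur content (wt%) and temperature (K)
sol = solve_ivp(sulfur_decay, (0, 60), C0, args=(T,))
print(sol.y[0, -1])     # remaining sulfur after 60 min at the assumed conditions
```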
Abstract
The grey system model GM(1,1) is the basic time-series prediction model of grey theory. This research presents four methods for estimating the parameters of the grey model GM(1,1): the accumulative method (ACC), the exponential method (EXP), the modified exponential method (Mod EXP) and the Particle Swarm Optimization method (PSO). These methods were compared using the mean square error (MSE) and the mean absolute percentage error (MAPE) as comparison criteria, and simulation was used to select the best of the four methods. The best method was then applied to real data. This data represents the consumption rate of two types of oils, a he…
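For orientation, a minimal sketch of the classical GM(1,1) model is given below, with the development coefficient and grey input fitted by ordinary least squares; the ACC, EXP, Mod EXP and PSO estimators discussed above replace that fitting step. The data series here is illustrative only.

```python
import numpy as np

def gm11_forecast(x0, horizon=1):
    """Classical least-squares GM(1,1): fit on x0 and forecast `horizon` steps ahead."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                            # accumulated generating operation (AGO)
    z1 = 0.5 * (x1[1:] + x1[:-1])                 # background values
    B = np.column_stack([-z1, np.ones_like(z1)])
    Y = x0[1:]
    a, b = np.linalg.lstsq(B, Y, rcond=None)[0]   # development coefficient a, grey input b
    k = np.arange(len(x0) + horizon)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    return np.diff(x1_hat, prepend=0.0)           # inverse AGO gives fitted and forecast values

series = [2.87, 3.28, 3.34, 3.47, 3.52, 3.67]     # illustrative data only
print(gm11_forecast(series, horizon=2))
```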
An experimental investigation of natural convection heat transfer from isothermal horizontal, vertical and inclined heated square flat plates, with and without a circular hole, was carried out for two cases: perforated plates without an impermeable adiabatic hole ("open core") and perforated plates with an impermeable adiabatic hole ("closed core") closed by an adiabatic plug. The experiments covered the laminar region over a Rayleigh number range of 1.11×10^6 ≤ Ra_Lo ≤ 4.39×10^6, at a Prandtl number of Pr = 0.7. The experiments were performed at inclination angles from the horizontal of Φ = 0°, 45°, 90°, 135° and 180°, facing upward (0° ≤ Φ < 90°) and downward (90° ≤ Φ < 180°). The results showed that the temperature gradient increases whi…
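For context, the sketch below evaluates the Rayleigh and Prandtl numbers from fluid properties, assuming the standard definitions Ra = gβΔT·L³/(να) and Pr = ν/α; the air properties, temperature difference and plate length are illustrative values, not those of the experiment.

```python
def rayleigh(g, beta, dT, L, nu, alpha):
    """Ra = g * beta * dT * L^3 / (nu * alpha)."""
    return g * beta * dT * L**3 / (nu * alpha)

# Illustrative properties of air near 300 K (not the experimental values)
g, beta   = 9.81, 1 / 300.0          # m/s^2, 1/K
nu, alpha = 1.6e-5, 2.2e-5           # m^2/s
dT, L     = 40.0, 0.1                # K, characteristic plate length in m

Ra = rayleigh(g, beta, dT, L, nu, alpha)
Pr = nu / alpha
print(f"Ra = {Ra:.3e}, Pr = {Pr:.2f}")   # Ra of order 10^6, inside the laminar range quoted above
```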
Abstract The wavelet shrinkage estimator is an attractive technique for estimating nonparametric regression functions, but it is very sensitive to correlation in the errors. In this research, a low-degree polynomial model was used to address the boundary problem in wavelet shrinkage, together with flexible (level-dependent) threshold values for the case of correlated errors, which treat the coefficients at each resolution level separately, unlike global threshold values that treat all levels simultaneously, such as the VisuShrink, False Discovery Rate, Improvement Thresholding and SureShrink methods. The study was conducted on real monthly data representing the rates of theft crimes f…
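A minimal sketch of level-dependent soft thresholding, the "flexible" alternative to a single global (VisuShrink-style) threshold mentioned above, is given below using the PyWavelets package; the wavelet choice, the MAD scale estimate per level and the test signal are illustrative assumptions and do not reproduce the study's exact procedure.

```python
import numpy as np
import pywt

def level_dependent_soft_threshold(y, wavelet="db4", level=4):
    """Denoise a 1-D signal with level-by-level soft thresholding.

    Each detail level gets its own threshold sigma_j * sqrt(2 * log n), with
    sigma_j estimated from that level's coefficients by the MAD rule.
    """
    coeffs = pywt.wavedec(y, wavelet, level=level)
    n = len(y)
    denoised = [coeffs[0]]                                   # keep approximation coefficients
    for d in coeffs[1:]:
        sigma = np.median(np.abs(d)) / 0.6745                # robust scale estimate per level
        thr = sigma * np.sqrt(2 * np.log(n))
        denoised.append(pywt.threshold(d, thr, mode="soft"))
    return pywt.waverec(denoised, wavelet)[:n]

# illustrative noisy signal
t = np.linspace(0, 1, 512)
y = np.sin(4 * np.pi * t) + 0.3 * np.random.default_rng(1).standard_normal(512)
print(level_dependent_soft_threshold(y)[:5])
```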
Abstract
The non-homogeneous Poisson process is a statistical subject of importance in other sciences, with wide application in different areas such as queueing systems (waiting lines), repairable systems, computer and communication systems, and reliability theory, among many others. It is also used to model phenomena whose rate of occurrence changes over time (events that do not occur at a fixed rate).
This research deals with some of the basic concepts related to the non-homogeneous Poisson process and applies two of its models, the power law model and the Musa-Okumoto model, to estimate th…
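As an aid to the power law model mentioned above, the sketch below simulates a non-homogeneous Poisson process with power-law mean function m(t) = (t/θ)^β by mapping unit-rate Poisson arrival times through the inverse mean function; the parameter values are illustrative, not estimates from the research.

```python
import numpy as np

def simulate_power_law_nhpp(beta, theta, t_max, rng=None):
    """Event times of an NHPP with mean function m(t) = (t / theta)**beta.

    Unit-rate Poisson arrival times S_i are mapped through the inverse mean
    function, T_i = theta * S_i**(1 / beta), and truncated at t_max.
    """
    rng = rng or np.random.default_rng()
    times, s = [], 0.0
    while True:
        s += rng.exponential(1.0)          # next jump of the unit-rate process
        t = theta * s ** (1.0 / beta)
        if t > t_max:
            return np.array(times)
        times.append(t)

events = simulate_power_law_nhpp(beta=0.7, theta=5.0, t_max=100.0,
                                 rng=np.random.default_rng(42))
print(len(events), events[:5])             # beta < 1 gives a decreasing event intensity
```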
The paper shows how to estimate the three parameters of the generalized exponential Rayleigh distribution using three estimation methods, namely the moments estimation method (MEM), the ordinary least squares estimation method (OLSEM), and the maximum entropy estimation method (MEEM). The simulation technique is used with all of these estimation methods to find the parameters of the generalized exponential Rayleigh distribution. In order to find the best method, we use the mean squared error criterion. Finally, to produce the experimental results, one of the object-oriented programming languages, Visual Basic .NET, was used.
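Because the abstract does not restate the parameterization of the generalized exponential Rayleigh distribution, the sketch below only illustrates the ordinary least squares idea in general: fit a CDF to the empirical CDF by minimizing squared deviations at the order statistics. The one-parameter Rayleigh CDF is a stand-in, and the optimizer and names are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

def ols_fit(sample, cdf, x0):
    """Least-squares estimation: minimize sum_i (F(x_(i); params) - i/(n+1))^2."""
    x = np.sort(sample)
    n = len(x)
    plotting_pos = np.arange(1, n + 1) / (n + 1)
    def loss(params):
        return np.sum((cdf(x, *params) - plotting_pos) ** 2)
    return minimize(loss, x0, method="Nelder-Mead").x

# stand-in one-parameter Rayleigh CDF (NOT the three-parameter generalized
# exponential Rayleigh of the paper, whose form is not given in the abstract)
rayleigh_cdf = lambda x, sigma: 1.0 - np.exp(-x**2 / (2.0 * sigma**2))

rng = np.random.default_rng(0)
sample = rng.rayleigh(scale=2.0, size=200)
print(ols_fit(sample, rayleigh_cdf, x0=[1.0]))   # estimate should be near 2.0
```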
Some programs require knowing the extent of their usefulness in order to decide whether or not to continue providing them. This can be done with the fuzzy regression discontinuity model, where the Epanechnikov kernel and the triangular kernel were used to estimate the model on data generated in a Monte Carlo experiment and the results were compared. It was found that the Epanechnikov kernel has the smallest mean squared error.
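A minimal sketch of the two kernels and a kernel-weighted fuzzy regression discontinuity estimate follows, assuming the usual construction: local linear fits of the outcome and of treatment take-up on each side of the cutoff, with the effect taken as the ratio of the two jumps. The data, bandwidth and names are illustrative, not the paper's Monte Carlo design.

```python
import numpy as np

epanechnikov = lambda u: 0.75 * (1 - u**2) * (np.abs(u) <= 1)
triangular   = lambda u: (1 - np.abs(u)) * (np.abs(u) <= 1)

def boundary_fit(x, y, w):
    """Weighted local-linear fit; returns the intercept (value at the cutoff)."""
    X = np.column_stack([np.ones_like(x), x])
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return beta[0]

def fuzzy_rd(x, y, d, cutoff, h, kernel=epanechnikov):
    """Fuzzy RD effect = jump in outcome / jump in treatment probability at the cutoff."""
    u = (x - cutoff) / h
    w = kernel(u)
    right, left = (x >= cutoff) & (w > 0), (x < cutoff) & (w > 0)
    jump_y = (boundary_fit(x[right] - cutoff, y[right], w[right])
              - boundary_fit(x[left] - cutoff, y[left], w[left]))
    jump_d = (boundary_fit(x[right] - cutoff, d[right], w[right])
              - boundary_fit(x[left] - cutoff, d[left], w[left]))
    return jump_y / jump_d

# illustrative data: treatment taken with higher probability above the cutoff
rng = np.random.default_rng(3)
x = rng.uniform(-1, 1, 1000)
d = (rng.uniform(size=1000) < np.where(x >= 0, 0.8, 0.2)).astype(float)
y = 1.5 * d + x + rng.normal(0, 0.3, 1000)
print(fuzzy_rd(x, y, d, cutoff=0.0, h=0.5))    # should be close to the true effect 1.5
```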
The significance of this work is to introduce a new class of open sets, called Ǥ- -open sets, together with some of their properties. We then clarify how to calculate the boundary region of these sets using the upper and lower approximations and obtain the best accuracy.
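For the approximation step, the sketch below computes lower and upper approximations, the boundary region and the accuracy in the classical rough-set sense, assuming the universe is partitioned into granules (e.g. equivalence classes); the sets themselves are illustrative.

```python
def approximations(granules, target):
    """Lower/upper approximation of `target` with respect to a family of granules."""
    lower = set().union(*(g for g in granules if g <= target))   # granules fully inside target
    upper = set().union(*(g for g in granules if g & target))    # granules meeting target
    return lower, upper

# illustrative universe partitioned into granules, and a target set X
granules = [{1, 2}, {3, 4}, {5}, {6, 7, 8}]
X = {1, 2, 3, 5}

lower, upper = approximations(granules, X)
boundary = upper - lower                      # region of uncertainty
accuracy = len(lower) / len(upper)            # equals 1 only when the boundary is empty
print(lower, upper, boundary, accuracy)
```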
With the revolutionary expansion of the Internet, the worldwide growth of information increases the use of communication technology, and the rapid growth of significant data volumes raises the need for secure, robust and trustworthy techniques based on effective algorithms. Many algorithms and techniques are available for data security. This paper presents a cryptosystem that combines several substitution cipher algorithms with the circular queue data structure. The two substitution techniques, the Homophonic Substitution Cipher and the Polyalphabetic Substitution Cipher, are merged in a single circular queue with four different keys for each of them, which produces eight different outputs for…
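The abstract does not spell out the exact construction, so the sketch below only illustrates one ingredient: a circular queue (deque) rotating among several keys for a polyalphabetic (Vigenère-style) substitution. The homophonic stage, the specific keys and the eight-output structure of the paper are not reproduced; all names and keys here are hypothetical.

```python
from collections import deque
import string

ALPHABET = string.ascii_uppercase

def polyalphabetic_encrypt(plaintext, key):
    """Vigenere-style polyalphabetic substitution with a repeating key."""
    out, j = [], 0
    for ch in plaintext.upper():
        if ch in ALPHABET:
            shift = ALPHABET.index(key[j % len(key)])
            out.append(ALPHABET[(ALPHABET.index(ch) + shift) % 26])
            j += 1
        else:
            out.append(ch)                    # non-letters pass through unchanged
    return "".join(out)

# hypothetical circular queue of keys: each message uses the key at the head
# of the queue, which is then rotated to the back
key_queue = deque(["LEMON", "APPLE", "MANGO", "PEACH"])

def encrypt_with_rotating_keys(messages):
    results = []
    for msg in messages:
        results.append(polyalphabetic_encrypt(msg, key_queue[0]))
        key_queue.rotate(-1)                  # circular behaviour of the queue
    return results

print(encrypt_with_rotating_keys(["ATTACK AT DAWN", "HOLD THE LINE"]))
```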