Ferritin is a key mediator of immune dysregulation, particularly under extreme hyperferritinemia, through direct immune-suppressive and pro-inflammatory effects, and we conclude that there is a significant association between ferritin levels and the severity of COVID-19. In this paper we introduce a semi-parametric method for prediction that combines a neural network (NN) with a regression model; two methodologies, a Neural Network (NN) and a regression model, are therefore adopted in designing the model. The data were collected from the Private Nursing Home Hospital (مستشفى دار التمريض الخاص) for the period 11/7/2021–23/7/2021 and comprise 100 persons: of the 50 COVID-19 cases, 12 were female and 38 male, while of the 50 non-COVID cases, 26 were female and 24 male. The input variables of the NN model are the ferritin level and a gender variable. The highest prediction accuracy was attained by a multilayer perceptron (MLP) network when the explanatory variables were used as inputs with one hidden layer containing 3 neurons and one output; the fitted NN model was used in the training and validation stages alongside the actual data. A portion of the actual data was held out to verify the behaviour of the developed models, and only one observation was predicted incorrectly. This means that the estimated model has significant parameters for forecasting the type of case (COVID or non-COVID).
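A minimal sketch of the MLP setup described above (one hidden layer with 3 neurons, inputs ferritin and gender, binary COVID output). The file name, column names, coding of gender, and the train/validation split are illustrative assumptions, not details taken from the paper.

```python
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import confusion_matrix

# Hypothetical layout: ferritin (ng/mL), gender (0 = female, 1 = male), covid (0/1)
df = pd.read_csv("ferritin_covid.csv")
X = df[["ferritin", "gender"]].to_numpy(dtype=float)
y = df["covid"].to_numpy(dtype=int)

# Hold out a portion of the actual data for validation, as described in the abstract
X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.2, random_state=0, stratify=y)

scaler = StandardScaler().fit(X_tr)
mlp = MLPClassifier(hidden_layer_sizes=(3,), activation="logistic",
                    solver="lbfgs", max_iter=2000, random_state=0)
mlp.fit(scaler.transform(X_tr), y_tr)

pred = mlp.predict(scaler.transform(X_va))
print(confusion_matrix(y_va, pred))  # count of correctly and incorrectly predicted cases
```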
The logistic regression model is regarded as one of the most important regression models and has been among the most interesting subjects in recent studies, since it plays an increasingly advanced role in statistical analysis.
Ordinary estimation methods fail when the data contain outlier values, whose presence has an undesirable effect on the results.
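For reference, the binary logistic regression model that such robust estimation methods modify can be written in the standard notation (not taken from the abstract) as:

```latex
% Probability of success for observation i given covariates x_i and coefficients beta
\pi_i = \Pr(y_i = 1 \mid \mathbf{x}_i)
      = \frac{\exp(\mathbf{x}_i^{\top}\boldsymbol{\beta})}{1+\exp(\mathbf{x}_i^{\top}\boldsymbol{\beta})},
\qquad
\operatorname{logit}(\pi_i) = \log\frac{\pi_i}{1-\pi_i} = \mathbf{x}_i^{\top}\boldsymbol{\beta}.
```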
In this paper, reliable computational methods (RCMs) based on the monomial standard polynomials have been executed to solve the problem of Jeffery-Hamel flow (JHF). In addition, convenient base functions, namely the Bernoulli, Euler and Laguerre polynomials, have been used to enhance the reliability of the computational methods. Using such functions turns the problem into a solvable nonlinear algebraic system that Mathematica® 12 can solve. The JHF problem has been solved with the help of Improved Reliable Computational Methods (I-RCMs), and a review of the methods has been given. Published results are used to make comparisons, and as further evidence of the accuracy and dependability of the proposed methods, the maximum error remainders are reported.
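As a rough sketch of the approach (the abstract does not state the exact normalization used, so the common dimensionless form of the Jeffery-Hamel equation is assumed here): expand the unknown in a chosen polynomial basis and reduce the boundary value problem to a nonlinear algebraic system in the coefficients.

```latex
% Common dimensionless form of the Jeffery--Hamel equation (assumed normalization)
F'''(\eta) + 2\alpha\,\mathrm{Re}\,F(\eta)F'(\eta) + 4\alpha^{2}F'(\eta) = 0,
\qquad F(0)=1,\; F'(0)=0,\; F(1)=0.
% Truncated expansion in a chosen basis \phi_k (monomial, Bernoulli, Euler or Laguerre)
F_N(\eta) = \sum_{k=0}^{N} a_k\,\phi_k(\eta).
% Substituting F_N into the equation and boundary conditions yields a nonlinear
% algebraic system in the coefficients a_k, which is then solved numerically.
```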
The problem of multicollinearity is one of the most common problems and refers to a strong internal correlation between the explanatory variables; it appears especially in economics and applied research. Multicollinearity has a negative effect on the regression model, such as inflated variances and unstable parameter estimates, when the Ordinary Least Squares (OLS) method is used. Therefore, other methods were used to estimate the parameters of the negative binomial model, including the Ridge Regression estimator and the Liu-type estimator; the negative binomial regression model is a nonlinear model.
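For orientation, the classical forms of the two shrinkage estimators mentioned above are given below in linear-model notation; the negative binomial versions replace X'X with the corresponding weighted information matrix, a detail not spelled out in the abstract.

```latex
% Ridge estimator with shrinkage parameter k > 0
\hat{\boldsymbol{\beta}}_{\mathrm{Ridge}}(k) = (X^{\top}X + kI)^{-1}X^{\top}y,
% Liu estimator with 0 < d < 1, built from the OLS estimate
\hat{\boldsymbol{\beta}}_{\mathrm{Liu}}(d) = (X^{\top}X + I)^{-1}(X^{\top}X + dI)\,\hat{\boldsymbol{\beta}}_{\mathrm{OLS}}.
```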
In this work, the possibility of using new suggested carriers (D = Aspirin, Ibuprofen, Paracetamol, Tramal) for the diclofenac drug (Voltaren) is discussed by means of quantum mechanics calculations. The PM3 and DFT methods have been used to determine the reaction path of the (O-D) bond rupture energies. Different groups of drugs were used as carriers for diclofenac prodrugs (in a vacuum) at their optimized geometries. The calculations included the geometrical structure and some physical properties, in addition to the toxicity, biological activity, and NLO properties of the prodrugs, investigated using the HF method. The calculations were done with the Gaussian 09 program, and the comparison was made for the total energies of the reactants.
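As a hedged note on the energetic quantity involved, a bond rupture energy is commonly evaluated from the total energies of the optimized species, e.g. (standard definition assumed here; the paper's exact scheme is not given in the abstract):

```latex
% Homolytic rupture energy of the O--D linkage from total electronic energies
\Delta E_{\mathrm{O\text{-}D}} = E_{\mathrm{tot}}(\text{diclofenac fragment})
 + E_{\mathrm{tot}}(\text{carrier fragment}) - E_{\mathrm{tot}}(\text{prodrug}).
```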
In this research we study the Non-Homogeneous Poisson process, one of the most important statistical topics and one with a role in scientific development, since it relates to accidents that occur in reality and are modeled as Poisson processes, the occurrence of such accidents being related to time, whether time varies or is fixed. The research clarifies the Non-Homogeneous Poisson process and uses one of its models, the exponentiated-Weibull model with three parameters (α, β, σ), as a function to estimate the time-varying rate of occurrence of earthquakes in Erbil Governorate, as the governorate is adjacent to two countries.
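For reference, the exponentiated-Weibull distribution with parameters (α, β, σ) has the cumulative distribution below (standard parameterization, assumed here); one common construction of a non-homogeneous Poisson process then takes the time-dependent rate equal to the hazard of this distribution.

```latex
% Exponentiated-Weibull CDF with shape parameters \alpha, \beta and scale \sigma
F(t;\alpha,\beta,\sigma) = \left[1-\exp\!\left\{-\left(t/\sigma\right)^{\beta}\right\}\right]^{\alpha},
\qquad t>0,
% and, with density f = dF/dt, an assumed intensity of hazard form
\lambda(t) = \frac{f(t;\alpha,\beta,\sigma)}{1-F(t;\alpha,\beta,\sigma)}.
```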
Estimating multivariate location and scatter with both affine equivariance and positive breakdown has always been difficult. A well-known estimator which satisfies both properties is the Minimum Volume Ellipsoid (MVE) estimator. Computing the exact MVE is often not feasible, so one usually resorts to an approximate algorithm. In the regression setting, algorithms for positive-breakdown estimators like Least Median of Squares typically recompute the intercept at each step to improve the result; this approach is called intercept adjustment. In this paper we show that a similar technique, called location adjustment, can be applied to the MVE. For this purpose we use the Minimum Volume Ball (MVB).
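A very naive sketch of the minimum-volume-ball idea (not the approximate MVE/MVB algorithm of the paper): among candidate centres taken from the data itself, choose the one whose h-th smallest distance to the sample, i.e. the radius of the smallest ball around it covering h points, is minimal. The coverage fraction and the toy data are assumptions.

```python
import numpy as np

def mvb_center(X, h=None):
    """Brute-force centre of a ball covering h of the n points with minimal radius."""
    n, _ = X.shape
    h = h or (n + 1) // 2                     # cover at least half the points by default
    best_center, best_radius = None, np.inf
    for c in X:                               # candidate centres restricted to data points
        d = np.linalg.norm(X - c, axis=1)
        r = np.partition(d, h - 1)[h - 1]     # radius needed to cover the h nearest points
        if r < best_radius:
            best_center, best_radius = c, r
    return best_center, best_radius

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (40, 2)), rng.normal(8, 1, (10, 2))])  # 20% outliers
center, radius = mvb_center(X)
print(center, radius)   # the centre stays near the clean cluster despite the outliers
```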
The main work of this paper is devoted to a new technique for constructing approximate solutions of linear delay differential equations, using power-series basis functions together with weighted residual methods (the collocation, Galerkin and least-squares methods).
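A minimal collocation sketch for a linear delay equation of pantograph type (the specific equations treated in the paper are not given in the abstract; the equation, delay, degree and collocation points below are illustrative assumptions): approximate u(t) by a power series, impose the initial condition, and force the residual to vanish at collocation points, which gives a linear system for the coefficients.

```python
import numpy as np

# Illustrative pantograph-type linear delay equation:
#   u'(t) = -u(t) + 0.5 * u(t/2),   u(0) = 1,   t in [0, 1]
# Approximate u(t) ~ sum_k a_k t^k and collocate the residual.
N = 8                                   # polynomial degree (assumption)
ts = np.linspace(0.05, 1.0, N)          # N collocation points in (0, 1]

A = np.zeros((N + 1, N + 1))
b = np.zeros(N + 1)

# Initial condition u(0) = 1  ->  a_0 = 1
A[0, 0] = 1.0
b[0] = 1.0

# Residual R(t) = sum_k a_k [ k t^(k-1) + t^k - 0.5 (t/2)^k ] = 0 at each t
for i, t in enumerate(ts, start=1):
    for k in range(N + 1):
        deriv = k * t ** (k - 1) if k > 0 else 0.0
        A[i, k] = deriv + t ** k - 0.5 * (t / 2.0) ** k

a = np.linalg.solve(A, b)
u = lambda t: sum(a[k] * t ** k for k in range(N + 1))
print(u(0.0), u(1.0))                   # u(0) is ~1 by construction
```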
In order to obtain a mixed model with high significance and predictive accuracy, it is necessary to search for a method that selects the most important variables to include in the model, especially when the data under study suffer from the problem of multicollinearity as well as the problem of high dimensionality. The research aims to compare some methods of choosing the explanatory variables and estimating the parameters of the regression model, namely Bayesian Ridge Regression (unbiased) and the adaptive Lasso regression model, using simulation. MSE was used to compare the methods.
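A sketch of such a simulation comparison, with the adaptive lasso implemented as a lasso on features rescaled by weights from an initial ridge fit. The sample size, dimension, correlation structure, sparsity pattern, penalty levels and number of replications are assumptions, not the paper's design.

```python
import numpy as np
from sklearn.linear_model import BayesianRidge, Ridge, Lasso

rng = np.random.default_rng(1)
n, p, rho, reps = 100, 20, 0.9, 200
Sigma = rho ** np.abs(np.subtract.outer(np.arange(p), np.arange(p)))  # AR(1)-type collinearity
beta = np.zeros(p); beta[[0, 1, 4]] = [3.0, 1.5, 2.0]                 # sparse true coefficients

mse = {"BayesianRidge": [], "AdaptiveLasso": []}
for _ in range(reps):
    X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)
    y = X @ beta + rng.normal(0, 1, n)

    br = BayesianRidge().fit(X, y)
    mse["BayesianRidge"].append(np.mean((br.coef_ - beta) ** 2))

    # Adaptive lasso: an initial ridge fit supplies data-driven weights 1/|b|
    init = Ridge(alpha=1.0).fit(X, y)
    w = 1.0 / (np.abs(init.coef_) + 1e-6)
    la = Lasso(alpha=0.1, max_iter=10000).fit(X / w, y)   # lasso on rescaled columns
    mse["AdaptiveLasso"].append(np.mean((la.coef_ / w - beta) ** 2))

for name, vals in mse.items():
    print(name, np.mean(vals))   # parameter MSE averaged over replications
```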
A simulation study is used to examine the robustness of some estimators of a multiple linear regression model with problems of multicollinearity and non-normal errors: Ordinary Least Squares (OLS), Ridge Regression, Ridge Least Absolute Value (RLAV), Weighted Ridge (WRID), the MM estimator, and a robust ridge regression estimator obtained by incorporating the robust MM estimator into ridge regression, denoted RMM. Finally, we show that RMM is the best among the other estimators.
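A simplified illustration of such a comparison on contaminated, collinear data. scikit-learn has no MM estimator, so HuberRegressor (an L2-penalized Huber M-estimator) stands in for the robust ridge component; it is a stand-in, not the RMM estimator studied in the paper, and the data-generating settings are assumptions.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge, HuberRegressor

rng = np.random.default_rng(2)
n, p, rho = 100, 5, 0.95
Sigma = rho ** np.abs(np.subtract.outer(np.arange(p), np.arange(p)))  # strong collinearity
beta = np.ones(p)

X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)
err = rng.standard_t(df=2, size=n)              # heavy-tailed (non-normal) errors
err[:10] += 20.0                                # plus a few gross outliers
y = X @ beta + err

for name, est in [("OLS", LinearRegression()),
                  ("Ridge", Ridge(alpha=1.0)),
                  ("Huber + L2 (robust stand-in)", HuberRegressor(alpha=1.0))]:
    est.fit(X, y)
    print(name, np.mean((est.coef_ - beta) ** 2))   # parameter MSE per estimator
```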
The map of permeability distribution in a reservoir is considered one of the most essential steps of geologic model building, because permeability governs fluid flow through the reservoir, which makes it the parameter with the greatest influence on history matching; for that reason it is the petrophysical property most often tuned during history matching. Unfortunately, predicting the relationship between static petrophysics (porosity) and dynamic petrophysics (permeability) from conventional well logs is a problem that is difficult to solve with conventional statistical methods for heterogeneous formations. Therefore, this paper examines the ability and performance of artificial intelligence methods in permeability prediction.
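A minimal sketch of an ANN porosity-to-permeability model of the kind described above. The file name, log columns and network size are assumptions; permeability is modelled on a log scale, a common practice because it spans several orders of magnitude.

```python
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import r2_score

df = pd.read_csv("well_logs.csv")                   # hypothetical core-calibrated data set
X = df[["porosity", "gamma_ray", "density"]].to_numpy(dtype=float)
y = np.log10(df["core_permeability_md"].to_numpy(dtype=float))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
scaler = StandardScaler().fit(X_tr)

ann = MLPRegressor(hidden_layer_sizes=(10, 10), max_iter=5000, random_state=0)
ann.fit(scaler.transform(X_tr), y_tr)
print("R2 on held-out samples:", r2_score(y_te, ann.predict(scaler.transform(X_te))))
```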