In this research, robust M-estimators for the cubic smoothing splines technique, introduced to avoid the problems of non-normality and error contamination in the data, are compared with the traditional estimation method for cubic smoothing splines using two discrepancy criteria, MADE and WASE, for different sample sizes and levels of dispersion. The aim is to estimate the time-varying coefficient functions for balanced longitudinal data, which are characterized by observations obtained from (n) independent subjects, each measured repeatedly at a set of specific time points (m); the repeated measurements within a subject are typically correlated, while measurements from different subjects are independent.
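As a rough illustration of the comparison described above, the following Python sketch contrasts an ordinary cubic smoothing spline with an M-type robust fit obtained by iteratively reweighting the spline with Huber weights, and evaluates both with MADE and WASE against a known simulated curve. It is only a single-curve analogue of the longitudinal setting; the simulated data, smoothing level, and tuning constant are illustrative assumptions, not the estimators used in the research.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(0)

# Simulated data: one smooth signal observed with contaminated noise.
x = np.linspace(0, 1, 200)
true_f = np.sin(2 * np.pi * x)
noise = rng.normal(0, 0.2, x.size)
outliers = rng.random(x.size) < 0.1            # 10% contamination
noise[outliers] += rng.normal(0, 2.0, outliers.sum())
y = true_f + noise

# Classical cubic smoothing spline (least-squares fit).
classical = UnivariateSpline(x, y, k=3, s=len(x) * 0.05)

# M-type robust fit: iteratively reweighted spline with Huber weights.
def huber_weights(r, c=1.345):
    s = np.median(np.abs(r)) / 0.6745 + 1e-12   # robust scale estimate
    u = np.abs(r) / s
    return np.where(u <= c, 1.0, c / u)

w = np.ones_like(y)
for _ in range(10):
    robust = UnivariateSpline(x, y, w=w, k=3, s=len(x) * 0.05)
    w = huber_weights(y - robust(x))

# Discrepancy criteria against the known true curve.
def made(fit):   # mean absolute deviation error
    return np.mean(np.abs(fit(x) - true_f))

def wase(fit):   # average squared error (equal weights here)
    return np.mean((fit(x) - true_f) ** 2)

print("classical: MADE=%.4f  WASE=%.4f" % (made(classical), wase(classical)))
print("robust   : MADE=%.4f  WASE=%.4f" % (made(robust), wase(robust)))
```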
Support vector machines (SVMs) are supervised learning models that analyze data for classification or regression. For classification, SVM is widely used because it selects an optimal hyperplane that separates two classes. SVM achieves very good accuracy and is extremely robust compared with other classification methods such as logistic regression, random forest, k-nearest neighbors, and the naïve Bayes model. However, working with large datasets can cause problems such as long training times and inefficient results. In this paper, the SVM has been modified by using a stochastic gradient descent procedure. The modified method, stochastic gradient descent SVM (SGD-SVM), was checked using two simulated datasets. Since the classification of different ca
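A minimal sketch of the idea behind SGD-SVM is given below: a linear SVM whose regularized hinge loss is minimized by stochastic subgradient updates, tested on a small simulated dataset. The learning-rate schedule, regularization constant, and toy data are assumptions made for illustration, not the settings used in the paper.

```python
import numpy as np

def sgd_svm(X, y, lam=1e-3, epochs=20, eta0=1.0, seed=0):
    """Linear SVM via stochastic (sub)gradient descent on the regularized
    hinge loss  lam/2 * ||w||^2 + mean(max(0, 1 - y*(Xw + b)))."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            eta = eta0 / (1.0 + lam * eta0 * t)   # decaying step size
            margin = y[i] * (X[i] @ w + b)
            if margin < 1:                         # inside margin: hinge subgradient active
                w = (1 - eta * lam) * w + eta * y[i] * X[i]
                b += eta * y[i]
            else:                                  # only the regularizer pulls on w
                w = (1 - eta * lam) * w
    return w, b

# Toy simulated dataset with labels in {-1, +1}.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-1, 1, (200, 2)), rng.normal(+1, 1, (200, 2))])
y = np.r_[-np.ones(200), np.ones(200)]
w, b = sgd_svm(X, y)
print("training accuracy:", np.mean(np.sign(X @ w + b) == y))
```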
This research aims to measure the technical efficiency of the branches of the General Company for Land Transport, which are geographically scattered across the country, using the Data Envelopment Analysis (DEA) technique. This technique relies on measuring the efficiency of a set of decision-making units and is one of the nonparametric mathematical methods whose application is based on linear programming. This helps the General Company for Land Transport diagnose the performance of its branches by benchmarking them against each other and determining the performance gap. The research found that there is variation in the level of efficiency across the company's branches.
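For illustration, the sketch below solves the input-oriented CCR envelopment model with scipy's linear-programming routine. The branch inputs and outputs are hypothetical numbers, not the company's data, and the model choice (CCR formulation, input orientation) is an assumption rather than the exact specification used in the research.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y):
    """Input-oriented CCR efficiency scores.
    X: (n_dmu, n_inputs), Y: (n_dmu, n_outputs).  Returns theta per DMU."""
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for o in range(n):
        # Decision variables: [theta, lambda_1, ..., lambda_n]
        c = np.r_[1.0, np.zeros(n)]
        # Inputs:  sum_j lambda_j * x_ij - theta * x_io <= 0
        A_in = np.hstack([-X[o].reshape(m, 1), X.T])
        b_in = np.zeros(m)
        # Outputs: -sum_j lambda_j * y_rj <= -y_ro
        A_out = np.hstack([np.zeros((s, 1)), -Y.T])
        b_out = -Y[o]
        res = linprog(c,
                      A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.r_[b_in, b_out],
                      bounds=[(None, None)] + [(0, None)] * n,
                      method="highs")
        scores.append(res.x[0])
    return np.array(scores)

# Hypothetical branch data: inputs (staff, fleet size), output (tonnes moved).
X = np.array([[20, 12], [35, 18], [15, 10], [40, 30]], dtype=float)
Y = np.array([[300], [420], [260], [400]], dtype=float)
print(dea_ccr_input(X, Y))   # a score of 1.0 marks a branch on the efficient frontier
```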
Proxy-based sliding mode control (PSMC) is an improved version of PID control that combines the features of PID and sliding mode control (SMC) with continuous dynamic behaviour. However, the stability of this control architecture may not be well addressed. Consequently, this work focuses on modifying the original version of PSMC by adding an adaptive approximation compensator (AAC) term for vibration control of an Euler-Bernoulli beam. The role of the AAC term is to compensate for unmodelled dynamics and to make the stability proof easier. The stability of the proposed control algorithm is systematically proved using Lyapunov theory. The multi-modal equation of motion is derived using the Galerkin method
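The sketch below illustrates only the sliding-mode ingredient, applied to a single vibration mode modelled as a damped oscillator. It is not the authors' PSMC-AAC law; the modal parameters, gains, and the boundary-layer smoothing of the switching term are assumed values chosen for illustration.

```python
import numpy as np

# Single vibration mode modelled as a damped oscillator:
#   q_ddot = -2*zeta*wn*q_dot - wn**2*q + u
wn, zeta = 10.0, 0.02               # hypothetical modal frequency and damping
lam, k_s, phi = 20.0, 50.0, 0.05    # sliding-surface slope, switching gain, boundary layer

dt, T = 1e-4, 2.0
q, qd = 0.01, 0.0                   # initial modal displacement / velocity
history = []
for step in range(int(T / dt)):
    # Sliding surface s = q_dot + lam * q (regulation to zero).
    s = qd + lam * q
    # Equivalent control cancelling the known dynamics, plus a smoothed
    # switching term (tanh instead of sign to avoid chattering).
    u = 2 * zeta * wn * qd + wn**2 * q - lam * qd - k_s * np.tanh(s / phi)
    qdd = -2 * zeta * wn * qd - wn**2 * q + u
    qd += qdd * dt
    q += qd * dt
    history.append(q)

print("final |q|:", abs(history[-1]))   # driven close to zero by the sliding motion
```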
The amount of protein in the serum depends on the balance between the rate of its synthesis and the rate of its catabolism or loss. Abnormal metabolism may result from nutritional deficiency, enzyme deficiency, abnormal hormone secretion, or the actions of drugs and toxins. Renal cancer is the third most common malignancy of the genitourinary system and accounts for 3% of adult malignancies globally. Total serum proteins were measured in patient groups with malignant kidney tumors, benign kidney tumors, and non-tumoral kidney diseases, as well as in healthy individuals. A significant decrease (p < 0.001) in total serum protein levels was found in patients with malignant kidney tumors compared with those with benign tumors, non-tumoral diseases, and healthy individuals
In information security, fingerprint verification is one of the most common recent approaches for verifying human identity through a distinctive pattern. The verification process works by comparing a pair of fingerprint templates and identifying the similarity/matching between them. Several research studies have utilized different techniques for the matching process, such as fuzzy vault and image filtering approaches. Yet, these approaches still suffer from imprecise articulation of the interesting patterns in the biometrics. Deep learning architectures such as the Convolutional Neural Network (CNN) have emerged and been used extensively for image processing and object detection tasks, showing outstanding performance compared
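A minimal sketch of a CNN-based matcher is given below using PyTorch: a small embedding network applied to two fingerprint templates, with cosine similarity as the match score. The architecture, input size (96x96 grayscale patches), and scoring rule are illustrative assumptions, not the network proposed in the study.

```python
import torch
import torch.nn as nn

class FingerprintEmbedder(nn.Module):
    """Tiny CNN that maps a grayscale fingerprint patch to an embedding."""
    def __init__(self, emb_dim=64):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Linear(64, emb_dim)

    def forward(self, x):
        return self.fc(self.features(x).flatten(1))

def match_score(model, img_a, img_b):
    """Cosine similarity between embeddings; thresholding it turns the
    similarity into an accept/reject verification decision."""
    with torch.no_grad():
        ea, eb = model(img_a), model(img_b)
    return nn.functional.cosine_similarity(ea, eb)

model = FingerprintEmbedder()
a = torch.randn(1, 1, 96, 96)   # stand-ins for two preprocessed templates
b = torch.randn(1, 1, 96, 96)
print(match_score(model, a, b))
```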
Artificial intelligence algorithms have been used in many scientific fields in recent years. We suggest employing the flower pollination algorithm in the environmental field to find the best estimate of the semi-parametric regression function with measurement errors in both the explanatory variables and the dependent variable, since measurement errors, rather than exact measurements, appear frequently in fields such as chemistry, the biological sciences, medicine, and epidemiological studies. We estimate the regression function of the semi-parametric model by estimating both the parametric and the non-parametric components; the parametric component is estimated using instrumental-variables methods (the Wald method, Bartlett's method, and Durbin's method
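As a small worked example of one of the listed instrumental-variables estimators, the sketch below applies Wald's grouping method to simulated errors-in-variables data and compares it with the attenuated least-squares slope. The data-generating values are assumptions made for illustration, not the study's data or final estimator.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated linear relation y = a + b*xi, where the true regressor xi
# is observed only through x = xi + measurement error.
n, a_true, b_true = 400, 1.0, 2.5
xi = rng.uniform(0, 10, n)
x = xi + rng.normal(0, 1.0, n)          # error-in-variables regressor
y = a_true + b_true * xi + rng.normal(0, 1.0, n)

# Naive least squares is attenuated by the measurement error.
b_ols = np.cov(x, y, bias=True)[0, 1] / np.var(x)

# Wald's grouping estimator: split at the median of the observed x and
# run a line through the two group means.
hi = x > np.median(x)
b_wald = (y[hi].mean() - y[~hi].mean()) / (x[hi].mean() - x[~hi].mean())
a_wald = y.mean() - b_wald * x.mean()

print("true slope:", b_true)
print("OLS slope :", round(b_ols, 3))
print("Wald slope:", round(b_wald, 3))
```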
This paper introduces a non-conventional approach with multi-dimensional random sampling to solve a cocaine abuse model with statistical probability. The mean Latin hypercube finite difference (MLHFD) method is proposed for the first time via a hybrid integration of the classical numerical finite difference (FD) formula with the Latin hypercube sampling (LHS) technique, creating a random distribution for the time-dependent model parameters. The LHS technique gives the MLHFD method the advantage of producing fast variation of the parameter values via a number of multidimensional simulations (100, 1000 and 5000). The generated Latin hypercube sample, which is random or non-deterministic in nature, is further integrated
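The Latin hypercube sampling step can be sketched with scipy's quasi-Monte Carlo module as below. The number of parameters, their bounds, and the sample size are assumptions for illustration, and the finite-difference solver that would consume each parameter row is omitted.

```python
import numpy as np
from scipy.stats import qmc

# Latin hypercube sample for three hypothetical model parameters,
# each scaled to its own assumed plausible range.
sampler = qmc.LatinHypercube(d=3, seed=0)
unit_sample = sampler.random(n=1000)              # n stratified points in [0, 1)^3
lower, upper = [0.01, 0.1, 0.5], [0.1, 0.5, 2.0]  # assumed parameter bounds
params = qmc.scale(unit_sample, lower, upper)

# Each row is one parameter set; a deterministic finite-difference solver
# would be run once per row and the resulting solutions averaged.
print(params.shape)         # (1000, 3)
print(params.mean(axis=0))  # stratification keeps the means near mid-range
```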