In this research, a nonparametric technique is presented for estimating time-varying coefficient functions from balanced longitudinal data, in which observations are obtained from (n) independent subjects, each measured repeatedly at a set of (m) fixed time points. Although measurements are independent across subjects, they are typically correlated within each subject; the technique applied here is the local linear polynomial kernel (LLPK) technique. To avoid the curse of dimensionality and heavy computation, a two-step method is used to estimate the coefficient functions with the LLPK technique. Because the two-step method relies on ordinary least squares (OLS), which is sensitive to non-normality in the data and to error contamination, robust alternatives such as LAD and M estimation are proposed to make the two-step method resistant to non-normality and contaminated errors. Simulation experiments are carried out to verify the performance of the classical and robust versions of the LLPK technique using two criteria, across different sample sizes and levels of dispersion.
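As a minimal sketch of the two-step idea described above (not the paper's exact estimator): step one fits a pointwise coefficient at each time point, and step two smooths those raw estimates over time with a local linear kernel. The Epanechnikov kernel, the bandwidth, and the simulated data are illustrative assumptions.

```python
import numpy as np

def local_linear_smooth(t_grid, t_obs, raw_beta, h):
    """Local linear kernel (LLPK) smoothing of raw coefficient estimates.

    t_obs    : (m,) design time points
    raw_beta : (m,) pointwise coefficient estimates from step one
    t_grid   : points at which the smoothed coefficient function is evaluated
    h        : kernel bandwidth (an assumed choice)
    """
    kernel = lambda u: 0.75 * np.maximum(1.0 - u**2, 0.0)   # Epanechnikov
    fitted = np.empty(len(t_grid))
    for j, t0 in enumerate(t_grid):
        w = kernel((t_obs - t0) / h)
        # weighted local linear fit: min sum w_i (beta_i - a - b (t_i - t0))^2
        X = np.column_stack([np.ones_like(t_obs), t_obs - t0])
        coef, *_ = np.linalg.lstsq(np.sqrt(w)[:, None] * X,
                                   np.sqrt(w) * raw_beta, rcond=None)
        fitted[j] = coef[0]                                  # smoothed beta(t0)
    return fitted

# Step one (illustrative): pointwise OLS slope of y on x at each time point,
# then step two smooths the raw slopes over time.
rng = np.random.default_rng(0)
m, n = 20, 60
t_obs = np.linspace(0, 1, m)
beta_true = np.sin(2 * np.pi * t_obs)
x = rng.normal(size=(n, m))
y = beta_true * x + rng.normal(scale=0.5, size=(n, m))
raw_beta = np.array([np.polyfit(x[:, j], y[:, j], 1)[0] for j in range(m)])
beta_hat = local_linear_smooth(t_obs, t_obs, raw_beta, h=0.15)
```

Replacing the pointwise OLS fit in step one with an LAD or M fit is what makes the two-step estimator robust, as the abstract notes.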
In recent years, many researchers have developed methods for estimating the self-similarity and long-memory parameter best known as the Hurst parameter. This paper compares nine different methods. Most of them estimate the Hurst parameter from the slope of a log-log regression of deviations, such as rescaled range (R/S), aggregate variance (AV), and absolute moments (AM), while others rely on filtering, such as discrete variations (DV), variance versus level using wavelets (VVL), and the second-order discrete derivative using wavelets (SODDW). The comparison is carried out through a simulation study to identify the most efficient method in terms of MASE. The simulation results show that the performance of the meth…
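The rescaled-range (R/S) method mentioned above is representative of the slope-based estimators: compute the R/S statistic over blocks of increasing size and regress its log against the log block size. A minimal sketch, with block sizes and regression details chosen for illustration:

```python
import numpy as np

def hurst_rs(series, min_block=8):
    """Rescaled-range (R/S) estimate of the Hurst parameter."""
    x = np.asarray(series, dtype=float)
    n = len(x)
    sizes, rs_vals = [], []
    size = min_block
    while size <= n // 2:
        rs_per_block = []
        for start in range(0, n - size + 1, size):
            block = x[start:start + size]
            dev = np.cumsum(block - block.mean())   # cumulative deviations
            r = dev.max() - dev.min()               # range of deviations
            s = block.std(ddof=1)                   # block standard deviation
            if s > 0:
                rs_per_block.append(r / s)
        if rs_per_block:
            sizes.append(size)
            rs_vals.append(np.mean(rs_per_block))
        size *= 2
    slope, _ = np.polyfit(np.log(sizes), np.log(rs_vals), 1)
    return slope                                    # slope ~ Hurst parameter

rng = np.random.default_rng(1)
print(hurst_rs(rng.normal(size=4096)))              # ~0.5 for white noise
```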
Aerial Robot Arms (ARAs) enable aerial drones to interact with and influence objects in various environments. Traditional ARA controllers require a high-precision model to avoid severe control chattering. Furthermore, in practical aerial manipulation tasks, the payloads that ARAs can handle vary with the nature of the task. High uncertainty due to modeling errors and an unknown payload degrades the stability of ARAs. To address this stability issue, a new adaptive robust controller based on a Radial Basis Function (RBF) neural network is proposed, following a three-tier approach. Firstly, a detailed new model for the ARA is derived using the Lagrange–d'A…
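To illustrate the mechanism behind RBF-based adaptive control (not the paper's controller), the sketch below shows a Gaussian RBF network learning an unknown scalar uncertainty term online with a gradient-type weight update; the centers, width, and adaptation gain are assumptions.

```python
import numpy as np

def rbf_features(x, centers, width):
    """Gaussian RBF feature vector phi(x) for a scalar input x."""
    return np.exp(-((x - centers) ** 2) / (2.0 * width ** 2))

centers = np.linspace(-2.0, 2.0, 11)     # assumed center placement
width = 0.4                              # assumed kernel width
weights = np.zeros_like(centers)
gamma = 0.5                              # adaptation gain (assumed)

unknown = lambda x: 0.8 * np.sin(2 * x)  # stands in for model uncertainty
rng = np.random.default_rng(2)
for _ in range(5000):
    x = rng.uniform(-2.0, 2.0)
    phi = rbf_features(x, centers, width)
    error = unknown(x) - weights @ phi   # approximation error
    weights += gamma * error * phi       # adaptive weight update

# After adaptation the network output tracks the unknown term closely.
print(abs(unknown(0.7) - weights @ rbf_features(0.7, centers, width)))
```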
With the explosive expansion of the Internet, worldwide information exchange has increased the use of communication technology, and the rapid growth of data volume raises the need for secure, robust, and trustworthy techniques built on effective algorithms. Many algorithms and techniques are available for data security. This paper presents a cryptosystem that combines several substitution cipher algorithms with the circular queue data structure. The two substitution techniques are the homophonic substitution cipher and the polyalphabetic substitution cipher; they are merged in a single circular queue with four different keys for each of them, which produces eight different outputs for…
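A minimal sketch of the underlying idea, assuming hypothetical keys and rotation policy rather than the paper's actual scheme: a circular queue cycles through several keys so that successive blocks are enciphered under different polyalphabetic keys.

```python
from collections import deque
import string

ALPHABET = string.ascii_uppercase

def poly_substitute(text, key):
    """Polyalphabetic (Vigenère-style) substitution of uppercase text with one key."""
    out = []
    for i, ch in enumerate(text):
        shift = ALPHABET.index(key[i % len(key)])
        out.append(ALPHABET[(ALPHABET.index(ch) + shift) % 26])
    return "".join(out)

# Circular queue of keys (illustrative values, not the paper's keys).
key_queue = deque(["QUEUE", "CIPHER", "ROBUST", "SECURE"])

def encrypt_block(block):
    key = key_queue[0]
    key_queue.rotate(-1)                 # advance the circular queue
    return poly_substitute(block, key)

print(encrypt_block("HELLOWORLD"))
print(encrypt_block("HELLOWORLD"))       # same plaintext, next key in the queue
```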
Cryptography is the process of transforming a message to prevent unauthorized access to data. One of the main problems, and an essential element, in secret-key cryptography is the key itself; a higher level of secure communication depends on it. To increase the level of security in any communication, both parties must hold a copy of the secret key, which, unfortunately, is not easy to achieve. The Triple Data Encryption Standard (3DES) algorithm is weakened by poor key generation, so the key must be reconfigured to make the algorithm more secure, effective, and strong; a well-constructed encryption key enhances 3DES security. This paper proposes a combination of two efficient encryption algorithms to…
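As a sketch of the general idea of strengthening the 3DES key (not the paper's specific key-generation scheme), a passphrase can be stretched into a full-length 24-byte three-key 3DES key with a standard key-derivation function:

```python
import hashlib
import os

def derive_3des_key(passphrase: bytes, salt: bytes) -> bytes:
    """Derive a 24-byte (three-key) 3DES key from a passphrase with PBKDF2.

    Illustrative only; the paper's actual key-reconfiguration method may differ.
    """
    return hashlib.pbkdf2_hmac("sha256", passphrase, salt, 200_000, dklen=24)

salt = os.urandom(16)                      # stored alongside the ciphertext
key = derive_3des_key(b"shared secret", salt)
print(len(key), key.hex())                 # 24 bytes -> three 8-byte DES subkeys
```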
In this paper, Monte Carlo simulation is used to compare the robust circular S estimator with the circular least squares method, both when the data contain no outliers and when outliers are present, under two contamination schemes: the first contaminates high-leverage points, representing contamination in the circular independent variable, and the second contaminates the vertical variable, representing contamination in the circular dependent variable. Three comparison criteria are used: the median standard error (Median SE), the median of the mean squared errors (Median MSE), and the median of the mean cosines of the circular residuals (Median A(k)). It was concluded that the method of least squares is better than the…
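A minimal sketch of how the three criteria could be computed over Monte Carlo replications, under an illustrative reading of their definitions (the paper's exact formulas may differ):

```python
import numpy as np

def circular_criteria(residuals_by_run):
    """Median SE, Median MSE and Median A(k) over Monte-Carlo runs.

    residuals_by_run : list of arrays of angular residuals (radians), one per run.
    """
    se, mse, a_k = [], [], []
    for r in residuals_by_run:
        r = np.asarray(r, dtype=float)
        se.append(r.std(ddof=1) / np.sqrt(len(r)))   # standard error of residuals
        mse.append(np.mean(r ** 2))                  # mean squared error
        a_k.append(np.mean(np.cos(r)))               # mean cosine of residuals
    return np.median(se), np.median(mse), np.median(a_k)

rng = np.random.default_rng(3)
runs = [rng.vonmises(0.0, 4.0, size=50) for _ in range(1000)]
print(circular_criteria(runs))
```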
Collagen triple helix repeat containing-1 (CTHRC1) is an essential marker for rheumatoid arthritis (RA), but its relationship with pro-inflammatory, anti-inflammatory, and inflammatory markers has received little coverage in the literature. The aim was to evaluate the level of CTHRC1 protein in the sera of 100 RA patients and 25 controls and to compare levels of tumour necrosis factor alpha (TNF-α), interleukin 10 (IL-10), RA disease activity (DAS28), and inflammatory factors. Significantly higher serum levels of CTHRC1 (29.367 ng/ml), TNF-α (63.488 pg/ml), and IL-10 (67.1 pg/ml) were found in patient sera compared with control sera (CTHRC1 = 15.732 ng/ml, TNF-α = 33.788 pg/ml, and IL-10 = 25.122 pg/ml). There was no significant correlation be…
A database is an arrangement of data organized and distributed so that the user can access the stored data in a simple and convenient way. However, in the era of big data, traditional data analytics methods may not be able to manage and process such large volumes of data. To develop an efficient way of handling big data, this work studies the use of the MapReduce technique for big data distributed on the cloud. The approach was evaluated on a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear improvement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r…
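The MapReduce pattern itself is simple to illustrate. The sketch below is a pure-Python stand-in (not the Hadoop job used in the study): the map stage emits partial sums per EEG channel and the reduce stage combines them into per-channel means; the record layout is an assumption.

```python
from functools import reduce

# Toy EEG-style records: (channel, sample value).
records = [("C3", 1.2), ("C4", 0.7), ("C3", 0.9), ("C4", 1.1), ("C3", 1.0)]

def mapper(record):
    channel, value = record
    return (channel, (value, 1))             # emit partial sum and count

def reducer(acc, pair):
    channel, (total, count) = pair
    prev_total, prev_count = acc.get(channel, (0.0, 0))
    acc[channel] = (prev_total + total, prev_count + count)
    return acc

partials = map(mapper, records)               # map stage
combined = reduce(reducer, partials, {})      # reduce stage
means = {ch: total / count for ch, (total, count) in combined.items()}
print(means)                                  # per-channel mean amplitude
```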
The current research sought to evaluate employee performance and results according to the "people" and "people results" criteria of the EFQM 2013 European excellence model of the quality management foundation, in the Inspector General's Office / Ministry of Health, in order to pursue modern and advanced management methods for evaluating performance. The Office's performance bears directly on citizens' lives, since its duties now go beyond today's accepted service capabilities to balancing client-service responsibilities with future planning, financial control, competitiveness, human resource needs, maintaining quality, and continuous improvement and development, as well as the primary role of the…