In this research, a nonparametric technique is presented to estimate the time-varying coefficient functions for balanced longitudinal data, characterized by observations obtained from (n) independent subjects, each measured repeatedly at a set of (m) specific time points. Although the measurements are independent across subjects, they are usually correlated within each subject. The applied technique is the local linear kernel (LLPK) technique. To avoid the problems of dimensionality and heavy computation, the two-step method is used to estimate the coefficient functions via this technique. Since the two-step method relies, in estimation, on ordinary least squares (OLS), which is sensitive to non-normality in the data or contamination of the errors, robust methods such as LAD and M estimation are proposed to strengthen the two-step method against non-normality and error contamination. In this research, simulation experiments are performed to verify the performance of the classical and robust methods for the local linear kernel (LLPK) technique using two criteria, for different sample sizes and variance levels.
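As a rough illustration of the local linear kernel idea underlying the LLPK technique, the following sketch fits a kernel-weighted line at each target point and keeps the intercept as the fitted value. This is a minimal sketch with a Gaussian kernel and hypothetical toy data, not the paper's estimator, two-step procedure, or data:

```python
import numpy as np

def local_linear(t_grid, t_obs, y_obs, h):
    """Local linear kernel estimate of m(t) at each point in t_grid.

    For each target point t0, solve a kernel-weighted least-squares fit
    of a line and keep the intercept as the fitted value at t0.
    Gaussian kernel with bandwidth h (both choices are illustrative).
    """
    est = np.empty(len(t_grid))
    for i, t0 in enumerate(t_grid):
        d = t_obs - t0
        w = np.exp(-0.5 * (d / h) ** 2)            # Gaussian kernel weights
        X = np.column_stack([np.ones_like(d), d])  # local linear design
        W = np.diag(w)
        beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y_obs)
        est[i] = beta[0]                           # intercept = fit at t0
    return est

# Toy balanced longitudinal layout: n=5 subjects, m=20 common time points.
rng = np.random.default_rng(0)
t = np.tile(np.linspace(0, 1, 20), 5)
y = np.sin(2 * np.pi * t) + rng.normal(0, 0.1, t.size)
fit = local_linear(np.array([0.25]), t, y, h=0.05)  # true value is 1.0
```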
Over recent years, many researchers have developed methods to estimate the self-similarity and long-memory parameter, best known as the Hurst parameter. In this paper, we compare nine different methods. Most of them use the slope of the deviations to estimate the Hurst parameter, such as rescaled range (R/S), aggregate variance (AV), and absolute moments (AM), while others depend on filtering techniques, such as discrete variations (DV), variance versus level using wavelets (VVL), and second-order discrete derivative using wavelets (SODDW). The comparison is set up as a simulation study to find the most efficient method through MASE. The results of the simulation experiments showed that the performance of the meth…
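The rescaled range (R/S) method named above can be sketched as follows: compute the rescaled range over chunks of increasing size and regress log(R/S) on log(chunk size); the slope estimates H. This is a minimal illustration on white noise, not the paper's implementation or simulation design:

```python
import numpy as np

def hurst_rs(x, min_chunk=8):
    """Rescaled-range (R/S) estimate of the Hurst parameter.

    Splits the series into chunks of doubling sizes, computes R/S for
    each chunk, and regresses log(mean R/S) on log(size); the slope is
    the Hurst estimate.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    sizes, rs_vals = [], []
    size = min_chunk
    while size <= n // 2:
        rs = []
        for start in range(0, n - size + 1, size):
            chunk = x[start:start + size]
            dev = np.cumsum(chunk - chunk.mean())  # cumulative deviations
            r = dev.max() - dev.min()              # range of deviations
            s = chunk.std()                        # scale
            if s > 0:
                rs.append(r / s)
        sizes.append(size)
        rs_vals.append(np.mean(rs))
        size *= 2
    slope, _ = np.polyfit(np.log(sizes), np.log(rs_vals), 1)
    return slope

rng = np.random.default_rng(1)
h = hurst_rs(rng.normal(size=4096))  # white noise: true H = 0.5
```

Note that the plain R/S estimator is known to be biased upward on short series, which is one reason comparison studies like this one are useful.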
In this paper, the Monte-Carlo simulation method is used to compare the robust circular S estimator with the circular least squares method, both without outliers and in the presence of outliers in the data, through two directions of contamination: the first with high-leverage points, representing contamination in the circular independent variable, and the second in the vertical variable, representing contamination in the circular dependent variable. Three comparison criteria are used: the median standard error (Median SE), the median of the mean squared errors (Median MSE), and the median of the mean cosines of the circular residuals (Median A(k)). It was concluded that the least squares method is better than the…
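The Median MSE and Median A(k) criteria can be illustrated with a toy Monte-Carlo run: simulate circular samples, take residuals about the fitted circular mean direction, and record the median of each criterion over replicates. This is a simplified stand-in for the circular regression fits in the study; the von Mises data, concentration, and sample sizes are hypothetical:

```python
import numpy as np

def circ_mean(theta):
    """Circular mean direction via the resultant vector."""
    return np.arctan2(np.sin(theta).mean(), np.cos(theta).mean())

def wrap(a):
    """Wrap angles into (-pi, pi]."""
    return np.arctan2(np.sin(a), np.cos(a))

rng = np.random.default_rng(2)
mse_list, ak_list = [], []
for _ in range(500):                                     # Monte-Carlo replicates
    theta = wrap(1.0 + rng.vonmises(0.0, 4.0, size=50))  # true direction 1.0
    resid = wrap(theta - circ_mean(theta))               # circular residuals
    mse_list.append(np.mean(resid ** 2))                 # per-replicate MSE
    ak_list.append(np.mean(np.cos(resid)))               # A(k): mean cosine
median_mse = np.median(mse_list)
median_ak = np.median(ak_list)   # near 1 means tight residuals
```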
With the rapid expansion of the Internet, worldwide information growth increases the application of communication technology, and the rapid growth of significant data volumes raises the requirement for secure, robust, and reliable techniques based on effective algorithms. Many algorithms and techniques are available for data security. This paper presents a cryptosystem that combines several substitution cipher algorithms with the circular queue data structure. The two substitution techniques, the homophonic substitution cipher and the polyalphabetic substitution cipher, are merged into a single circular queue with four different keys for each of them, producing eight different outputs for…
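A toy sketch of merging a polyalphabetic shift with homophonic substitution via a circular queue of keys might look like this. The key values, homophone table, and rotation rule are hypothetical illustrations, not the paper's construction:

```python
import random
from collections import deque

ALPHABET = "abcdefghijklmnopqrstuvwxyz"

# Circular queue of shift keys: after each character the queue rotates,
# so the active key changes cyclically (the polyalphabetic part).
keys = deque([3, 7, 11, 19])

# Homophonic part: each substituted letter maps to one of two numeric
# symbols, so repeated plaintext letters need not repeat in ciphertext.
HOMOPHONES = {c: (i, i + 100) for i, c in enumerate(ALPHABET)}

def encrypt(plain, queue, rng):
    out = []
    for ch in plain.lower():
        if ch not in ALPHABET:
            continue
        shift = queue[0]
        queue.rotate(-1)                         # advance the circular queue
        sub = ALPHABET[(ALPHABET.index(ch) + shift) % 26]
        out.append(rng.choice(HOMOPHONES[sub]))  # pick a homophone at random
    return out

def decrypt(symbols, queue):
    out = []
    for s in symbols:
        shift = queue[0]
        queue.rotate(-1)
        sub = ALPHABET[s % 100]                  # both homophones map back
        out.append(ALPHABET[(ALPHABET.index(sub) - shift) % 26])
    return "".join(out)

cipher = encrypt("attack at dawn", deque(keys), random.Random(0))
```

Because the receiver replays the same queue rotation, decryption recovers the plaintext regardless of which homophone was chosen for each letter.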
Collagen triple helix repeat containing-1 (CTHRC1) is an essential marker for rheumatoid arthritis (RA), but its relationship with pro-inflammatory, anti-inflammatory, and inflammatory markers has been scantily covered in the literature. The aim was to evaluate the level of CTHRC1 protein in the sera of 100 RA patients and 25 controls and to compare levels of tumour necrosis factor alpha (TNF-α), interleukin 10 (IL-10), RA disease activity (DAS28), and inflammatory factors. Significantly higher serum levels of CTHRC1 (29.367 ng/ml), TNF-α (63.488 pg/ml), and IL-10 (67.1 pg/ml) were found in patient sera compared with control sera (CTHRC1 = 15.732 ng/ml, TNF-α = 33.788 pg/ml, and IL-10 = 25.122 pg/ml). There was no significant correlation be…
Cryptography is the process of transforming a message to prevent unauthorized access to data. One of the main problems, and an important part of cryptography with secret-key algorithms, is the key. The key plays an essential role in achieving a higher level of secure communication. To increase the level of security in any communication, both parties must hold a copy of the secret key, which, unfortunately, is not easy to achieve. The Triple Data Encryption Standard (3DES) algorithm is weakened by its weak key generation, so the key must be reconfigured to make the algorithm more secure, effective, and strong. An enhanced encryption key strengthens the security of the Triple Data Encryption Standard algorithm. This paper proposes a combination of two efficient encryption algorithms to…
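One common way to harden key generation, shown here only as a general illustration and not as the paper's proposed combination, is to stretch a passphrase into three independent 3DES subkeys with PBKDF2 from Python's standard library:

```python
import hashlib

def derive_3des_keys(passphrase: str, salt: bytes, iterations: int = 100_000):
    """Derive 24 bytes of key material and split it into three 8-byte
    DES subkeys (3DES keying option 1: K1, K2, K3 all independent).

    PBKDF2-HMAC-SHA256 stretches the passphrase so the subkeys are
    non-guessable and mutually independent. DES parity-bit adjustment
    is omitted for brevity; parameters here are illustrative defaults.
    """
    material = hashlib.pbkdf2_hmac(
        "sha256", passphrase.encode(), salt, iterations, dklen=24
    )
    return material[:8], material[8:16], material[16:24]

# Same passphrase and salt always yield the same three subkeys.
k1, k2, k3 = derive_3des_keys("correct horse", b"fixed-salt")
```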
The current research sought to evaluate the performance and results of employees in accordance with the "people" and "people results" criteria of the European Foundation for Quality Management excellence model (EFQM 2013) in the Inspector General's Office / Ministry of Health, so as to pursue modern and advanced management methods in performance evaluation. The Office's performance directly affects citizens' lives, since service today goes beyond merely acceptable capabilities: it has become a balance of duties between serving clients, future planning, financial control, competitiveness, human resources needs, maintaining quality, and continuous improvement and development, as well as the primary role of the…
The grey system model GM(1,1) is a time-series prediction model and the basis of grey theory. This research presents methods for estimating the parameters of the grey model GM(1,1): the accumulative method (ACC), the exponential method (EXP), the modified exponential method (Mod EXP), and the particle swarm optimization method (PSO). These methods were compared based on the mean squared error (MSE) and the mean absolute percentage error (MAPE) as comparison criteria, and simulation was adopted to select the best of the four methods. The best method was then applied to real data. This data represents the consumption rate of two types of oils a he…
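A standard GM(1,1) fit, which the four estimation methods above modify, can be sketched as: accumulate the series (AGO), estimate the development coefficient a and grey input b by least squares, and restore forecasts by differencing. This is a textbook sketch on toy data, not the paper's ACC/EXP/Mod EXP/PSO variants:

```python
import numpy as np

def gm11(x0, steps=0):
    """Fit a GM(1,1) grey model by least squares and forecast `steps` ahead.

    x0: original non-negative series. The model accumulates the series
    (AGO), fits dx1/dt + a*x1 = b via the background values, and
    restores forecasts by inverse accumulation (differencing).
    """
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                               # accumulated series (AGO)
    z1 = 0.5 * (x1[1:] + x1[:-1])                    # background values
    B = np.column_stack([-z1, np.ones_like(z1)])
    Y = x0[1:]
    (a, b), *_ = np.linalg.lstsq(B, Y, rcond=None)   # develop coeff a, input b
    k = np.arange(len(x0) + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    x0_hat = np.concatenate([[x0[0]], np.diff(x1_hat)])  # inverse AGO
    return x0_hat, a, b

# Toy check: a near-exponential series is recovered closely.
series = 10 * 1.05 ** np.arange(8)
fit, a, b = gm11(series, steps=2)   # fit plus a 2-step forecast
```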
Construction contractors usually undertake multiple construction projects simultaneously. Such a situation involves sharing different types of resources, including money, equipment, and manpower, which may become a major challenge in many cases. In this study, the financial aspects of working on multiple projects at a time are addressed and investigated. The study deals with financial shortages by proposing a multi-project scheduling optimization model that maximizes profit while minimizing the total project duration. A genetic algorithm and finance-based scheduling are used to produce feasible schedules that balance the financing of activities at any time w…
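The finance-based side of such scheduling can be illustrated with a toy feasibility check: given scheduled activities across projects and a shared credit limit, verify that cumulative borrowing never exceeds the line. The activity encoding, income model, and numbers are hypothetical; in a genetic algorithm, the fitness function could penalize schedules that fail this check:

```python
def cash_profile(activities, horizon):
    """Per-period total outflow for scheduled activities.

    Each activity is (start, duration, cost_per_period) -- a hypothetical
    encoding, not the paper's chromosome representation.
    """
    flow = [0.0] * horizon
    for start, dur, cost in activities:
        for t in range(start, start + dur):
            flow[t] += cost
    return flow

def finance_feasible(activities, horizon, credit_limit, income_per_period):
    """True if cumulative borrowing never exceeds the credit line."""
    balance = 0.0
    for out in cash_profile(activities, horizon):
        balance += income_per_period - out
        if -balance > credit_limit:      # borrowed more than the limit
            return False
    return True

# Two concurrent projects sharing one credit line: overlapping starts
# overdraw the line, but delaying the second project restores feasibility.
overlapping = [(0, 3, 40.0), (1, 2, 30.0)]
staggered   = [(0, 3, 40.0), (3, 2, 30.0)]
ok_overlap = finance_feasible(overlapping, 5, credit_limit=100.0, income_per_period=20.0)
ok_stagger = finance_feasible(staggered, 5, credit_limit=100.0, income_per_period=20.0)
```

This captures the trade-off the optimization model balances: shifting activities later eases financing pressure but lengthens the total project duration.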