In this research, a comparison is made between robust M-estimators for the cubic smoothing splines technique, used to avoid the problems of non-normality of the data or contamination of the errors, and the traditional estimation method of the cubic smoothing splines technique, using two comparison criteria (MADE and WASE) for different sample sizes and levels of dispersion. The aim is to estimate the time-varying coefficient functions of balanced longitudinal data, which are characterized by observations obtained from n independent subjects, each of which is measured repeatedly at a set of m specific time points, so that the repeated measurements within a subject are correlated while measurements from different subjects are independent.
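As an illustrative sketch only (not the study's actual procedure, which is not given in this abstract), an M-type robust smoothing spline can be obtained by iteratively reweighted fitting of an ordinary smoothing spline, here with Huber weights; the weighting constant, smoothing default, and toy contaminated data are all assumptions:

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

def huber_weights(resid, c=1.345):
    """Huber weights: 1 inside [-c*scale, c*scale], shrinking outside.

    The scale is the usual MAD-based robust estimate of the residual
    standard deviation."""
    scale = np.median(np.abs(resid)) / 0.6745 + 1e-12
    r = resid / scale
    w = np.ones_like(r)
    big = np.abs(r) > c
    w[big] = c / np.abs(r[big])
    return w

def robust_smoothing_spline(x, y, s=None, n_iter=10):
    """M-type cubic smoothing spline via iteratively reweighted fitting."""
    w = np.ones_like(y)
    for _ in range(n_iter):
        spl = UnivariateSpline(x, y, w=w, s=s)   # weighted smoothing spline
        w = huber_weights(y - spl(x))            # downweight large residuals
    return spl

# Toy data with a smooth signal, small noise, and a few gross outliers.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 100)
y = np.sin(2 * np.pi * x) + rng.normal(0.0, 0.1, 100)
y[::17] += 3.0                                   # contamination
fit = robust_smoothing_spline(x, y)
```

The reweighting loop is the generic M-estimation recipe: the contaminated points get small weights after the first pass, so later fits are driven by the clean observations.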
Principal components analysis is used in analyzing many economic and social phenomena, one of which concerns a large group in our society, university instructors: the delay an instructor experiences in obtaining his or her next scientific title. Since determining the number of principal components to retain can be done by many methods, we have compared three of them: Bartlett's test, the scree diagram, and Jolliffe's criterion.
We concluded that Jolliffe's method was the best of the three for analyzing the data of the phenomenon under study, and we found the most significant factors affecting t…
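As a rough sketch of one of the three compared rules, under its usual formulation (retain components whose eigenvalue exceeds 0.7 times the average eigenvalue of the correlation matrix), Jolliffe's criterion can be written as follows; the cutoff and the toy data are illustrative, not taken from the study:

```python
import numpy as np

def jolliffe_components(X, cutoff=0.7):
    """Number of principal components to retain under Jolliffe's criterion:
    keep components whose eigenvalue exceeds `cutoff` times the average
    eigenvalue (the average is 1 for a correlation matrix)."""
    R = np.corrcoef(X, rowvar=False)          # correlation matrix of columns
    eigvals = np.linalg.eigvalsh(R)[::-1]     # eigenvalues, descending
    return int(np.sum(eigvals > cutoff * eigvals.mean()))
```

By contrast, Kaiser's well-known rule uses a cutoff of 1; Jolliffe argued this discards too much and proposed 0.7 instead.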
In recent decades, genetic methods have developed into a potent tool in a number of life-science applications. In research on population genetic diversity, QTL detection, marker-assisted selection, and food traceability, DNA-based technologies such as PCR are being employed more and more. These approaches call for extraction procedures that provide efficient nucleic acid recovery and the elimination of PCR inhibitors. The extraction of DNA from cells is the first and most important stage in molecular biology. For a molecular scientist, the high quality and integrity of the isolated DNA, as well as the ease of use and affordability of the extraction method, are crucial factors. The present study was designed to establish a simple, fast
A skip list data structure is really just a probabilistic simulation of a binary search tree. Skip list algorithms are simpler and faster and use less space. Conceptually, this data structure uses parallel sorted linked lists. Searching in a skip list is faster than searching in a regular sorted linked list. Because a skip list is a two-dimensional data structure, it is implemented using a two-dimensional network of nodes with four pointers. The search, insert, and delete operations each take an expected time of O(log n). The skip list can be modified to implement the order-statistic operations RANK and SEARCH BY RANK while maintaining the same expected time. Keywords: skip list, parallel linked list, randomized algorithm, rank.
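The structure described above can be sketched as follows; this is a minimal illustration with only search and insert, and the level cap and promotion probability are conventional choices, not taken from the paper:

```python
import random

class Node:
    def __init__(self, key, level):
        self.key = key
        self.forward = [None] * (level + 1)   # one forward pointer per level

class SkipList:
    """Minimal skip list: expected O(log n) search and insert."""
    MAX_LEVEL = 16
    P = 0.5                                   # promotion probability

    def __init__(self):
        self.head = Node(None, self.MAX_LEVEL)
        self.level = 0

    def _random_level(self):
        lvl = 0
        while random.random() < self.P and lvl < self.MAX_LEVEL:
            lvl += 1
        return lvl

    def search(self, key):
        node = self.head
        for i in range(self.level, -1, -1):   # drop from the top level down
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]
        node = node.forward[0]
        return node is not None and node.key == key

    def insert(self, key):
        update = [self.head] * (self.MAX_LEVEL + 1)
        node = self.head
        for i in range(self.level, -1, -1):   # remember the rightmost node
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]
            update[i] = node                  # visited on each level
        lvl = self._random_level()
        self.level = max(self.level, lvl)
        new = Node(key, lvl)
        for i in range(lvl + 1):              # splice in at each of its levels
            new.forward[i] = update[i].forward[i]
            update[i].forward[i] = new
```

Each node here carries one pointer per level rather than the four-pointer node of the paper's two-dimensional network, but the search path (top level down, then right) is the same.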
The effect of resistance training with and against the direction of movement on some physical abilities and biomechanical variables of the 100-meter sprint for young runners. Training with runs over different distances using rubber ropes, applied with and against the direction of movement and within the scientifically established limits of the training-load components, represents a training approach aimed at developing the link between the start and the run in accordance with the specific mechanical requirements. Such training targets the development of explosive and rapid strength and their components, which raise the level of special speed for the phases of the 100 m sprint and supply the instantaneous power the required efforts demand. The researcher noted that there is a repetition of
In this paper, the human robotic leg, which can be represented mathematically by a single-input single-output (SISO) nonlinear differential model with one degree of freedom, is analyzed, and then a simple hybrid neuro-fuzzy controller is designed to improve the performance of this human robotic leg model. This controller consists of a SISO fuzzy proportional-derivative (FPD) controller with nine rules, summed with a single-node neural integral-derivative (NID) controller with a nonlinear function. The MATLAB simulation results for the nonlinear robotic leg model with the suggested controller showed the efficiency of this controller when compared with the results for the leg model controlled by PI+2D, PD+NID, and F
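The controller itself is not specified in this excerpt. Purely as an illustration of the summing structure it describes (a PD term added to a nonlinearly squashed integral term), the following toy simulation regulates a hypothetical one-degree-of-freedom pendulum-like leg model; every gain and plant parameter here is invented for the sketch and none is taken from the paper:

```python
import math

def simulate(t_end=5.0, dt=0.001, target=0.5):
    """Toy 1-DOF leg: I*theta'' = -m*g*l*sin(theta) - b*theta' + u,
    with control u = (PD term) + tanh-squashed integral term.
    Returns the final joint angle theta (rad)."""
    m, l, g, b = 1.0, 0.5, 9.81, 0.5          # hypothetical plant parameters
    kp, kd, ki = 80.0, 8.0, 20.0              # hypothetical gains
    inertia = m * l * l
    theta, omega, integ, prev_e = 0.0, 0.0, 0.0, target
    for _ in range(int(t_end / dt)):
        e = target - theta
        integ += e * dt
        de = (e - prev_e) / dt
        prev_e = e
        u = kp * e + kd * de + math.tanh(ki * integ)  # PD + squashed integral
        alpha = (-m * g * l * math.sin(theta) - b * omega + u) / inertia
        omega += alpha * dt                   # forward-Euler integration
        theta += omega * dt
    return theta
```

The `tanh` plays the role of the single nonlinear neuron: it bounds the integral action, so a small steady-state error remains when the gravity torque exceeds what the saturated term can supply.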
This research introduces a study, with an application, of principal component regression, in which the regression is obtained from some of the principal components of the explanatory variables in order to limit the multicollinearity problem among these variables and to gain more stability in the estimates than is obtained from ordinary least squares. The cost paid, on the other hand, is the loss of a little of the predictive power of the estimated regression function in explaining the essential variation. A numerical formula has been proposed and applied by the researchers as an optimal solution, and its efficiency was verified through a program written by the researchers themselves for this purpose using several criteria: cumulative percentage of variance, coefficient of determination, variance
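A minimal sketch of principal component regression itself (not the researchers' proposed numerical formula, which is not given in this abstract), assuming standardized predictors and a user-chosen number of components k:

```python
import numpy as np

def pcr_fit(X, y, k):
    """Principal component regression sketch: regress y on the first k
    principal components of the standardized X, then map the coefficients
    back to the (standardized) original variables."""
    Xs = (X - X.mean(0)) / X.std(0)            # standardize the columns
    U, s, Vt = np.linalg.svd(Xs, full_matrices=False)
    Z = Xs @ Vt[:k].T                          # scores on the first k PCs
    gamma, *_ = np.linalg.lstsq(Z, y - y.mean(), rcond=None)
    return Vt[:k].T @ gamma                    # coefficients for Xs

# Collinear toy data: x2 is nearly a copy of x1.
rng = np.random.default_rng(0)
x1 = rng.normal(size=300)
x2 = x1 + 1e-3 * rng.normal(size=300)
x3 = rng.normal(size=300)
X = np.column_stack([x1, x2, x3])
y = 2 * x1 + 0.5 * x3 + 0.1 * rng.normal(size=300)
beta = pcr_fit(X, y, k=2)   # drop the near-degenerate third component
```

Dropping the smallest-variance component removes exactly the direction (x1 minus x2) that makes the OLS estimates unstable, which is the trade-off the abstract describes: slightly less explained variation in exchange for stable coefficients.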
… in order to increase the level of security, as this system encrypts the secret image before sending it over the internet to the recipient (by the Blowfish method). The Blowfish method is known for its efficient security; nevertheless, its encryption time is long. In this research we apply a smoothing filter to the secret image, which decreases its size and consequently decreases the encryption and decryption times. After encryption, the secret image is hidden inside another image, called the cover image, using one of two methods: "Two-LSB" or "hiding most bits in the blue pixels". Finally, we compare the results of the two methods to determine which one is better to use according to the PSNR measure.
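As an illustrative sketch of the "Two-LSB" idea (the paper's exact embedding layout is not specified in this abstract), two secret bits can be written into the two least-significant bits of each cover pixel:

```python
import numpy as np

def embed_2lsb(cover, secret_bits):
    """Hide a bit stream in the two least-significant bits of the cover
    pixels, two bits per pixel. `secret_bits` must have even length."""
    flat = cover.flatten()                     # flatten() copies the cover
    pairs = secret_bits.reshape(-1, 2)
    if len(pairs) > len(flat):
        raise ValueError("cover too small for the secret bit stream")
    vals = pairs[:, 0] * 2 + pairs[:, 1]       # pack each bit pair as 0..3
    flat[:len(vals)] = (flat[:len(vals)] & 0b11111100) | vals
    return flat.reshape(cover.shape)

def extract_2lsb(stego, n_bits):
    """Recover the first n_bits hidden by embed_2lsb."""
    vals = stego.flatten()[: n_bits // 2] & 0b11
    return np.column_stack([vals >> 1, vals & 1]).reshape(-1)
```

Changing only the two low bits perturbs each pixel value by at most 3, which is why LSB-style hiding keeps the PSNR of the stego image high.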
The research aims to explain the role of big data analytics in measuring quality costs in the Iraqi company for seed production. The research problem was diagnosed as the weakness of the approved method of measuring quality costs and the weakness of the traditional data-analysis systems. In the theoretical part the researcher relied on collecting sources and previous studies, while in the practical part the applied analytical approach was adopted: a set of financial analyses was applied in measuring quality costs, demonstrating the role of data analytics in the practical side. The research arrived at a set of conclusions
Consider a simple connected (p, q) graph, with p vertices and q edges. The sum of the absolute values of the spectrum of the quotient matrix of a graph makes up the graph's quotient energy. The objective of this study is to examine the quotient energy of identity graphs and of zero-divisor graphs of commutative rings using group theory, graph theory, and their applications. In this study, the identity graphs derived from a group and a few classes of zero-divisor graphs of the commutative ring R are examined.
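For reference, and assuming the standard construction (the abstract does not spell it out): given a partition V_1, …, V_k of the vertex set of a graph G, the quotient matrix and the quotient energy described above can be written as

```latex
% Quotient matrix of G with respect to a vertex partition V_1, ..., V_k:
% entry b_{ij} is the average number of neighbours in V_j of a vertex in V_i.
B = (b_{ij})_{k \times k}, \qquad
b_{ij} = \frac{1}{|V_i|} \sum_{v \in V_i} \bigl| N(v) \cap V_j \bigr|

% Quotient energy: the sum of the absolute values of the eigenvalues
% \lambda_1, ..., \lambda_k of B.
E_Q(G) = \sum_{i=1}^{k} |\lambda_i|
```

When the partition is equitable (every vertex of V_i has the same number of neighbours in V_j), the eigenvalues of B form a subset of the spectrum of G, which is what makes the quotient energy tractable for highly structured graphs such as identity graphs and zero-divisor graphs.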