In this paper, Monte-Carlo simulation was used to compare the robust circular S estimator with the circular least squares method, both in the absence and in the presence of outliers, under two contamination scenarios: contamination at high leverage points, representing contamination in the circular independent variable, and vertical contamination, representing contamination in the circular dependent variable. Three comparison criteria were used: the median standard error (Median SE), the median of the mean squared errors (Median MSE), and the median of the mean cosines of the circular residuals (Median A(k)). It was concluded that least squares outperforms the robust circular S method when the data contain no outliers, since it recorded the lowest Median MSE, the lowest Median SE, and the largest Median A(k) for all proposed sample sizes (n = 20, 50, 100). Under vertical contamination, the circular least squares method is not preferred at any contamination rate or sample size; the higher the contamination rate in the vertical variable, the stronger the preference for the robust estimation method, whose Median MSE and Median SE decrease while its Median A(k) increases, for all proposed sample sizes.
Under contamination at high leverage points, the circular least squares method is likewise strongly not preferred at any contamination level or sample size; the higher the contamination rate at the leverage points, the stronger the preference for the robust estimation method, with Median MSE and Median SE decreasing and Median A(k) increasing for all sample sizes.
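The comparison design described above can be sketched as a small Monte-Carlo experiment. The sketch below is illustrative only: it uses ordinary linear regression rather than circular regression, and substitutes the Theil-Sen estimator as a stand-in for the paper's robust circular S estimator, which is not reproduced here. The two contamination modes mirror the vertical and leverage-point scenarios, and performance is summarized with the Median MSE criterion.

```python
import numpy as np

rng = np.random.default_rng(0)

def ols_fit(x, y):
    # ordinary least squares intercept and slope
    X = np.column_stack([np.ones_like(x), x])
    return np.linalg.lstsq(X, y, rcond=None)[0]

def theil_sen_fit(x, y):
    # robust stand-in estimator: median of pairwise slopes
    n = len(x)
    slopes = [(y[j] - y[i]) / (x[j] - x[i])
              for i in range(n) for j in range(i + 1, n) if x[j] != x[i]]
    b = np.median(slopes)
    a = np.median(y - b * x)
    return np.array([a, b])

def simulate(n=50, reps=200, eps=0.0, mode="vertical"):
    """Median MSE of each estimator over `reps` replications with an
    eps fraction of contaminated observations."""
    mse = {"OLS": [], "TS": []}
    xg = np.linspace(0, 10, 50)
    yg = 1.0 + 2.0 * xg                       # true regression line
    for _ in range(reps):
        x = rng.uniform(0, 10, n)
        y = 1.0 + 2.0 * x + rng.normal(0, 1, n)
        k = int(eps * n)
        if k and mode == "vertical":
            y[:k] += 30.0                     # outliers in the response
        elif k and mode == "leverage":
            x[:k] += 50.0                     # high leverage points in x
        for name, fit in (("OLS", ols_fit), ("TS", theil_sen_fit)):
            a, b = fit(x, y)
            mse[name].append(np.mean((a + b * xg - yg) ** 2))
    return {name: np.median(v) for name, v in mse.items()}  # Median MSE

clean = simulate(eps=0.0)
dirty = simulate(eps=0.10, mode="vertical")
```

With clean data both estimators track the true line closely, while under 10% vertical contamination the robust fit retains a small Median MSE and the least squares fit deteriorates, which is the qualitative pattern the abstract reports.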
Video streaming is the most popular medium used on the internet today. Nevertheless, it consumes much of the internet's traffic: nearly 70% of internet usage goes to video streaming. Some constraints of interactive media, such as increased bandwidth usage and latency, need to be addressed. The need for real-time transmission of live video streams motivates the use of fog computing, an intermediary layer between the cloud and the end user. This technology was introduced to alleviate those problems by providing fast real-time response and computational resources near to the
Thirty adult New Zealand rabbits were used in this study, divided into two groups (control and treated with a helium-neon laser). A square skin flap was made on the medial aspect of the auricle on both sides; a square piece of cartilage was incised, peeled out from each auricle, and fixed in the site of the other, then the flaps were sutured. The operation site in the rabbits of the treated group was irradiated with a helium-neon laser at 5 mW power for 10 days, beginning directly after the operation. Three rabbits from each group were used to collect specimens for histopathological examination at weeks 1, 2, 3, 4, and 6 post-operation. The results revealed early invasion of the matrix with elastic fibers, which continue to t
Many academics have concentrated on applying machine learning to retrieve information from databases to help researchers perform better. A difficult issue in prediction models is selecting practical strategies that yield satisfactory forecast accuracy. Traditional software testing techniques have been extended to testing machine learning systems; however, they are insufficient for the latter because of the diversity of problems that machine learning systems create. Hence, the proposed methodologies were used to predict flight prices. A variety of artificial intelligence algorithms are used to attain the required accuracy, such as Bayesian modeling techniques, Stochastic Gradient Descent (SGD), Adaptive Boosting (ADA), and Decision Tre
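A minimal sketch of the kind of comparison the abstract describes is given below, assuming scikit-learn and a synthetic flight-price dataset (the paper's actual data and feature set are not available here); the feature names `days_ahead`, `duration`, and `stops` are hypothetical.

```python
import numpy as np
from sklearn.ensemble import AdaBoostRegressor
from sklearn.linear_model import SGDRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
n = 2000
# synthetic stand-in features for a flight-price dataset
days_ahead = rng.integers(1, 90, n)        # booking lead time
duration = rng.uniform(1, 12, n)           # flight hours
stops = rng.integers(0, 3, n)              # number of stops
price = 50 + 40 * duration + 60 * stops - 1.2 * days_ahead + rng.normal(0, 20, n)
X = np.column_stack([days_ahead, duration, stops])

X_tr, X_te, y_tr, y_te = train_test_split(X, price, random_state=0)
models = {
    # SGD is scale-sensitive, so it is wrapped with standardization
    "SGD": make_pipeline(StandardScaler(), SGDRegressor(max_iter=2000, random_state=0)),
    "ADA": AdaBoostRegressor(n_estimators=100, random_state=0),
}
scores = {}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    scores[name] = r2_score(y_te, model.predict(X_te))
```

Each candidate algorithm is scored on a held-out test split, mirroring the accuracy-based comparison of prediction strategies described above.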
This paper proposes a theoretical treatment of an underwater wireless optical communication (UWOC) system with different modulation schemes using multiple-input multiple-output (MIMO) technology in coastal water. MIMO technology provides high-speed data rates over longer link distances. The technique is employed to assess the system by BER, Q-factor, and data rate under coastal water types. The reliability of the system is examined with the 1Tx/1Rx, 2Tx/2Rx, 3Tx/3Rx, and 4Tx/4Rx configurations. The results show that the proposed MIMO technique achieves better performance than the other techniques in terms of BER. Theoretical results were obtained to compare PIN and APD
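The BER benefit of adding transmit/receive branches can be illustrated with a small Monte-Carlo sketch. The model below is an assumption, not the paper's channel: OOK signaling over independent unit-mean lognormal fading per Tx-Rx path, repetition coding across transmitters, and equal-gain combining at the receiver.

```python
import numpy as np
from scipy.special import erfc

rng = np.random.default_rng(2)

def qfunc(x):
    # Gaussian Q-function
    return 0.5 * erfc(x / np.sqrt(2))

def ber_ook(n_tx, n_rx, snr_db, sigma_x=0.3, trials=200_000):
    """Monte-Carlo BER of OOK over i.i.d. lognormal fading with
    repetition coding across transmitters and equal-gain combining."""
    snr = 10.0 ** (snr_db / 10.0)
    paths = n_tx * n_rx
    # unit-mean lognormal irradiance per Tx-Rx path
    # (log-amplitude standard deviation sigma_x)
    h = rng.lognormal(mean=-2 * sigma_x**2, sigma=2 * sigma_x,
                      size=(trials, paths))
    g = h.mean(axis=1)                        # combined, normalized gain
    # conditional BER averaged over the fading realizations
    return qfunc(np.sqrt(snr) * g).mean()

ber_1x1 = ber_ook(1, 1, snr_db=10)
ber_2x2 = ber_ook(2, 2, snr_db=10)
```

Averaging over more independent paths shrinks the fading fluctuations of the combined gain, so the 2Tx/2Rx configuration yields a lower BER than 1Tx/1Rx at the same SNR, consistent with the trend the abstract reports.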
Abstract: Word sense disambiguation (WSD) is a significant field in computational linguistics, as it is indispensable for many language-understanding applications. Automatic processing of documents is made difficult by the fact that many of the terms they contain are ambiguous. WSD systems try to resolve these ambiguities and find the correct meaning. Genetic algorithms can be applied to this problem, since they have been used effectively for many optimization problems. In this paper, a genetic algorithm is proposed to solve the word sense disambiguation problem by automatically selecting the intended meaning of a word in context without any additional resource. The proposed algorithm is evaluated on a col
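A toy version of the idea can be sketched as follows. The sense inventory, gloss sets, and fitness function below are all illustrative assumptions (a simplified Lesk-style gloss-overlap fitness), not the paper's actual algorithm; a chromosome assigns one candidate sense index to each ambiguous word, and standard GA operators evolve the assignment.

```python
import random

random.seed(0)

# hypothetical inventory: each ambiguous word has candidate senses,
# represented here as gloss word-sets
SENSES = {
    "bank":     [{"money", "deposit", "account"}, {"river", "slope", "water"}],
    "deposit":  [{"money", "bank", "payment"},    {"sediment", "river", "layer"}],
    "interest": [{"money", "rate", "bank"},       {"hobby", "attention"}],
}
WORDS = list(SENSES)

def fitness(chrom):
    # simplified Lesk: sum of pairwise gloss overlaps of the chosen senses
    total = 0
    for i in range(len(WORDS)):
        for j in range(i + 1, len(WORDS)):
            gi = SENSES[WORDS[i]][chrom[i]]
            gj = SENSES[WORDS[j]][chrom[j]]
            total += len(gi & gj)
    return total

def evolve(pop_size=20, gens=30, pmut=0.2):
    pop = [[random.randrange(len(SENSES[w])) for w in WORDS]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        next_pop = pop[:2]                      # elitism: keep the two best
        while len(next_pop) < pop_size:
            a, b = random.sample(pop[:10], 2)   # select among the fittest
            cut = random.randrange(1, len(WORDS))
            child = a[:cut] + b[cut:]           # one-point crossover
            if random.random() < pmut:          # mutation: resample one gene
                k = random.randrange(len(WORDS))
                child[k] = random.randrange(len(SENSES[WORDS[k]]))
            next_pop.append(child)
        pop = next_pop
    return max(pop, key=fitness)

best = evolve()
```

On this toy inventory the GA converges to the "money" senses for all three words, since those glosses overlap most with each other.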
With the widespread exchange of private information in various communication applications, securing it has become a top priority. In this research, a new approach to encrypting text messages based on genetic algorithm operators is proposed. The approach follows a new algorithm that generates an 8-bit chromosome to encrypt plain text after randomly selecting a crossover point. The resulting child code is then flipped by one bit using the mutation operation. Two simulations were conducted to evaluate the performance of the proposed approach, covering encryption/decryption execution time and throughput. The simulation results demonstrate the robustness of the proposed approach, producing better performance on all evaluation metrics with res
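One reversible interpretation of GA-operator encryption can be sketched as follows. This is not the paper's scheme: to keep decryption possible, the crossover point here is derived from the key byte rather than chosen at random, both crossover children are kept as ciphertext (doubling its size), and the mutation always flips the least significant bit.

```python
def masks(p):
    # split an 8-bit value at crossover point p (1..7)
    lo = (1 << p) - 1
    return 0xFF ^ lo, lo                # hi_mask, lo_mask

def encrypt(plain: bytes, key: bytes) -> bytes:
    out = bytearray()
    for i, b in enumerate(plain):
        k = key[i % len(key)]
        p = (k % 7) + 1                 # crossover point derived from key
        hi, lo = masks(p)
        c1 = (b & hi) | (k & lo)        # one-point crossover children
        c2 = (k & hi) | (b & lo)        # of plaintext byte and key byte
        c1 ^= 0x01                      # mutation: flip one bit
        out += bytes([c1, c2])
    return bytes(out)

def decrypt(cipher: bytes, key: bytes) -> bytes:
    out = bytearray()
    for i in range(0, len(cipher), 2):
        k = key[(i // 2) % len(key)]
        p = (k % 7) + 1
        hi, lo = masks(p)
        c1 = cipher[i] ^ 0x01           # undo the mutation
        c2 = cipher[i + 1]
        # recombine the plaintext bits from the two children
        out.append((c1 & hi) | (c2 & lo))
    return bytes(out)

message = b"secret message"
restored = decrypt(encrypt(message, b"k3y"), b"k3y")
```

The round trip recovers the plaintext exactly; the 2x ciphertext expansion is the cost of making the crossover operation invertible in this sketch.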
Abstract
The current research aims to examine the effect of the Adi and Shayer model on the achievement of fifth-grade students and their attitudes toward history. To achieve the research objective, the researcher adopted two null hypotheses: 1) there is no statistically significant difference at the (0.05) level between the average scores of the experimental-group students, who study the history of Europe and modern American history according to the Adi and Shayer model, and the average scores of the control-group students, who study the same subjects by the traditional method, on the post-achievement test; 2) there is no statistically significant difference at the level (
Recently, wireless communication environments with high speeds and low complexity have become increasingly essential. Free-space optics (FSO) has emerged as a promising solution for providing direct connections between devices in such high-spectrum wireless setups. However, FSO communications are susceptible to weather-induced signal fluctuations, leading to fading and signal weakness at the receiver. To mitigate these challenges, several mathematical models have been proposed to describe the transition from weak to strong atmospheric turbulence, including the Rayleigh, lognormal, Málaga, Nakagami-m, K, Weibull, Negative-Exponential, Inverse-Gaussian, Gamma-Gamma (G-G), and Fisher-Snedecor F distributions. This paper extensive
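Two of the turbulence models named above can be sketched numerically. The snippet below, a sketch under standard textbook parameterizations rather than anything taken from this paper, samples unit-mean lognormal irradiance (weak turbulence) and Gamma-Gamma irradiance (moderate-to-strong turbulence, as a product of two unit-mean Gamma variates) and checks the empirical scintillation index against its closed form.

```python
import numpy as np

rng = np.random.default_rng(4)
N = 1_000_000

def scint_index(I):
    # scintillation index: normalized variance of irradiance
    return I.var() / I.mean() ** 2

# lognormal model (weak turbulence), unit-mean irradiance,
# log-irradiance variance sigma2
sigma2 = 0.2
I_ln = rng.lognormal(mean=-sigma2 / 2, sigma=np.sqrt(sigma2), size=N)
si_ln_theory = np.exp(sigma2) - 1

# Gamma-Gamma model: product of two unit-mean Gamma variates for
# large-scale (alpha) and small-scale (beta) eddies
alpha, beta = 4.0, 2.0
I_gg = rng.gamma(alpha, 1 / alpha, N) * rng.gamma(beta, 1 / beta, N)
si_gg_theory = 1 / alpha + 1 / beta + 1 / (alpha * beta)
```

The empirical scintillation indices match the closed-form values, and the Gamma-Gamma parameters chosen here produce noticeably stronger fluctuations than the weak-turbulence lognormal case, which is the regime distinction the listed distributions are meant to capture.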