In this paper, Monte-Carlo simulation was used to compare the robust circular S estimator with the circular least-squares method, both for clean data and for data containing outliers. Contamination was introduced in two ways: through high-leverage points, representing contamination in the circular independent variable, and through vertical outliers, representing contamination in the circular dependent (response) variable. Three comparison criteria were used: the median standard error (Median SE), the median of the mean squared error (Median MSE), and the median of the mean cosine of the circular residuals (Median A(k)). It was concluded that when the data contain no outliers, the least-squares method outperforms the robust circular S method, recording the lowest Median MSE, the lowest Median SE, and the largest Median A(k) for all proposed sample sizes (n = 20, 50, 100). Under vertical contamination, the circular least-squares method is not preferred at any contamination rate or sample size; as the percentage of contamination in the vertical data grows, the robust estimation methods become increasingly preferable, their Median MSE and Median SE decreasing while their Median A(k) increases for all proposed sample sizes.
Under contamination at high-leverage points, the circular least-squares method is strongly disfavoured at all contamination levels and for all sample sizes; the higher the percentage of contamination at the leverage points, the stronger the preference for the robust estimation methods, with Median MSE and Median SE decreasing and Median A(k) increasing for all sample sizes.
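The shape of such a simulation study can be sketched in the linear setting. The circular S estimator itself is not reproduced here; as a stand-in robust method the sketch uses the Theil-Sen median-slope estimator, and vertical contamination is injected into the response. All parameter values (true slope, contamination rate, shift size) are illustrative assumptions, not the paper's setup.

```python
import random
import statistics

def ols_slope(x, y):
    """Ordinary least-squares slope via the closed-form estimator."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
           sum((xi - mx) ** 2 for xi in x)

def theil_sen_slope(x, y):
    """Robust slope: median of all pairwise slopes (Theil-Sen)."""
    slopes = [(y[j] - y[i]) / (x[j] - x[i])
              for i in range(len(x)) for j in range(i + 1, len(x))
              if x[j] != x[i]]
    return statistics.median(slopes)

random.seed(1)
true_slope, n, reps, contam = 2.0, 50, 200, 0.2
ols_err, ts_err = [], []
for _ in range(reps):
    x = [random.uniform(0, 10) for _ in range(n)]
    y = [true_slope * xi + random.gauss(0, 1) for xi in x]
    # vertical contamination: shift a random 20% of responses far upward
    for i in random.sample(range(n), int(contam * n)):
        y[i] += random.gauss(30, 5)
    ols_err.append((ols_slope(x, y) - true_slope) ** 2)
    ts_err.append((theil_sen_slope(x, y) - true_slope) ** 2)

# compare median squared slope error across replications
print(statistics.median(ols_err), statistics.median(ts_err))
```

With vertical contamination, the robust estimator's median squared error stays close to the clean-data level while the least-squares error inflates, mirroring the pattern the paper reports.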
Nowadays it is common to transmit data over the internet, which makes secure online communication essential: data sent over internet channels must remain confidential and its integrity protected from unauthorized individuals. The two most common techniques for providing security are cryptography and steganography. Cryptography converts data from a readable format into an unreadable one, while steganography hides sensitive information inside digital media such as images, audio, and video. The proposed system uses both encryption and hiding techniques. This study presents encryption using the S-DES algorithm, which generates a new key in each cycle
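For reference, the S-DES key schedule that derives the two round subkeys from a 10-bit key can be sketched as follows. This uses the standard textbook P10/P8 permutation tables; the paper's per-cycle key-generation modification is not shown.

```python
# Standard S-DES key-schedule permutation tables (1-indexed positions)
P10 = (3, 5, 2, 7, 4, 10, 1, 9, 8, 6)
P8 = (6, 3, 7, 4, 8, 5, 10, 9)

def permute(bits, table):
    """Reorder bits according to a 1-indexed permutation table."""
    return [bits[i - 1] for i in table]

def rotl(half, n):
    """Circular left shift of a 5-bit half."""
    return half[n:] + half[:n]

def sdes_subkeys(key10):
    """S-DES key schedule: P10 -> split -> LS-1 -> P8 gives K1;
    a further LS-2 on each half -> P8 gives K2."""
    p = permute(key10, P10)
    l, r = p[:5], p[5:]
    l1, r1 = rotl(l, 1), rotl(r, 1)
    k1 = permute(l1 + r1, P8)
    l2, r2 = rotl(l1, 2), rotl(r1, 2)
    k2 = permute(l2 + r2, P8)
    return k1, k2

key = [1, 0, 1, 0, 0, 0, 0, 0, 1, 0]   # textbook example key 1010000010
k1, k2 = sdes_subkeys(key)
print(k1, k2)
```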
Using the 5S method to enhance the lean production environment
Abstract
The non-homogeneous Poisson process is a statistical topic of importance to other sciences, with wide application in areas such as queueing systems, repairable systems, computer and communication systems, and reliability theory, among many others. It is also used to model phenomena whose rate of occurrence is not constant over time (events that change with time).
This research deals with some of the basic concepts related to the non-homogeneous Poisson process. It applies two models of the non-homogeneous Poisson process, the power-law model and the Musa-Okumoto model, to estimate th
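The power-law model mentioned above has intensity λ(t) = (β/θ)(t/θ)^(β−1) and mean function m(t) = (t/θ)^β. A minimal sketch of simulating such a process by inverting m(t), then recovering β and θ with the standard time-truncated maximum-likelihood estimators, might look like this (the parameter values are illustrative assumptions, not the paper's data):

```python
import math
import random

def simulate_plp(beta, theta, t_end, rng):
    """Power-law NHPP with mean function m(t) = (t/theta)**beta.
    Inverting m maps arrival times G_i of a unit-rate homogeneous
    Poisson process to event times t_i = theta * G_i**(1/beta)."""
    times, g = [], 0.0
    while True:
        g += rng.expovariate(1.0)
        t = theta * g ** (1.0 / beta)
        if t > t_end:
            return times
        times.append(t)

def plp_mle(times, t_end):
    """Time-truncated MLEs: beta_hat = n / sum(ln(T/t_i)),
    theta_hat = T / n**(1/beta_hat)."""
    n = len(times)
    beta_hat = n / sum(math.log(t_end / t) for t in times)
    theta_hat = t_end / n ** (1.0 / beta_hat)
    return beta_hat, theta_hat

rng = random.Random(0)
events = simulate_plp(beta=1.5, theta=10.0, t_end=200.0, rng=rng)
b, th = plp_mle(events, 200.0)
print(len(events), round(b, 2), round(th, 2))
```

With β > 1 the event rate grows over time (a deteriorating system); β < 1 gives an improving one, which is why the power-law model is popular in repairable-systems and reliability work.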
Encryption of data means translating it into another form or symbol so that only people with access to the secret key or password can read it. Encrypted data is generally referred to as ciphertext, while unencrypted data is known as plaintext. Entropy can be used as a measure of the number of bits needed to code the data of an image: as the pixel values of an image are dispersed across more gray levels, the entropy increases. The aim of this research is to compare the CAST-128 method with a proposed adaptive key against the RSA encryption method for video frames, to determine the more accurate method with the highest entropy. The first method is achieved by applying the "CAST-128" and
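The entropy measure described here is the Shannon entropy of the gray-level histogram, H = −Σ pᵢ log₂ pᵢ, which reaches its maximum of 8 bits/pixel when all 256 gray levels are equally likely. A minimal sketch (the pixel data below is illustrative, not taken from the study):

```python
import math
from collections import Counter

def shannon_entropy(pixels):
    """Shannon entropy in bits/pixel from the gray-level histogram:
    H = -sum(p_i * log2(p_i)) over the observed gray levels."""
    counts = Counter(pixels)
    n = len(pixels)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

flat = [0] * 64               # a single gray level -> 0 bits/pixel
uniform = list(range(256))    # all 256 levels equally likely -> 8 bits/pixel
print(shannon_entropy(flat), shannon_entropy(uniform))
```

A well-encrypted frame should push the histogram toward the uniform case, so higher entropy is read as better diffusion of the cipher.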
In this paper, the double Sumudu and double Elzaki transform methods are used to compute numerical solutions for some types of fractional-order partial differential equations with constant coefficients. The efficiency of the method is illustrated through numerical examples computed in Mathcad 15 and graphed in Matlab R2015a.
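For reference, the double Sumudu and double Elzaki transforms of a function f(x, t) defined for x, t ≥ 0 are conventionally written as follows (these are the standard definitions from the literature; the paper's fractional-order treatment builds on them):

```latex
S_2\big[f(x,t)\big] = F(u,v)
  = \frac{1}{uv}\int_0^{\infty}\!\!\int_0^{\infty}
    e^{-\left(\frac{x}{u}+\frac{t}{v}\right)} f(x,t)\,dx\,dt ,
\qquad
E_2\big[f(x,t)\big] = T(u,v)
  = uv\int_0^{\infty}\!\!\int_0^{\infty}
    e^{-\left(\frac{x}{u}+\frac{t}{v}\right)} f(x,t)\,dx\,dt .
```

The two transforms share the same exponential kernel and differ only in the scaling prefactor (1/uv versus uv), which is why they handle the same classes of constant-coefficient problems.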
Abstract:
This research aims to identify the impact of the layout of Ghazi al-Hariri Hospital for Specialized Surgery on customer (patient) satisfaction using the Servicescape model. The research problem concerns the extent to which hospital management designs the service and the hospital layout, in both its aesthetic and functional aspects, to suit patients receiving therapeutic and nursing services. The scale developed by (Miles et al., 2012) was used for data collection; it covers the independent variable in (17) items distributed across three dimensions (facility aesthetics, hospital cleanliness, and layout accessibility). The dependent variable is the satisfaction of customers (pat
In this work, the shock-wave radius of a plasma plume (R) and the plasma speed (U) have been calculated theoretically using a Matlab program.
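The abstract does not state which model was used. A common theoretical choice for shock expansion of a laser-produced plasma plume is the Sedov-Taylor blast-wave scaling, R(t) = ξ(E t²/ρ₀)^(1/5) with U = dR/dt = (2/5)R/t; the sketch below assumes that scaling (in Python rather than Matlab, with purely illustrative values):

```python
def sedov_radius(E, rho0, t, xi=1.0):
    """Sedov-Taylor blast-wave radius R(t) = xi * (E * t**2 / rho0)**(1/5),
    in SI units (E in J, rho0 in kg/m^3, t in s)."""
    return xi * (E * t * t / rho0) ** 0.2

def shock_speed(E, rho0, t, xi=1.0):
    """Shock speed U = dR/dt = (2/5) * R / t under the same scaling."""
    return 0.4 * sedov_radius(E, rho0, t, xi) / t

# illustrative values only: 1 J deposited in air, 1 microsecond after the pulse
E, rho0, t = 1.0, 1.2, 1e-6
R = sedov_radius(E, rho0, t)
U = shock_speed(E, rho0, t)
print(R, U)
```

Note that R grows as t^(2/5), so the shock decelerates: U falls off as t^(-3/5).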
The undetected error probability is an important measure for assessing the communication reliability provided by any error-coding scheme. Two error-coding schemes, Joint crosstalk avoidance and Triple Error Correction (JTEC) and JTEC with Simultaneous Quadruple Error Detection (JTEC-SQED), provide both crosstalk reduction and multi-bit error correction/detection. The available undetected-error-probability model yields an upper-bound value that does not give an accurate estimate of the reliability provided. This paper presents an improved mathematical model to estimate the undetected error probability of these two joint coding schemes. According to the decoding algorithm, the errors are classified into patterns and their decoding
The study measured flow rates and the interaction between the settlements served by applying the gravity model, which estimates interaction from population counts, between the city of Najaf and the other served settlements, using three impedance functions: distance, time, and cost. An increase in the interaction index was recorded with some settlements, such as Kufa, Abbasid, and Manathira, while the indicator varied for other settlements. When the gravity model was applied using trips and socio-economic characteristics, the accuracy rate was more pronounced.
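The gravity model referred to takes the generic form T_ij = k · P_i · P_j / d_ij^b, where d_ij can be expressed in distance, time, or cost and b is a calibrated impedance exponent. A minimal sketch with hypothetical figures (the populations and distances below are not the study's data):

```python
def gravity_interaction(p_i, p_j, d_ij, k=1.0, b=2.0):
    """Gravity-model interaction: T_ij = k * P_i * P_j / d_ij**b."""
    return k * p_i * p_j / d_ij ** b

# hypothetical (population, distance-in-km) pairs relative to Najaf
najaf = 1_000_000
neighbours = {"Kufa": (110_000, 10.0), "Manathira": (40_000, 25.0)}
for name, (pop, d) in neighbours.items():
    print(name, gravity_interaction(najaf, pop, d))
```

Because interaction falls off as d^(-b), nearby settlements such as Kufa dominate the index even when their populations are comparable to more distant ones, which matches the pattern the study reports.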