In this paper, Monte Carlo simulation was used to compare the robust circular S estimator with the circular least squares method, both when the data contain no outliers and when outliers are present. Two contamination schemes were considered: contamination at high leverage points, representing contamination in the circular independent variable, and vertical contamination, representing contamination in the circular dependent variable. Three comparison criteria were used: the median standard error (Median SE), the median of the mean squared errors (Median MSE), and the median of the mean cosines of the circular residuals (Median A(k)). It was concluded that the least squares method outperforms the robust circular S method when the data contain no outliers, since it recorded the lowest Median MSE, the lowest Median SE, and the largest Median A(k) for all proposed sample sizes (n = 20, 50, 100). In the case of vertical contamination, the circular least squares method was not preferred at any contamination rate or for any sample size, and the higher the contamination rate in the vertical data, the greater the advantage of the robust estimation method: Median MSE and Median SE decrease while Median A(k) increases for all proposed sample sizes.
In the case of contamination at high leverage points, the circular least squares method was again not preferred, by a large margin, at all contamination levels and for all sample sizes, and the higher the contamination rate at the leverage points, the greater the advantage of the robust estimation method: Median MSE and Median SE decrease while Median A(k) increases for all sample sizes.
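The Median A(k) criterion used above can be illustrated with a small simulation. The following is a minimal sketch, not the paper's actual experiment: it fits a simple circular location parameter (the resultant-vector direction, a least-squares-type fit) to von Mises data, contaminates a fraction of observations with uniform "vertical" noise, and reports the median across replicates of the mean cosine of the circular residuals. The distribution, sample size, contamination rate, and seed are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

def circ_mean(theta):
    # direction of the resultant vector: a least-squares-type circular location fit
    return np.arctan2(np.sin(theta).mean(), np.cos(theta).mean())

def A_k(resid):
    # mean cosine of circular residuals; values near 1 indicate a tight fit
    return np.cos(resid).mean()

def simulate(eps, M=300, n=50, mu=1.0, kappa=5.0):
    crits = []
    for _ in range(M):
        theta = rng.vonmises(mu, kappa, size=n)                   # clean circular sample
        bad = rng.random(n) < eps
        theta[bad] = rng.uniform(-np.pi, np.pi, size=bad.sum())   # vertical contamination
        resid = np.angle(np.exp(1j * (theta - circ_mean(theta)))) # wrap to (-pi, pi]
        crits.append(A_k(resid))
    return np.median(crits)   # Median A(k) across Monte Carlo replicates

print("Median A(k), clean          :", round(simulate(0.0), 3))
print("Median A(k), 20% contaminant:", round(simulate(0.20), 3))
```

As expected, contamination drives the Median A(k) of the fit down, which is the direction of preference the criterion encodes (larger is better).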
On the Internet nothing is secure, and since we need means of protecting our data, the use of passwords has become essential in the electronic world. To prevent hacking and to protect databases that contain important information, such as ID cards and banking information, the proposed system stores the username after hashing it with the SHA-256 algorithm, and strong passwords are stored so as to repel attackers using one of two methods:
- The first method adds a random salt, generated with a CSPRNG (cryptographically secure pseudorandom number generator), to the password, then hashes the result with SHA-256 and stores it on the website.
- The second method uses the PBKDF2 algorithm, which salts the passwords and stretches them (deriving a key from the password) before they are hashed.
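The two storage schemes described above can be sketched in Python with the standard library. This is a minimal illustration, not the proposed system's implementation: the function names, 16-byte salt length, and iteration count are assumptions for the example.

```python
import hashlib
import secrets

def hash_with_salt(password: str) -> tuple[bytes, bytes]:
    # Method 1: CSPRNG salt prepended to the password, one SHA-256 pass
    salt = secrets.token_bytes(16)        # cryptographically secure random salt
    digest = hashlib.sha256(salt + password.encode()).digest()
    return salt, digest                   # both are stored; salt is not secret

def hash_with_pbkdf2(password: str, iterations: int = 100_000) -> tuple[bytes, bytes]:
    # Method 2: PBKDF2 both salts and stretches (key-derives) the password
    salt = secrets.token_bytes(16)
    key = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, key

salt1, d1 = hash_with_salt("correct horse battery staple")
salt2, d2 = hash_with_pbkdf2("correct horse battery staple")
print(len(d1), len(d2))   # both are 32-byte (256-bit) digests
```

Verification at login repeats the same computation with the stored salt and compares digests; PBKDF2's iteration count is what makes brute-force attacks expensive relative to a single hash pass.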
With the rapid development of computer and network technologies, the security of information on the Internet is compromised, and many threats may affect the integrity of such information. Many researchers have focused their work on providing solutions to this threat. Machine learning and data mining are widely used in anomaly detection schemes to decide whether or not malicious activity is taking place on a network. In this paper a hierarchical classification scheme for an anomaly-based intrusion detection system is proposed, using two levels of feature selection and classification. In the first level, a global feature vector for detecting the basic attacks (DoS, U2R, R2L and Probe) is selected. In the second level, four local feature vectors …
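The two-level structure described above can be sketched as follows. This is a toy illustration, not the paper's system: the feature index sets, the nearest-centroid rule, and the attack subtype names are all hypothetical stand-ins for the selected feature vectors and trained classifiers.

```python
import numpy as np

# Hypothetical feature index sets (stand-ins for the selected global/local vectors)
GLOBAL_FEATURES = [0, 1, 2, 3]
LOCAL_FEATURES = {"DoS": [0, 4], "Probe": [1, 5], "R2L": [2, 6], "U2R": [3, 7]}

def nearest_centroid(x, centroids):
    # toy stand-in for a trained classifier at either level
    labels = list(centroids)
    return labels[int(np.argmin([np.linalg.norm(x - centroids[k]) for k in labels]))]

def classify(record, level1, level2):
    # Level 1: coarse decision (normal vs. a basic attack class) on global features
    coarse = nearest_centroid(record[GLOBAL_FEATURES], level1)
    if coarse == "normal":
        return "normal"
    # Level 2: refine within the attack family using its own local feature subset
    return nearest_centroid(record[LOCAL_FEATURES[coarse]], level2[coarse])

# Tiny worked example with made-up centroids
level1 = {"normal": np.zeros(4), "DoS": np.full(4, 5.0)}
level2 = {"DoS": {"smurf": np.array([5.0, 5.0]), "neptune": np.array([0.0, 0.0])}}
record = np.array([5, 5, 5, 5, 5, 0, 0, 0], dtype=float)
print(classify(record, level1, level2))   # DoS family, local features [5, 5] -> "smurf"
```

The point of the hierarchy is that each level-2 classifier sees only the small feature subset that discriminates well within its own attack family, rather than the full global vector.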
This article provides a bibliometric analysis of intellectual capital research published in the Scopus database from 1956 to 2020, tracing the development of scientific activity in order to pave the way for future studies by shedding light on gaps in the field. The analysis covers 638 intellectual capital-related papers published in the Scopus database over 60 years, drawing on a bibliometric analysis using VOSviewer. The paper highlights the mainstream of current research in the intellectual capital field, based on the Scopus database, by presenting a detailed bibliometric analysis of the trend and development of intellectual capital research over the past six decades, including journals, authors, countries, institutions …
In this research, the performance of two kinds of membranes was examined for recovering the nutrients (protein and lactose) from the whey produced by the soft cheese industry at the General Company for Food Products in Abo-ghraab. The whey was treated in two stages. The first stage pressed the whey through a plate-type filter made of polyvinylidene difluoride (PVDF) with a standard 800 kilodalton cut-off; this membrane separates the whey into a permeate containing the main nutrients while removing the fat and microorganisms. The second stage isolated the protein using plate-type ultrafilters made of polyethersulfone (PES) with cut-offs of 10 and 60 kilodalton, recovering the lactose in the permeate.
The results showed that the percentage …
This study focused on a fundamental issue: the ability of the Central Bank of Iraq to face the difficulty of determining the optimal liquidity ratio in Iraqi banks, in terms of balancing their obligations to depositors and borrowers and the liquidation of their funds on one hand against the risks on the other. The research aimed to identify the extent to which Iraqi banks can apply the regulations, rules and instructions issued by the Central Bank of Iraq in determining a liquidity ratio appropriate to Iraqi banks' activity while achieving a reasonable profit …
Gray-Scale Image Brightness/Contrast Enhancement with Multi-Model Histogram Linear Contrast Stretching (MMHLCS) Method
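Only the method's title is given here, so as a hedged illustration, the following sketches the standard linear contrast stretching operation that histogram-stretching methods of this kind build on; the multi-model histogram partitioning specific to MMHLCS is not shown, and the function name and test values are assumptions.

```python
import numpy as np

def linear_contrast_stretch(img, lo=0, hi=255):
    # classic linear stretch: map the observed range [img.min(), img.max()] onto [lo, hi]
    imin, imax = img.min(), img.max()
    if imax == imin:
        return np.full_like(img, lo)   # flat image: nothing to stretch
    out = (img.astype(float) - imin) * (hi - lo) / (imax - imin) + lo
    return out.round().astype(np.uint8)

gray = np.array([[50, 60], [70, 80]], dtype=np.uint8)
print(linear_contrast_stretch(gray))   # 50..80 is spread over the full 0..255 range
```

A multi-model variant would typically split the histogram into regions (e.g. around modes) and apply a separate linear stretch per region, rather than one global mapping as above.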