With the rapid expansion of the Internet, worldwide information growth has increased the application of communication technology, and the growth in data volume drives the need for secure, robust, and reliable techniques built on effective algorithms. Many algorithms and techniques are available for data security. This paper presents a cryptosystem that combines two substitution cipher algorithms with a circular queue data structure. The two substitution techniques, the Homophonic Substitution Cipher and the Polyalphabetic Substitution Cipher, are merged in a single circular queue with four different keys for each of them, producing eight different outputs for every incoming letter. The present work can be applied efficiently to personal information security as well as network communication security, and the time required for enciphering and deciphering a message is less than 0.1 s.
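The key-rotation idea can be sketched as follows. This is an illustrative reconstruction, not the paper's exact scheme: it shows only the polyalphabetic half driven by a circular queue of four shift keys (the homophonic half, which maps each letter to one of several symbols, is omitted for brevity), and the key values are assumed for the example.

```python
from collections import deque
import string

ALPHABET = string.ascii_uppercase

def polyalphabetic_encrypt(plaintext, keys):
    """Vigenere-style substitution; the key for each letter is drawn from a circular queue."""
    queue = deque(keys)              # circular queue of shift keys
    out = []
    for ch in plaintext.upper():
        if ch not in ALPHABET:
            out.append(ch)
            continue
        k = queue[0]
        queue.rotate(-1)             # advance the circular queue
        out.append(ALPHABET[(ALPHABET.index(ch) + k) % 26])
    return "".join(out)

def polyalphabetic_decrypt(ciphertext, keys):
    queue = deque(keys)
    out = []
    for ch in ciphertext.upper():
        if ch not in ALPHABET:
            out.append(ch)
            continue
        k = queue[0]
        queue.rotate(-1)
        out.append(ALPHABET[(ALPHABET.index(ch) - k) % 26])
    return "".join(out)

msg = "SECURE CHANNEL"
keys = [3, 7, 11, 19]                # four keys, cycled by the queue
ct = polyalphabetic_encrypt(msg, keys)
assert polyalphabetic_decrypt(ct, keys) == msg
```

Because the queue rotates on every letter, repeated plaintext letters map to different ciphertext letters, which is the property the combined scheme exploits.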
In this paper, the Monte Carlo simulation method is used to compare the robust circular S estimator with the circular least squares method, both in the absence of outliers and in their presence. Contamination is introduced in two ways: through high-leverage points, representing contamination of the circular independent variable, and through contamination of the vertical variable, representing the circular dependent variable. Three comparison criteria are used: the median standard error (Median SE), the median of the mean squared error (Median MSE), and the median of the mean cosine of the circular residuals (Median A(k)). It was concluded that the method of least squares is better than the
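The experimental design can be illustrated with a toy Monte Carlo loop. This is a hedged sketch only: the paper's circular S estimator is more involved, so the loop below fits ordinary least squares to angular data with one injected outlier in the dependent variable and computes the Median MSE and Median A(k) criteria (A(k) being the mean cosine of the circular residuals); all sample sizes and noise levels are assumptions for the example.

```python
import math, random, statistics

random.seed(1)

def ls_fit(x, y):
    """Ordinary least squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

def criteria(residuals):
    mse = sum(r * r for r in residuals) / len(residuals)
    a_k = sum(math.cos(r) for r in residuals) / len(residuals)  # mean cosine
    return mse, a_k

mses, aks = [], []
for _ in range(200):                                  # Monte Carlo replications
    x = [random.uniform(0, 2 * math.pi) for _ in range(30)]
    y = [0.5 * xi + 0.3 + random.gauss(0, 0.1) for xi in x]
    y[0] += math.pi                                   # contaminate the dependent variable
    a, b = ls_fit(x, y)
    # wrap residuals to (-pi, pi], treating them as circular
    res = [math.atan2(math.sin(yi - (a + b * xi)), math.cos(yi - (a + b * xi)))
           for xi, yi in zip(x, y)]
    mse, ak = criteria(res)
    mses.append(mse)
    aks.append(ak)

print("Median MSE:", statistics.median(mses))
print("Median A(k):", statistics.median(aks))
```

An A(k) close to 1 indicates residuals concentrated near zero; the outlier pulls the MSE up while barely moving the median-based criteria, which is why medians across replications are used.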
Big data usually runs in large-scale, centralized key management systems. However, centralized key management increases problems such as single point of failure, exchange of a secret key over insecure channels, third-party queries, and the key escrow problem. To avoid these problems, we propose an improved certificate-based encryption scheme that ensures data confidentiality by combining symmetric and asymmetric cryptography. The combination can be implemented using the Advanced Encryption Standard (AES) and Elliptic Curve Diffie-Hellman (ECDH). The proposed scheme is an enhanced version of the Certificate-Based Encryption (CBE) scheme and preserves all its advantages. However
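The hybrid pattern (asymmetric key agreement feeding a symmetric cipher) can be sketched without external dependencies. This is a conceptual stand-in, not the proposed scheme: classic modular Diffie-Hellman replaces ECDH and a SHA-256 keystream replaces AES so the example stays stdlib-only; a real deployment would use ECDH and AES from a vetted library, and the toy modulus below is far too small for security.

```python
import hashlib, secrets

P = 0xFFFFFFFB   # toy 32-bit prime modulus -- illustration only, NOT secure
G = 5

def dh_keypair():
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def shared_secret(priv, other_pub):
    return pow(other_pub, priv, P)

def keystream_xor(key_int: int, data: bytes) -> bytes:
    """Derive a keystream from the shared secret and XOR it with the data."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(
            key_int.to_bytes(8, "big") + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

# both parties derive the same secret without sending it over the channel
a_priv, a_pub = dh_keypair()
b_priv, b_pub = dh_keypair()
assert shared_secret(a_priv, b_pub) == shared_secret(b_priv, a_pub)

msg = b"big data record"
ct = keystream_xor(shared_secret(a_priv, b_pub), msg)   # sender encrypts
pt = keystream_xor(shared_secret(b_priv, a_pub), ct)    # receiver decrypts
assert pt == msg
```

The design point is that no trusted central key server holds the secret: each pair of parties derives it independently, which is what removes the single point of failure and key escrow problems mentioned above.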
The current research aims to determine the impact of the components of the financing structure, especially debt financing, as well as earnings per share, on the value of the shares of companies listed on the Iraq Stock Exchange; to identify, for the research sample, the strength of the combined effect of the debt-financing ratio and earnings per share in maximizing the market value and the real value of the firm; and to examine the variation between these relationships according to the real-value model and the market value of the research sample companies. The research population is represented by the Iraq Stock Exchange, while a conditional deliberate sample
The Internet provides vital communication between millions of individuals. It is also increasingly used as a commerce tool; thus, security is of high importance for securing communications and protecting vital information. Cryptographic algorithms are essential in the field of security. Brute-force attacks are the major attacks on the Data Encryption Standard (DES), which is the main reason an improved structure of the DES algorithm is needed. This paper proposes a new, improved structure for DES to make it secure and immune to attacks. The improved structure was accomplished using standard DES with a new way of two-key generation
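The Feistel structure that underlies DES, and on which such improvements build, can be sketched as follows. This is a hedged illustration only: the paper's improved round function and two-key generation are not reproduced here, and a hash-based round function stands in for DES's S-box-based one so the XOR-and-swap skeleton can be shown end to end.

```python
import hashlib

def round_fn(half: bytes, subkey: bytes) -> bytes:
    """Stand-in round function; DES uses expansion, S-boxes, and permutation here."""
    return hashlib.sha256(half + subkey).digest()[:len(half)]

def feistel_encrypt(block: bytes, subkeys):
    left, right = block[:4], block[4:]
    for k in subkeys:
        left, right = right, bytes(a ^ b for a, b in zip(left, round_fn(right, k)))
    return right + left              # final swap, as in DES

def feistel_decrypt(block: bytes, subkeys):
    # the same network run with the subkeys reversed inverts the cipher
    return feistel_encrypt(block, list(reversed(subkeys)))

subkeys = [bytes([i] * 4) for i in range(16)]   # 16 rounds, like DES
pt = b"8bytes!!"                                 # DES operates on 64-bit blocks
ct = feistel_encrypt(pt, subkeys)
assert feistel_decrypt(ct, subkeys) == pt
```

A useful property of this construction is that decryption reuses the encryption network with the subkey order reversed, so any strengthened round function or key schedule (as proposed in the paper) inherits invertibility for free.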
The current study aims to compare estimates of the Rasch model's parameters for missing and complete data under various ways of processing the missing data. To achieve this aim, the researcher followed these steps: preparing the Philip Carter test of spatial ability, which consists of (20) items, for a group of (250) sixth scientific stage students in the Baghdad Education Directorates of Al-Rusafa (1st, 2nd, and 3rd) for the academic year (2018-2019). The researcher then relied on a one-parameter model to analyze the data and used the Bilog-MG3 program to check the hypotheses and data and match them with the model. In addition
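For readers unfamiliar with the one-parameter model, the Rasch probability of a correct response can be stated in a few lines. The ability and difficulty values below are illustrative, not estimates from the study.

```python
import math

def rasch_p(theta: float, b: float) -> float:
    """Rasch (1PL) model: probability that a person with ability theta
    answers an item of difficulty b correctly."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# a person of average ability facing an average-difficulty item
assert abs(rasch_p(0.0, 0.0) - 0.5) < 1e-12

# higher ability raises the probability; higher difficulty lowers it
assert rasch_p(1.0, 0.0) > rasch_p(0.0, 0.0) > rasch_p(0.0, 1.0)
```

Missing-data treatments matter precisely because item and person parameters are estimated jointly from this response function, so omitted responses shift both sets of estimates.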
... Show MoreIs in this research review of the way minimum absolute deviations values based on linear programming method to estimate the parameters of simple linear regression model and give an overview of this model. We were modeling method deviations of the absolute values proposed using a scale of dispersion and composition of a simple linear regression model based on the proposed measure. Object of the work is to find the capabilities of not affected by abnormal values by using numerical method and at the lowest possible recurrence.
Lattakia city faces many problems related to the mismanagement of solid waste, as disposal is limited to the uncontrolled Al-Bassa landfill without treatment. Solid waste management therefore poses a special challenge to decision-makers in choosing an appropriate tool that supports strategic decisions on municipal solid waste treatment methods and on evaluating their management systems. As humans are primarily responsible for the generation of waste, this study aims to measure the degree of environmental awareness in the Lattakia Governorate from the point of view of the research sample members and to discuss the effect of the studied variables (place of residence, educational level, gender, age, and professional status) on that awareness
Big data analysis is essential for modern applications in areas such as healthcare, assistive technology, intelligent transportation, and environment and climate monitoring. Traditional data mining and machine learning algorithms do not scale well with data size. Mining and learning from big data require time- and memory-efficient techniques, albeit at the cost of some loss in accuracy. We have developed a data aggregation structure to summarize data with a large number of instances and data generated from multiple sources. Data are aggregated at multiple resolutions, and the resolution provides a trade-off between efficiency and accuracy. The structure is built once, updated incrementally, and serves as a common data input for multiple mining and
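One way such a structure can work is sketched below. This is a hedged illustration, not the paper's design: each resolution bins the value range at a different width and keeps only sufficient statistics (count, sum, sum of squares) per bin, so the summary is updated incrementally in one pass and coarser resolutions trade accuracy for memory; the bin widths and data are assumed for the example.

```python
import math

class MultiResAggregate:
    """Incremental summaries of a stream at several resolutions."""

    def __init__(self, widths):
        # width -> {bin index: [count, sum, sum of squares]}
        self.levels = {w: {} for w in widths}

    def insert(self, value):
        """Incremental update: touch one bin per resolution, O(#levels)."""
        for w, bins in self.levels.items():
            b = math.floor(value / w)
            stats = bins.setdefault(b, [0, 0.0, 0.0])
            stats[0] += 1
            stats[1] += value
            stats[2] += value * value

    def mean(self, width):
        """Exact mean recovered from any level's sufficient statistics."""
        bins = self.levels[width]
        n = sum(s[0] for s in bins.values())
        return sum(s[1] for s in bins.values()) / n

agg = MultiResAggregate(widths=[0.5, 2.0, 8.0])      # fine to coarse
for v in [0.2, 1.7, 3.1, 4.4, 7.9]:
    agg.insert(v)
print(len(agg.levels[0.5]), len(agg.levels[8.0]))    # -> 5 1 : finer level, more bins
```

A mining algorithm can then read whichever level fits its time and memory budget, which is the efficiency/accuracy trade-off described above.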
In this paper, we use a circular-shaped sliding block for image edge determination. Circular blocks have symmetric properties in all directions for the mask points around the central mask point. The introduced method is therefore effective for detecting image edges in all directions, including curved edges and lines. The results exhibit very good performance in detecting image edges compared with the results of other edge detectors.
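The sliding circular block can be sketched as follows. This is an illustrative reconstruction, not the paper's exact detector: it builds the set of pixel offsets inside a circle of radius r (symmetric in all directions around the center), slides it over the image, and marks a pixel as an edge when the intensity range inside the block exceeds a threshold; the decision rule and threshold are assumptions for the example.

```python
def circular_offsets(r):
    """Offsets of all pixels inside a circle of radius r around the center."""
    return [(dy, dx) for dy in range(-r, r + 1) for dx in range(-r, r + 1)
            if dy * dy + dx * dx <= r * r]

def detect_edges(img, r=1, threshold=50):
    """Mark a pixel as an edge when the block's intensity range is large."""
    h, w = len(img), len(img[0])
    offs = circular_offsets(r)
    edges = [[0] * w for _ in range(h)]
    for y in range(r, h - r):
        for x in range(r, w - r):
            vals = [img[y + dy][x + dx] for dy, dx in offs]
            if max(vals) - min(vals) > threshold:
                edges[y][x] = 1
    return edges

# toy image: dark left half, bright right half -> a vertical edge in the middle
img = [[0] * 4 + [255] * 4 for _ in range(6)]
edges = detect_edges(img)
print(sum(map(sum, edges)))   # -> 8 edge pixels along the boundary
```

Because the circular mask has no preferred axis, the same response is obtained whatever the edge's orientation, which is the symmetry advantage over rectangular masks noted above.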
This research aims to examine the reality of the imbalance in the trade balance structure in order to improve it, and to determine the size of the imbalance resulting from dependence on a single commodity, crude oil, in the structure of exports, versus the diversity of the structure of imports across various goods.
To achieve that goal, a deductive approach was adopted, moving from general theory to specific applications.
Through the research we reached a number of conclusions, most notably the effectiveness of public spending in correcting the imbalance of the trade balance structure during the study period