Modern systems based on hash functions are more suitable than conventional systems; however, the complicated algorithms used to generate invertible functions are highly time-consuming. With the use of genetic algorithms (GAs), the key strength is enhanced, which ultimately makes the entire algorithm more efficient. First, key generation is performed using the result of the n-queens problem, solved by a genetic algorithm with a random number generator and the application of GA operations. Finally, the data is encrypted using the Modified Reverse Encryption Algorithm (MREA). The suggested algorithm was observed to provide better results concerning key and security strength; however, it has lower computational efficiency than the other algorithms.
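The key-generation step described above can be illustrated with a minimal sketch: a permutation-encoded GA solves the n-queens problem, and the resulting board is hashed into key material. The GA operators, population sizes, and the SHA-256 key derivation are illustrative assumptions, not the paper's exact construction.

```python
import random, hashlib

def conflicts(board):
    # Count diagonal conflicts; permutation encoding already rules out
    # row and column clashes.
    n = len(board)
    return sum(1 for i in range(n) for j in range(i + 1, n)
               if abs(board[i] - board[j]) == abs(i - j))

def solve_n_queens_ga(n=8, pop_size=100, generations=500, seed=42):
    rng = random.Random(seed)
    pop = [rng.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=conflicts)
        if conflicts(pop[0]) == 0:
            return pop[0]
        next_pop = pop[:10]                      # elitism
        while len(next_pop) < pop_size:
            a, b = rng.sample(pop[:50], 2)       # tournament-style parents
            cut = rng.randrange(1, n)            # order-preserving crossover
            child = a[:cut] + [g for g in b if g not in a[:cut]]
            if rng.random() < 0.3:               # swap mutation
                i, j = rng.sample(range(n), 2)
                child[i], child[j] = child[j], child[i]
            next_pop.append(child)
        pop = next_pop
    pop.sort(key=conflicts)
    return pop[0]

solution = solve_n_queens_ga()
# Hypothetical key derivation: hash the solved board into a 256-bit key.
key = hashlib.sha256(bytes(solution)).digest()
```

Because the board is a permutation, crossover always yields a valid candidate, so fitness only has to penalize diagonal attacks.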
An immuno-haematological genetic marker study was carried out to understand genetic background variations among the indigenous population of Kirkuk (Iraq). A cross-sectional study of 179 patients with thalassemia major was conducted in Kirkuk. A detailed review was undertaken to define the relationships between ethnic origin, phenotype, and the uniformity of immuno-genetic markers in relation to genetic isolation and interethnic admixture. All 179 thalassemia major patients were analysed at the hereditary blood diseases centre; 18 (10.05%) were offspring of intermarriages between different ethnic groups, whereas the overall consanguineous marriage rate was estimated at 161 (89.9%), including 63 (35.1%) for first cousi…
The experimental proton resonance data for the reaction p + 48Ti have been used to calculate and evaluate the level density by employing the Gaussian Orthogonal Ensemble (GOE) version of Random Matrix Theory (RMT), the Constant Temperature (CT) model, and the Back-Shifted Fermi Gas (BSFG) model at certain spin-parities and at different proton energies. The results of the GOE model are found to agree with the others, while the level density calculated using the BSFG model showed lower values, with a stronger dependence on spin than on parity, due to the limitations of its parameters (the level density parameter a, the energy shift parameter E1, and the spin cut-off parameter σc). Also, in the CT model the level density results depend mainly on two parameters (T and the ground-state back-shift energy E0), which are app…
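For reference, the two phenomenological models named above are usually written as follows (standard textbook forms with the abstract's symbols, not equations taken from this paper):

```latex
\rho_{\mathrm{CT}}(E) = \frac{1}{T}\,\exp\!\left(\frac{E - E_0}{T}\right),
\qquad
\rho_{\mathrm{BSFG}}(E) = \frac{\exp\!\left(2\sqrt{a\,(E - E_1)}\right)}
{12\sqrt{2}\,\sigma_c\, a^{1/4}\,(E - E_1)^{5/4}}
```

This makes the parameter dependence explicit: the CT density is fixed by (T, E0), while the BSFG density depends on (a, E1, σc).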
This paper introduces the Multistep Modified Reduced Differential Transform Method (MMRDTM), which is applied to approximate the solutions of Nonlinear Schrödinger Equations (NLSEs) with power-law nonlinearity. The proposed method has several advantages: an analytical approximation can be generated as a fast-converging series, and the number of computed terms is significantly reduced. Compared with the RDTM, the nonlinear term in this method is replaced by the related Adomian polynomials prior to the implementation of the multistep approach. As a consequence, fewer computed NLSE terms are required in the resulting approximation. Moreover, the approximation also converges rapidly over a…
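The Adomian polynomials A_n for a nonlinearity N(u) are the Taylor coefficients of N(Σ_k u_k λ^k) in λ. For a power nonlinearity N(u) = u^p this reduces to truncated series multiplication, which the sketch below demonstrates with illustrative numeric modes (the example data and function names are assumptions, not from the paper):

```python
def poly_mul(p, q, order):
    # Truncated product of two power series given as coefficient lists.
    out = [0.0] * (order + 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            if i + j <= order:
                out[i + j] += a * b
    return out

def adomian_power(u, p, order):
    """Adomian polynomials A_0..A_order for N(u) = u**p:
    A_n is the coefficient of lambda**n in (sum_k u_k lambda**k)**p."""
    series = [1.0] + [0.0] * order
    base = list(u[:order + 1])
    for _ in range(p):
        series = poly_mul(series, base, order)
    return series

# N(u) = u**2 with modes u0=1, u1=2, u2=3:
# A0 = u0^2 = 1, A1 = 2*u0*u1 = 4, A2 = u1^2 + 2*u0*u2 = 10
print(adomian_power([1.0, 2.0, 3.0], 2, 2))  # [1.0, 4.0, 10.0]
```

In the MMRDTM each A_n replaces the nonlinear term at recursion step n, so only the modes u_0..u_n are ever needed.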
Cloud computing is a pay-as-you-go model that provides users with on-demand access to services and computing resources. It is a challenging issue to maximize the service provider's profit while, on the other hand, meeting users' Quality of Service (QoS) requirements. Therefore, this paper proposes an admission control heuristic (ACH) approach that selects or rejects requests based on the budget, deadline, and penalty cost given by the user. A service level agreement (SLA) is then created for each selected request. The proposed work uses Particle Swarm Optimization (PSO) and the Salp Swarm Algorithm (SSA) to schedule the selected requests under budget and deadline constraints. Performances of PSO and SSA with and witho…
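An admission-control filter of the kind described can be sketched as follows. The specific rule (accept only when the estimated cost fits the budget, the estimated runtime meets the deadline, and the expected penalty still leaves a positive profit) is an illustrative assumption, not the paper's exact heuristic:

```python
from dataclasses import dataclass

@dataclass
class Request:
    budget: float      # maximum the user will pay
    deadline: float    # latest acceptable finish time (s)
    penalty: float     # cost to the provider per SLA violation

def admit(req, est_cost, est_time, risk=0.1):
    # Hypothetical rule: admit only if cost, deadline, and
    # risk-weighted expected profit all check out.
    profit = req.budget - est_cost - risk * req.penalty
    return est_cost <= req.budget and est_time <= req.deadline and profit > 0

requests = [Request(10.0, 60.0, 5.0), Request(2.0, 30.0, 1.0)]
admitted = [r for r in requests if admit(r, est_cost=3.0, est_time=40.0)]
```

Each admitted request would then get an SLA, and the PSO/SSA scheduler would place it subject to the same budget and deadline constraints.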
Nonlinear regression models are important tools for solving optimization problems, as traditional techniques often fail to reach satisfactory solutions to the parameter estimation problem. Hence, in this paper, the BAT algorithm is used to estimate the parameters of nonlinear regression models. A simulation study is carried out to investigate the performance of the proposed algorithm against the maximum likelihood (MLE) and least squares (LS) methods. The results show that, in terms of mean squared error, the Bat algorithm provides more accurate estimates and is more satisfactory for the parameter estimation of nonlinear regression models than the MLE and LS methods.
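A minimal bat-algorithm sketch for a nonlinear regression fit might look like the following, here minimizing the sum of squared errors for an exponential model y = a·exp(b·x). The model, bounds, and algorithm constants are illustrative assumptions, not the paper's setup:

```python
import random, math

def sse(params, xs, ys):
    a, b = params
    return sum((y - a * math.exp(b * x)) ** 2 for x, y in zip(xs, ys))

def bat_fit(xs, ys, n_bats=30, iters=300, seed=1):
    """Bat-algorithm sketch: frequency-tuned velocities plus a local
    random walk around the current best solution."""
    rng = random.Random(seed)
    lo, hi = [-5.0, -2.0], [5.0, 2.0]          # illustrative search box
    pos = [[rng.uniform(lo[d], hi[d]) for d in range(2)] for _ in range(n_bats)]
    vel = [[0.0, 0.0] for _ in range(n_bats)]
    best = min(pos, key=lambda p: sse(p, xs, ys))[:]
    A, r = 0.9, 0.5                            # loudness, pulse rate
    for _ in range(iters):
        for i in range(n_bats):
            f = rng.uniform(0.0, 2.0)          # frequency
            cand = []
            for d in range(2):
                vel[i][d] += (pos[i][d] - best[d]) * f
                cand.append(min(hi[d], max(lo[d], pos[i][d] + vel[i][d])))
            if rng.random() > r:               # local walk near the best bat
                cand = [min(hi[d], max(lo[d], best[d] + 0.01 * rng.gauss(0, 1)))
                        for d in range(2)]
            if sse(cand, xs, ys) < sse(pos[i], xs, ys) and rng.random() < A:
                pos[i] = cand
            if sse(pos[i], xs, ys) < sse(best, xs, ys):
                best = pos[i][:]
    return best

xs = [0.0, 0.5, 1.0, 1.5, 2.0]
ys = [2.0 * math.exp(0.8 * x) for x in xs]     # synthetic data, a=2, b=0.8
a_hat, b_hat = bat_fit(xs, ys)
```

The same loop works for any residual-based objective, which is why population metaheuristics are attractive when the likelihood surface defeats gradient-based MLE or LS.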
Image compression has become one of the most important applications of the image processing field because of the rapid growth in computer power, the corresponding growth of the multimedia market, and the advent of the World Wide Web, which makes the internet easily accessible to everyone. Since the early 1980s, digital image sequence processing has been an attractive research area because an image sequence, as a collection of images, may allow much greater compression than a single image frame, and the increased computational complexity and memory space required for image sequence processing have in fact become more attainable. This research uses the Absolute Moment Block Truncation compression technique, which depends on adopting the good points of oth…
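Absolute Moment Block Truncation Coding (AMBTC) replaces each image block with a one-bit-per-pixel bitmap plus two reconstruction levels: the mean of the pixels at or above the block mean and the mean of those below it. A minimal sketch on one 4x4 block (sample values are illustrative):

```python
def ambtc_block(block):
    """Encode one block as (bitmap, low mean, high mean)."""
    flat = [p for row in block for p in row]
    mean = sum(flat) / len(flat)
    hi = [p for p in flat if p >= mean]
    lo = [p for p in flat if p < mean]
    hi_mean = round(sum(hi) / len(hi)) if hi else 0
    lo_mean = round(sum(lo) / len(lo)) if lo else 0
    bitmap = [1 if p >= mean else 0 for p in flat]
    return bitmap, lo_mean, hi_mean

def ambtc_decode(bitmap, lo_mean, hi_mean, w):
    # Rebuild the block: each pixel becomes one of the two levels.
    pixels = [hi_mean if b else lo_mean for b in bitmap]
    return [pixels[i:i + w] for i in range(0, len(pixels), w)]

block = [[10, 12, 200, 210],
         [11, 13, 198, 205],
         [ 9, 12, 201, 208],
         [10, 14, 199, 207]]
bitmap, lo_m, hi_m = ambtc_block(block)
recon = ambtc_decode(bitmap, lo_m, hi_m, 4)
```

For an 8-bit 4x4 block this stores 16 bits of bitmap plus two bytes instead of 16 bytes, a fixed 4:1 ratio before any entropy coding.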
Nowadays it is convenient to use a search engine to get the information we need, but sometimes that information is misunderstood because of differing media reports. The Recommender System (RS) is popular in every business, since it can provide information to users and thereby attract more revenue for companies; however, sometimes the system recommends information users do not need. Because of this, this paper proposes a recommender system architecture based on user-oriented preference, called UOP-RS. To make the UOP-RS significant, this paper focuses on movie theatre information and collects the movie database from the IMDb website, which provides informatio…
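One simple way to realize a user-oriented preference ranking is content-based scoring: compare each item's feature vector with a profile built from the user's history. The genre axes, titles, and scoring function below are illustrative assumptions, not the UOP-RS design:

```python
from math import sqrt

def cosine(u, v):
    # Cosine similarity between two equal-length vectors.
    num = sum(a * b for a, b in zip(u, v))
    den = sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v))
    return num / den if den else 0.0

def recommend(profile, catalog, top_k=2):
    """Rank movies by similarity of their genre vector to the user profile."""
    ranked = sorted(catalog.items(),
                    key=lambda kv: cosine(profile, kv[1]), reverse=True)
    return [title for title, _ in ranked[:top_k]]

# Genre axes (illustrative): [action, drama, comedy]
profile = [0.9, 0.1, 0.0]   # built from the user's viewing history
catalog = {"Heat": [1, 0, 0],
           "Marriage Story": [0, 1, 0],
           "Airplane!": [0, 0, 1]}
print(recommend(profile, catalog, top_k=1))  # ['Heat']
```

A real system would learn the profile from ratings or viewing logs rather than hard-coding it.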
In light of the increasing demand for energy consumption due to the complexity of modern life and its requirements, which is reflected in architecture in both type and size, environmental challenges have emerged in the need to reduce emissions and power consumption within the construction sector. This has urged designers to improve the environmental performance of buildings by adopting new design approaches and investing in digital technology to facilitate design decision-making in less time, with less effort, and at lower cost. Such approaches do not stop at the limits of acceptable efficiency but extend to the level of the highest performance, which is not provided by the traditional approaches adopted by researchers and local institutions in their studies and architectural practices, limit…
In recent years, social media has grown widely and visibly as a medium through which users express their emotions and feelings in thousands of posts and comments about tourism companies. As a consequence, it has become difficult for tourists to read all the comments and determine whether these opinions are positive or negative in order to assess the success of a tourism company. In this paper, a modest model is proposed to assess e-tourism companies using Iraqi-dialect reviews collected from Facebook. The reviews are analyzed using text mining techniques for sentiment classification. The generated sentiment words are classified into positive, negative, and neutral comments by utilizing Rough Set Theory, Naïve Bayes, and K-Nearest Neighbor…
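Of the three classifiers named, Naïve Bayes is the easiest to show compactly. The sketch below trains a multinomial Naïve Bayes with add-one smoothing on tiny English stand-in reviews (the data is illustrative; the paper's corpus is Iraqi-dialect Facebook text):

```python
from collections import Counter
from math import log

def train_nb(samples):
    """`samples` is a list of (tokens, label) pairs."""
    labels = Counter(lbl for _, lbl in samples)
    words = {lbl: Counter() for lbl in labels}
    vocab = set()
    for toks, lbl in samples:
        words[lbl].update(toks)
        vocab.update(toks)
    return labels, words, vocab

def classify(tokens, labels, words, vocab):
    total = sum(labels.values())
    def score(lbl):
        # log prior + add-one-smoothed log likelihood per token
        n = sum(words[lbl].values())
        s = log(labels[lbl] / total)
        for t in tokens:
            s += log((words[lbl][t] + 1) / (n + len(vocab)))
        return s
    return max(labels, key=score)

samples = [(["great", "service"], "pos"), (["loved", "it"], "pos"),
           (["bad", "hotel"], "neg"), (["terrible", "service"], "neg")]
model = train_nb(samples)
print(classify(["great", "service"], *model))  # pos
```

KNN would instead vote among the nearest labelled reviews in a vector space, and Rough Set Theory would derive classification rules from attribute reducts; all three can share the same tokenized input.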
The production of fission products during reactor operation has a very important effect on reactor reactivity. Results of neutron cross-section evaluations are presented for the main fission product nuclides considered the most important for reactor calculations and burn-up considerations. Data from the main international libraries, regarded as containing the most up-to-date nuclear data and the latest experimental measurements, are used in the evaluation processes. We describe the evaluated cross sections of the fission product nuclides by inter-comparing the data and point out the discrepancies among the libraries.