Genetic algorithms (GAs) are a population-based approach. They belong to the class of metaheuristic procedures that use population characteristics to guide the search. A GA maintains and improves multiple candidate solutions, which may yield a high-quality solution to an optimization problem. This study presents a comprehensive survey of GAs. We present and discuss genetic algorithms for new researchers, illustrate the components that make up a GA, and review the main results on time complexity.
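The population-based search the abstract describes can be illustrated with a minimal GA sketch; the operators chosen here (tournament selection, one-point crossover, bit-flip mutation) and all parameter values are illustrative assumptions, not taken from the survey:

```python
import random

def genetic_algorithm(fitness, length=20, pop_size=30, generations=100,
                      mutation_rate=0.01, seed=0):
    """Minimal GA sketch: tournament selection, one-point crossover,
    bit-flip mutation. All parameters are illustrative choices."""
    rng = random.Random(seed)
    # The maintained population of candidate solutions (bit strings).
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]

    def tournament():
        a, b = rng.sample(pop, 2)
        return a if fitness(a) >= fitness(b) else b

    for _ in range(generations):
        new_pop = []
        while len(new_pop) < pop_size:
            p1, p2 = tournament(), tournament()
            cut = rng.randrange(1, length)             # one-point crossover
            child = p1[:cut] + p2[cut:]
            # Flip each bit with probability mutation_rate (True == 1 in Python).
            child = [bit ^ (rng.random() < mutation_rate) for bit in child]
            new_pop.append(child)
        pop = new_pop
    return max(pop, key=fitness)

# OneMax toy problem: maximise the number of 1-bits in the chromosome.
best = genetic_algorithm(sum)
```

On OneMax the population quickly converges toward the all-ones string, which illustrates how maintaining and recombining multiple solutions drives the search.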
Steganography is an important class of security technique that is widely used in computer and network security today. In this research, a new algorithm is introduced with a new concept: treating steganography as an algorithmic secret-key technique, similar to a stream-cipher cryptographic system. The proposed algorithm is a secret-key system intended for message transmission in communications.
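The abstract does not give the algorithm's details, but the idea of steganography as a keyed, stream-cipher-like technique can be sketched hypothetically: a shared key seeds a keystream that is XORed with the message bits before they are hidden in the least significant bits of a cover. Everything below (function names, LSB embedding, `random.Random` as keystream source) is an illustrative assumption, not the paper's scheme:

```python
import random

def embed(cover, message, key):
    """Hypothetical sketch: XOR message bits with a key-seeded keystream
    (stream-cipher style), then write them into the LSBs of cover bytes."""
    rng = random.Random(key)
    bits = [(byte >> i) & 1 for byte in message for i in range(8)]
    stego = list(cover)
    for pos, bit in enumerate(bits):
        k = rng.randint(0, 1)                       # keystream bit
        stego[pos] = (stego[pos] & ~1) | (bit ^ k)  # masked bit into LSB
    return bytes(stego)

def extract(stego, length, key):
    """Regenerate the same keystream from the key and undo the XOR."""
    rng = random.Random(key)
    bits = [(stego[pos] & 1) ^ rng.randint(0, 1) for pos in range(length * 8)]
    return bytes(sum(bits[i * 8 + j] << j for j in range(8)) for i in range(length))
```

Without the key, an observer who finds the LSB channel still recovers only keystream-masked bits, which is what makes the construction analogous to a stream cipher.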
Vascular patterns were identified as a probable identifying characteristic for biometric systems. Since then, many studies have investigated and proposed different techniques that exploit this feature for identification and verification purposes. Conventional biometric features such as the iris, fingerprints, and the face have been thoroughly investigated; however, during the past few years, finger vein patterns have been recognized as a reliable biometric feature. This study discusses the application of the vein biometric system. Although the vein pattern is a very appealing topic of research, there are many challenges in this field and some improvements still need to be made. Here, the researchers reviewed
The Twofish cipher is a very powerful algorithm with a fairly complex structure that mixes data thoroughly through parsing and swapping, and it can be easily implemented. Twofish keys are of variable length (128, 192, or 256 bits), and the key schedule is generated once and reused for encrypting all message blocks, whatever their number, which reduces the confidentiality of the encryption. This article discusses a process for generating cipher keys for each block, a concept that is new and not found in common block cipher algorithms. It is based on the continuous generation of subkeys for all blocks, each according to its role in the key generation process. The Geffe generator is used to generate subkeys to make eac
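The Geffe generator mentioned above is a classical keystream construction: three LFSRs whose outputs are combined as z = (a AND b) XOR ((NOT b) AND c), so the second register selects between the other two. A minimal sketch follows; the register lengths, tap positions, and seeds are illustrative assumptions, not the article's parameters:

```python
def lfsr(state, taps, n):
    """Yield n bits from a Fibonacci LFSR with the given initial state
    and feedback tap positions (both illustrative choices)."""
    state = list(state)
    for _ in range(n):
        out = state[-1]
        fb = 0
        for t in taps:
            fb ^= state[t]
        state = [fb] + state[:-1]   # shift in the feedback bit
        yield out

def geffe(s1, s2, s3, n):
    """Geffe combiner: z = (a AND b) XOR ((NOT b) AND c)."""
    for a, b, c in zip(lfsr(s1, (0, 2), n), lfsr(s2, (0, 3), n), lfsr(s3, (0, 4), n)):
        yield (a & b) ^ ((1 ^ b) & c)

keystream = list(geffe([1, 0, 1], [1, 1, 0, 1], [1, 0, 0, 1, 1], 32))
```

Each block of the message could then be encrypted under a fresh run of keystream bits, which is the per-block subkey idea the abstract describes.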
The study includes the collection of data on cholera from six health centers across nine locations covering 2,500 km2 and a population of 750,000 individuals. The average infection rate for the six centers during 2000-2003 was recorded. There were 3,007 cases of diarrhea diagnosed as cholera caused by Vibrio cholerae. The infection percentage was 14.7% for males and 13.2% for females. The infection percentage for children under one year was 6.1%, for ages 1-5 years it was 6.9%, and for ages over 5 years it was 14.5%. The total percentage of patients who stayed in hospital was 7.7% (4.2% for males and 3.4% for females). The bacteria were isolated and identified from 7 cases in the Central Laboratory for Health in Baghdad. In
The great progress in information and communication technology has led to a huge increase in the data available. Traditional systems cannot keep up with this growth or handle this huge amount of data. Recommendation systems are one of the most important areas of research today because they help people make decisions and find what they want among all this data. This study examined research trends published in Google Scholar during 2018-2022 related to recommending, reviewing, analysing, and comparing e-book research papers. First, the research papers were collected and classified based on the recommendation model used and the year of publication, and then they were compared in terms of techniques, datasets u
A new gravity and seismic survey along a 70 km profile was carried out in the area of exploration Block 11 in the Al-Najaf desert, southwest Iraq. The new gravity and seismic values were compared with the previously available data acquired by IPC (gravity) and OEC (seismic). The difference between the new gravity profile and the old one is mainly a small anomaly of no more than 14%, which is related to shallow depth levels and is revealed through power spectrum analysis. Previously, the Ghulaissan-1 well, drilled in 1960, depended on the interpretation of a positive gravity and magnetic anomaly, considered an anticline structure in the subsurface. It did not indicate any hydrocarbon shows after drilling; the integration
Because of the threats and attacks against databases during transmission from sender to receiver, which are among the foremost security concerns of network users, a lightweight cryptosystem using the Rivest Cipher 4 (RC4) algorithm is proposed. This cryptosystem maintains data privacy by encrypting the data into cipher form, transferring it over the network, and then decrypting it back to the original data. Hence, the ciphers form an encapsulating system for database tables.
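RC4 itself is a well-known stream cipher with two phases, a key-scheduling algorithm (KSA) that builds a key-dependent permutation and a pseudo-random generation algorithm (PRGA) that emits the keystream; because encryption is a keystream XOR, the same function both encrypts and decrypts. A minimal sketch of standard RC4 (not the paper's specific database scheme):

```python
def rc4(key, data):
    """Standard RC4: KSA builds a key-dependent permutation S,
    PRGA emits keystream bytes XORed with the data.
    Encryption and decryption are the same operation."""
    S = list(range(256))
    j = 0
    for i in range(256):                            # KSA
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    i = j = 0
    out = bytearray()
    for byte in data:                               # PRGA + XOR
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(byte ^ S[(S[i] + S[j]) % 256])
    return bytes(out)
```

Applying `rc4` twice with the same key returns the original data, which is what lets the same routine serve both ends of the database transmission.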
The analysis of the hyperlink structure of the web has led to significant improvements in web information retrieval. This survey study evaluates and analyzes relevant research publications on link analysis in web information retrieval that utilize diverse methods. The factors considered include the research year, the aims of each article, the algorithms used to complete the study, and the findings obtained after applying those algorithms. The findings revealed that PageRank, Weighted PageRank, and Weighted Page Content Rank are extensively employed by academics to properly analyze hyperlinks in web information retrieval. Finally, this paper analyzes the previous studies.
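Of the algorithms the survey highlights, PageRank is the simplest to sketch: a page's score is the stationary probability of a random surfer who follows links with probability d and jumps to a random page otherwise. A small power-iteration sketch (the toy graph, damping factor, and iteration count are illustrative):

```python
def pagerank(links, d=0.85, iters=50):
    """Power-iteration PageRank over a dict {page: [outgoing links]}."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - d) / n for p in pages}       # random-jump term
        for p, outs in links.items():
            # A dangling page (no out-links) spreads its rank evenly.
            targets = outs if outs else pages
            share = rank[p] / len(targets)
            for q in targets:
                new[q] += d * share
        rank = new
    return rank

# Toy web graph: A links to B and C, B links to C, C links back to A.
ranks = pagerank({"A": ["B", "C"], "B": ["C"], "C": ["A"]})
```

Weighted PageRank and Weighted Page Content Rank refine this scheme by weighting the `share` term by link popularity and page content rather than splitting it evenly.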
During the last two decades, audio compression has become the topic of much research due to the importance of this field, which bears on both storage capacity and transmission requirements. The rapid development of the computer industry has increased the demand for high-quality audio data, and accordingly the development of audio compression technologies is of great importance; lossy and lossless are the two categories of compression. This paper aims to review the techniques of lossy audio compression and to summarize the importance and uses of each method.
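As one concrete instance of the lossy category, μ-law companding (standardized in ITU-T G.711 telephony) maps amplitudes through a logarithmic curve so that subsequent coarse quantization keeps more resolution near zero, where hearing is most sensitive; the quantization step is where information is lost. A small sketch of the companding curves themselves:

```python
import math

def mu_law_compress(sample, mu=255):
    """Mu-law companding curve (G.711): compress a sample in [-1, 1]
    logarithmically so coarse quantization spends its levels near zero."""
    return math.copysign(math.log1p(mu * abs(sample)) / math.log1p(mu), sample)

def mu_law_expand(y, mu=255):
    """Inverse of the companding curve, applied after dequantization."""
    return math.copysign((math.exp(abs(y) * math.log1p(mu)) - 1.0) / mu, y)
```

The curves themselves are invertible; the lossiness of the full codec comes from rounding the compressed value to a small number of levels (8 bits in G.711) before expanding.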