This paper proposes a new encryption method that combines two cipher algorithms, DES and AES, to generate hybrid keys. This combination strengthens the proposed W-method by producing highly randomized keys. The reliability of any encryption technique rests on two points. The first is key generation: our approach merges 64 bits from DES with 64 bits from AES to produce a 128-bit root key from which the remaining 15 keys are derived, and each derivation shifts the key only one bit to the right. This complexity raises the strength of the ciphering process. The second is the nature of the encryption process itself: it uses two keys and mixes one round of DES with one round of AES to reduce processing time. The W-method handles Arabic and English texts with the same efficiency. The results showed that the proposed method is faster and more secure than the standard DES and AES algorithms.
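As an illustration of the key-schedule idea, the sketch below concatenates a 64-bit DES-derived key and a 64-bit AES-derived key into a 128-bit root key and derives 15 further keys by rotating one bit to the right at each step. The function names and the exact derivation rule are assumptions made for illustration; the abstract does not specify the W-method's full key schedule.

```python
# Illustrative sketch only: concatenate two 64-bit keys into a 128-bit root key,
# then derive 15 further keys by a 1-bit right rotation per step.

def rotate_right_128(value: int, bits: int = 1) -> int:
    """Rotate a 128-bit integer right by `bits` positions."""
    mask = (1 << 128) - 1
    return ((value >> bits) | (value << (128 - bits))) & mask

def derive_round_keys(des_key_64: int, aes_key_64: int, rounds: int = 15) -> list[int]:
    """Build the 128-bit root key and derive `rounds` keys by 1-bit right rotation."""
    root = ((des_key_64 & ((1 << 64) - 1)) << 64) | (aes_key_64 & ((1 << 64) - 1))
    keys = [root]
    for _ in range(rounds):
        keys.append(rotate_right_128(keys[-1]))
    return keys

# Example usage with arbitrary demo key material
round_keys = derive_round_keys(0x0123456789ABCDEF, 0xFEDCBA9876543210)
print(f"{len(round_keys) - 1} derived keys, first: {round_keys[1]:032x}")
```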
Twitter data analysis is an emerging field of research that uses data collected from Twitter to address many issues, such as disaster response, sentiment analysis, and demographic studies. The success of data analysis relies on collecting accurate data that are representative of the studied group or phenomenon. Various Twitter analysis applications rely on collecting the locations of the users sending the tweets, but this information is not always available. There are several attempts at estimating location-based aspects of a tweet; however, there are few attempts to investigate data collection methods that focus on location. In this paper, we investigate the two methods for obtaining location-based data …
Uncompressed digital images require a very large amount of storage capacity and, as a consequence, large communication bandwidth for transmission over a network. Image compression techniques not only minimize the image storage space but also preserve image quality. This paper presents an image compression technique that uses a distinct image coding scheme based on the wavelet transform, combining effective compression algorithms for further compression. EZW and SPIHT are significant techniques available for lossy image compression. EZW coding is a worthwhile, simple, and efficient algorithm, while SPIHT is a powerful technique used for image …
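For orientation, the sketch below shows only the wavelet front end that EZW/SPIHT-style coders build on: a multi-level 2-D wavelet decomposition, hard thresholding of small coefficients, and reconstruction. It is not an implementation of the EZW or SPIHT bit-plane coders themselves, and the wavelet, level, and threshold values are arbitrary choices.

```python
# Wavelet front-end sketch: decompose, discard insignificant coefficients,
# reconstruct. This illustrates the transform stage only, not EZW/SPIHT coding.
import numpy as np
import pywt

def threshold_wavelet_compress(image: np.ndarray, wavelet: str = "haar",
                               level: int = 3, threshold: float = 10.0) -> np.ndarray:
    coeffs = pywt.wavedec2(image, wavelet, level=level)          # multi-level 2-D DWT
    coeffs = [pywt.threshold(c, threshold, mode="hard") if isinstance(c, np.ndarray)
              else tuple(pywt.threshold(d, threshold, mode="hard") for d in c)
              for c in coeffs]                                   # zero out small coefficients
    return pywt.waverec2(coeffs, wavelet)                        # approximate reconstruction

# Example on a synthetic 64x64 gradient "image"
img = np.tile(np.arange(64, dtype=float), (64, 1))
approx = threshold_wavelet_compress(img)
print("max reconstruction error:", np.abs(approx - img).max())
```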
Steganography hides information within other information, such as a file, message, picture, or video. Cryptography is the science of converting information from a readable form into an unreadable form for unauthorized persons. The main problem in a steganographic system is embedding data in the cover without providing information that would facilitate its removal. In this research, a method for embedding data into images is suggested that employs least significant bit (LSB) steganography and ciphering (the RSA algorithm) to protect the data. System security is enhanced by this collaboration between steganography and cryptography.
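A minimal sketch of the LSB embedding stage follows, assuming the payload has already been encrypted with RSA beforehand; the pixel layout, payload framing, and helper names are illustrative assumptions rather than the paper's exact scheme.

```python
# LSB embedding sketch: write payload bits (assumed RSA ciphertext) into the
# least significant bit of successive pixel values; extraction reverses it.
import numpy as np

def lsb_embed(pixels: np.ndarray, payload: bytes) -> np.ndarray:
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    flat = pixels.flatten().copy()
    if bits.size > flat.size:
        raise ValueError("cover image too small for payload")
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits   # overwrite LSB with payload bit
    return flat.reshape(pixels.shape)

def lsb_extract(pixels: np.ndarray, n_bytes: int) -> bytes:
    bits = (pixels.flatten()[:n_bytes * 8] & 1).astype(np.uint8)
    return np.packbits(bits).tobytes()

# Example: embed a short ciphertext-like payload into a random 8-bit cover image
cover = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)
secret = b"\x9a\x11\xf3\x07"          # stand-in for RSA ciphertext
stego = lsb_embed(cover, secret)
assert lsb_extract(stego, len(secret)) == secret
```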
One of the main difficulties facing certified-document archiving systems is checking stamps, since a stamp may contain a complex background and be surrounded by unwanted data. Therefore, the main objective of this paper is to isolate the background and remove noise that may surround the stamp. Our proposed method comprises four phases. First, we apply the k-means algorithm to cluster the stamp image into a number of clusters and merge them using the ISODATA algorithm. Second, we compute the mean and standard deviation of each remaining cluster to isolate the background cluster from the stamp cluster. Third, a region-growing algorithm is applied to segment the image and then choose the connected regions …
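The sketch below illustrates only the first statistics-gathering steps described above: k-means clustering of pixel values followed by per-cluster mean and standard deviation. The ISODATA merging, background selection rule, and region-growing phases are omitted, and the number of clusters is an arbitrary assumption.

```python
# Cluster stamp pixels with k-means and compute per-cluster statistics used to
# separate background from stamp. Later phases of the method are not shown.
import numpy as np
from sklearn.cluster import KMeans

def cluster_stamp_pixels(image: np.ndarray, n_clusters: int = 4):
    pixels = image.reshape(-1, image.shape[-1]).astype(float)   # (H*W, channels)
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(pixels)
    stats = {k: (pixels[labels == k].mean(axis=0), pixels[labels == k].std(axis=0))
             for k in range(n_clusters)}                        # per-cluster mean / std
    return labels.reshape(image.shape[:2]), stats

# Example with a synthetic RGB image
img = np.random.randint(0, 256, size=(32, 32, 3))
label_map, cluster_stats = cluster_stamp_pixels(img)
for k, (mean, std) in cluster_stats.items():
    print(f"cluster {k}: mean={mean.round(1)}, std={std.round(1)}")
```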
Acinetobacter baumannii (A. baumannii) is considered a critical healthcare problem for patients in intensive care units due to its high capacity for multidrug resistance against most commercially available antibiotics. The aim of this study is to develop a colorimetric assay to quantitatively detect the target DNA of A. baumannii based on unmodified gold nanoparticles (AuNPs) from different clinical samples (burns, surgical wounds, sputum, blood, and urine). A total of thirty-six A. baumannii clinical isolates were collected from five Iraqi hospitals in Erbil and Mosul provinces within the period from September 2020 to January 2021. Bacterial isolation and biochemical identification of isolates …
Abstract. Full-waveform airborne laser scanning data has shown its potential to enhance available segmentation and classification approaches through the additional information it can provide. However, this additional information cannot directly provide a valid physical representation of surface features, because many variables affect the backscattered energy during its travel between the sensor and the target. In effect, this produces a mismatch between signals from overlapping flightlines. Therefore, direct use of this information is not recommended without the adoption of a comprehensive radiometric calibration strategy that accounts for all these effects. This paper presents a practical and reliable radiometric calibration r…
Wireless Sensor Networks (WSNs) are promoting the spread of the Internet to devices in all areas of life, which makes them a promising technology for the future. As attack technologies improve, security will play an increasingly important role in WSNs. Quantum computers pose a significant risk to the current encryption technologies that work in tandem with intrusion detection systems, and it is difficult to implement quantum properties on sensors because of their resource limitations. In this paper, quantum computing is used to develop a future-proof, robust, lightweight, and resource-conscious approach to sensor networks. Great emphasis is placed on the concepts of using the BB8…
The focus of this paper is the presentation of a new type of mapping, called the projection Jungck zn-Suzuki generalized mapping, and the definition of new one-step and two-step algorithms (the projection Jungck-normal N algorithm, projection Jungck-Picard algorithm, projection Jungck-Krasnoselskii algorithm, and projection Jungck-Thianwan algorithm). The convergence of these algorithms has been studied, and it was found that they all converge to a fixed point. Furthermore, using the three conditions of the preceding lemma, we demonstrated that the difference between any two of these sequences is zero. The stability of these algorithms was demonstrated using the projection Jungck-Suzuki generalized mapping. In contrast, the rate of convergence …
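For orientation, the display below recalls the classical (non-projection) Jungck-type iteration schemes on which the named algorithms are modelled, for maps S, T with T(Y) contained in S(Y). This is background from the fixed-point literature, not the paper's own definitions; the projection variants additionally compose each step with a projection operator.

```latex
% Classical Jungck-type schemes (background only)
\begin{align*}
  \text{Jungck-Picard:} \quad
    & S x_{n+1} = T x_n, \\
  \text{Jungck-Krasnoselskii:} \quad
    & S x_{n+1} = (1-\lambda)\, S x_n + \lambda\, T x_n, \qquad \lambda \in (0,1), \\
  \text{Jungck-Thianwan (two-step):} \quad
    & S y_n = (1-\beta_n)\, S x_n + \beta_n\, T x_n, \\
    & S x_{n+1} = (1-\alpha_n)\, S y_n + \alpha_n\, T y_n .
\end{align*}
```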