The Internet provides vital communications between millions of individuals and is increasingly used as a tool of commerce, so security is of high importance for protecting communications and vital information. Cryptographic algorithms are essential in the field of security. Brute-force attacks are the major attacks on the Data Encryption Standard (DES), which is the main reason an improved DES structure is needed. This paper proposes a new, improved DES structure intended to make the cipher secure and immune to attacks. The improved structure is built from standard DES with a new method of generating two keys: one key is simple, and the other is encrypted using an improved Caesar algorithm. The encryption algorithm uses the simple key 1 in the first 8 rounds and the encrypted key 2 from round 9 to round 16. The results show that the improved DES structure increases encryption security, performance, and search complexity compared with standard DES, meaning that differential cryptanalysis cannot be performed on the ciphertext.
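The two-key idea above can be sketched in a few lines. This is an illustrative simplification, not the paper's exact key schedule: the function names, the byte-wise Caesar shift, and the shift amount are all hypothetical.

```python
# Sketch of the two-key generation idea: key 2 is derived from key 1 by a
# Caesar-style byte shift, and a selector picks key 1 for rounds 1-8 and
# key 2 for rounds 9-16. All names and the shift amount are hypothetical.

def caesar_encrypt_key(key: bytes, shift: int = 3) -> bytes:
    """Shift each key byte by a fixed amount modulo 256."""
    return bytes((b + shift) % 256 for b in key)

def round_key(key1: bytes, key2: bytes, round_no: int) -> bytes:
    """Select the simple key for rounds 1-8, the encrypted key afterwards."""
    return key1 if round_no <= 8 else key2

key1 = bytes.fromhex("133457799bbcdff1")   # a classic DES example key
key2 = caesar_encrypt_key(key1)

assert round_key(key1, key2, 1) == key1
assert round_key(key1, key2, 16) == key2
```

Shifting with the negated amount inverts the derivation, so the receiver can recover key 1 from key 2 if needed.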
Twitter data analysis is an emerging field of research that uses data collected from Twitter to address issues such as disaster response, sentiment analysis, and demographic studies. The success of the analysis relies on collecting accurate data that are representative of the studied group or phenomenon. Various Twitter analysis applications rely on collecting the locations of the users sending the tweets, but this information is not always available. There have been several attempts at estimating location-based aspects of a tweet; however, there is a lack of work investigating data collection methods that focus on location. In this paper, we investigate the two methods for obtaining location-based data…
Using the Turbo C programming language, an atmospheric model of the Earth is created from sea level to 86 km and used to determine atmospheric parameters in this study. Analytical derivations of these parameters are made using the balance-of-forces theory and the hydrostatic equation. The effects of altitude on density, pressure, temperature, gravitational acceleration, speed of sound, scale height, and molecular weight are examined. About 50% of the atmosphere's mass lies between sea level and 5.5 km. At 50 km altitude, g is about 9.65 m/s², roughly 1.5% lower than the sea-level value of 9.8 m/s², while at 86 km altitude g is close to 9.51 m/s², about 3% smaller than 9.8 m/s². These results…
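The altitude dependence of g quoted above follows from the inverse-square law, g(h) = g₀·(Rₑ/(Rₑ+h))². A minimal sketch, assuming the standard round values g₀ = 9.8 m/s² and Rₑ = 6371 km (not necessarily the constants used in the paper):

```python
# Gravitational acceleration versus altitude from the inverse-square law.
# Constants are standard round values, possibly differing from the paper's.

G0 = 9.8        # sea-level gravitational acceleration, m/s^2
RE = 6371e3     # mean Earth radius, m

def gravity(h_m: float) -> float:
    """Gravitational acceleration at h_m metres above sea level."""
    return G0 * (RE / (RE + h_m)) ** 2

print(round(gravity(50e3), 2))   # -> 9.65, matching the value quoted at 50 km
```

At 86 km this formula gives about 9.54 m/s², slightly above the paper's 9.51; the small gap likely reflects different constants or a more detailed model.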
The right of the patient to know the medical risks surrounding a medical intervention is one of the most prominent rights based on the principle of physical safety. This right has passed through several stages of development, culminating in the patient's independence in making medical decisions without relying on the doctor: the patient gives prior informed consent after being informed of his or her medical condition. We study this development in light of the French legislation of 4 March 2002 on the rights of patients in the health system, both before and after its enactment, and highlight the development of the patient's right to know the medical risks surrounding medical intervention in that legislation and its comparison with the…
Protecting information sent through insecure Internet channels is a significant challenge facing researchers. In this paper, we present a novel method for image data encryption that combines chaotic maps with linear feedback shift registers (LFSRs) in two stages. In the first stage, the image is divided into two parts, and the pixel locations of each part are redistributed using a random-number key generated by an LFSR. In the second stage, the image is segmented into the three primary colors red, green, and blue (RGB), and the data for each color are encrypted with one of three keys generated using three-dimensional chaotic maps. Many statistical tests (entropy, peak signal-to-noise ratio, …
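The LFSR stage can be illustrated with a standard 16-bit Fibonacci LFSR (maximal-length taps 16, 14, 13, 11) whose key stream is used to shuffle pixel positions. The seed and the shuffle rule below are illustrative, not the paper's exact parameters.

```python
# A 16-bit Fibonacci LFSR with taps 16, 14, 13, 11 (a maximal-length
# polynomial), used here to derive a pixel-position permutation.
# Seed and shuffle rule are illustrative, not the paper's parameters.

def lfsr16(seed: int, n: int) -> list:
    """Return n successive 16-bit LFSR states."""
    state = seed & 0xFFFF
    out = []
    for _ in range(n):
        bit = ((state >> 0) ^ (state >> 2) ^ (state >> 3) ^ (state >> 5)) & 1
        state = (state >> 1) | (bit << 15)
        out.append(state)
    return out

# Redistribute 16 pixel positions by ranking the key-stream values.
stream = lfsr16(0xACE1, 16)
perm = sorted(range(16), key=lambda i: stream[i])
assert sorted(perm) == list(range(16))   # a valid permutation of positions
```

Because the LFSR is deterministic, the receiver regenerates the same stream from the shared seed and inverts the permutation to recover the pixel order.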
The most likely fusion reaction to be practical is Deuterium–Helium-3 (D–He3), which is highly desirable because both Helium-3 and Deuterium are stable and the reaction produces a 14 MeV proton instead of a neutron; the proton can be shielded by magnetic fields. The strong dependence of the basic hot-plasma parameters, such as reactivity, reaction rate, and the energy of the emitted protons, on the total cross section makes choosing a suitable formula for the cross section the main goal of the present work.
Digital images in uncompressed form need a very large amount of storage capacity and, consequently, large communication bandwidth for transmission over a network. Image compression techniques not only minimize the image storage space but also preserve image quality. This paper presents an image compression technique that uses a distinct image coding scheme based on the wavelet transform, combining effective compression algorithms for further compression. EZW and SPIHT are significant techniques available for lossy image compression. EZW coding is a worthwhile, simple, and efficient algorithm, while SPIHT is a powerful technique used for image…
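The wavelet idea behind coders such as EZW and SPIHT can be sketched with a one-level Haar transform: it concentrates signal energy into a few coefficients, so small detail coefficients can be zeroed with little visible loss. This is an illustrative simplification of the transform stage only, not the paper's coder.

```python
# One-level 1-D Haar transform: averages followed by differences.
# Small detail coefficients are thresholded to zero, mimicking the
# energy-compaction step that EZW/SPIHT coders exploit. Illustrative only.

def haar_1d(row):
    """Forward one-level Haar transform of an even-length sequence."""
    avg = [(row[i] + row[i + 1]) / 2 for i in range(0, len(row), 2)]
    dif = [(row[i] - row[i + 1]) / 2 for i in range(0, len(row), 2)]
    return avg + dif

def inv_haar_1d(coeffs):
    """Inverse of haar_1d."""
    half = len(coeffs) // 2
    out = []
    for a, d in zip(coeffs[:half], coeffs[half:]):
        out += [a + d, a - d]
    return out

row = [10, 12, 11, 13, 200, 202, 50, 48]     # one image row (example values)
coeffs = haar_1d(row)
# Zero detail coefficients below a threshold of 1.5 before "transmission".
compressed = coeffs[:4] + [d if abs(d) > 1.5 else 0 for d in coeffs[4:]]
restored = inv_haar_1d(compressed)
```

Here all four detail coefficients fall below the threshold, so the restored row is a close piecewise approximation of the original; a real coder would then entropy-code the significant coefficients across multiple levels.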
A high peak-to-average power ratio (PAPR) in orthogonal frequency division multiplexing (OFDM) is an important problem that increases the cost and complexity of high-power amplifiers. One of the techniques used to reduce PAPR in OFDM systems is the tone reservation (TR) method. In our work, we propose a modified tone reservation method that decreases PAPR with low complexity compared with the conventional TR method by processing the high and low amplitudes at the same time. An image of size 128×128 is used as the source of data transmitted through the OFDM system. The proposed method decreases PAPR by 2 dB compared with the conventional method while keeping performance unchanged. The performance of the proposed method is tested with…
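PAPR itself is the ratio of the peak instantaneous power of the time-domain OFDM signal to its average power. A minimal sketch, using a direct inverse DFT and arbitrary example subcarrier values (not the paper's system parameters):

```python
import cmath
import math

def papr_db(symbols):
    """PAPR in dB of the time-domain signal for one OFDM symbol."""
    n = len(symbols)
    # Inverse DFT: map frequency-domain subcarriers to time samples.
    x = [sum(s * cmath.exp(2j * cmath.pi * k * t / n)
             for k, s in enumerate(symbols)) / n
         for t in range(n)]
    powers = [abs(v) ** 2 for v in x]
    return 10 * math.log10(max(powers) / (sum(powers) / n))

# Identical subcarriers add coherently into one impulse: the worst case,
# PAPR = 10*log10(N) dB for N subcarriers.
print(round(papr_db([1] * 16), 2))   # -> 12.04
```

Tone reservation attacks exactly this peak: a few reserved subcarriers carry a correction signal chosen to cancel the largest time-domain peaks without disturbing the data tones.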
There are many methods of searching a large amount of data to find one particular piece of information, such as finding a person's name in a mobile phone's contacts. Certain methods of organizing data make the search process more efficient; their objective is to find the element with the least cost (least time). The binary search algorithm is faster than sequential search and other commonly used search algorithms. This research develops the binary search algorithm using a new structure called Triple, in which data are represented as triples consisting of three locations (1-Top, 2-Left, and 3-Right). Binary search divides the search interval in half, which makes the maximum number of comparisons (average-case com…
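For reference, the baseline the paper improves upon is ordinary iterative binary search over a sorted sequence, halving the interval each step for O(log n) comparisons. The abstract does not specify the Triple variant in full, so only the standard algorithm is sketched here; the example names are illustrative.

```python
# Baseline iterative binary search on a sorted list (O(log n) comparisons).
# The paper's Triple-based variant is not fully specified in the abstract.

def binary_search(arr, target):
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2          # halve the search interval
        if arr[mid] == target:
            return mid
        if arr[mid] < target:
            lo = mid + 1              # discard the left half
        else:
            hi = mid - 1              # discard the right half
    return -1                          # not found

names = ["Ali", "Hana", "Omar", "Sara", "Zaid"]   # e.g. a sorted contact list
print(binary_search(names, "Omar"))   # -> 2
```

Each Triple node's Top/Left/Right layout suggests a tree-shaped arrangement of the same comparisons, with the Top element playing the role of the midpoint.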