Recently, digital communication has become a critical necessity, and the Internet is now the most widely used and most efficient medium for it. At the same time, data transmitted over the Internet are increasingly vulnerable, so maintaining the secrecy of data is very important, especially when the data are personal or confidential. Steganography provides a reliable method for solving such problems: it is an effective technique for secret communication in a digital world where data sharing and transfer through the Internet, email and other channels keep increasing. The main challenges for steganography methods are the undetectability and imperceptibility of the confidential data. This paper presents a steganography method in the frequency domain. The Haar wavelet transform is applied to decompose a grey-level cover image into four sub-bands. The secret image is hidden in the high-frequency HH sub-band after applying histogram modification followed by a scrambling process. The histogram modification scales the secret image to normalize its values, turning it from a bright image into a dark one; the secret image thus becomes invisible and can be hidden in the high-frequency sub-band. Scrambling the positions, first by rows and then by columns, gives the hiding process strong security. The experimental results demonstrate that the proposed method achieves superior performance both in quantitative measures (PSNR and correlation) and in visual quality. The proposed method gives good imperceptibility and responds well to various image attacks.
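The pipeline described above (Haar decomposition, darkening of the secret, row-then-column scrambling, embedding into HH) can be sketched as follows. This is a minimal illustration, not the paper's actual implementation: the embedding strength `alpha`, the darkening scale `dark` and the permutation seed `key` are assumed parameters chosen for the example.

```python
import numpy as np

def haar2d(x):
    """Single-level 2D Haar decomposition (averaging form)."""
    lo = (x[:, 0::2] + x[:, 1::2]) / 2.0   # horizontal low-pass
    hi = (x[:, 0::2] - x[:, 1::2]) / 2.0   # horizontal high-pass
    LL = (lo[0::2] + lo[1::2]) / 2.0
    LH = (lo[0::2] - lo[1::2]) / 2.0
    HL = (hi[0::2] + hi[1::2]) / 2.0
    HH = (hi[0::2] - hi[1::2]) / 2.0
    return LL, LH, HL, HH

def ihaar2d(LL, LH, HL, HH):
    """Exact inverse of haar2d (perfect reconstruction)."""
    lo = np.empty((LL.shape[0] * 2, LL.shape[1]))
    hi = np.empty_like(lo)
    lo[0::2], lo[1::2] = LL + LH, LL - LH
    hi[0::2], hi[1::2] = HL + HH, HL - HH
    x = np.empty((lo.shape[0], lo.shape[1] * 2))
    x[:, 0::2], x[:, 1::2] = lo + hi, lo - hi
    return x

def embed(cover, secret, key=0, alpha=0.1, dark=0.05):
    """Darken the secret, scramble rows then columns, add it into HH."""
    rng = np.random.default_rng(key)
    rows = rng.permutation(secret.shape[0])
    cols = rng.permutation(secret.shape[1])
    s = (secret / 255.0) * dark            # histogram scaling: bright -> dark
    s = s[rows][:, cols]                   # row then column scrambling
    LL, LH, HL, HH = haar2d(cover)
    return ihaar2d(LL, LH, HL, HH + alpha * s)
```

With a known cover and key, extraction reverses the steps: subtract the cover's HH band from the stego image's, divide by `alpha`, undo the permutations, and rescale.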
This research focuses on reinterpreting 2D seismic data, using suitable software (Petrel 2017), for the area between Al-Razzazah Lake and the Euphrates River in the Karbala'a and Al-Anbar Governorates, central Iraq. The sub-surface structural features were delineated, and the structures of the Najmah and Zubair Formations were evaluated. The structural interpretation showed that the studied area is affected by a normal fault trending NW-SE with a small displacement, while the time and depth maps showed monocline structures (nose structures) in the western part of the studied area.
In the last decade, the web has rapidly become an attractive platform and an indispensable part of our lives. Unfortunately, as our dependency on the web has increased, programmers have focused more on functionality and appearance than on security, which has attracted attackers to exploit serious security problems targeting web applications and web-based information systems, e.g. through SQL injection attacks. SQL injection, in simple terms, is the process of passing SQL code into interactive web applications that employ database services. Such applications accept user input, for example from a form, and then include this input in database requests, typically SQL statements, in a way that was not intended…
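The mechanism described above can be demonstrated in a few lines. This is a minimal sketch using Python's built-in `sqlite3` module and an assumed toy `users` table: the first lookup concatenates user input into the SQL string and is injectable, while the second uses a parameterized query and is not.

```python
import sqlite3

# Toy database for illustration (table and rows are assumptions).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.executemany("INSERT INTO users VALUES (?, ?)",
                 [("alice", "a1"), ("bob", "b2")])

def vulnerable_lookup(name):
    # UNSAFE: user input is concatenated directly into the SQL statement.
    q = "SELECT secret FROM users WHERE name = '" + name + "'"
    return conn.execute(q).fetchall()

def safe_lookup(name):
    # SAFE: the input is bound as a parameter, never parsed as SQL.
    return conn.execute("SELECT secret FROM users WHERE name = ?",
                        (name,)).fetchall()

payload = "' OR '1'='1"
print(vulnerable_lookup(payload))   # leaks every row in the table
print(safe_lookup(payload))         # returns nothing: no user has that name
```

The payload turns the vulnerable query into `... WHERE name = '' OR '1'='1'`, a condition that is true for every row; the parameterized version treats the whole payload as an ordinary (non-matching) string value.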
Data-centric techniques, such as data aggregation via a modified fuzzy clustering algorithm combined with a Voronoi diagram, called the modified Voronoi Fuzzy Clustering Algorithm (VFCA), are presented in this paper. In the modified algorithm, the sensed area is divided into a number of Voronoi cells by applying a Voronoi diagram, and these cells are clustered by the fuzzy C-means (FCM) method to reduce the transmission distance. An appropriate cluster head (CH) is then elected for each cluster. Three parameters are used in this election: the energy, the distance between the CH and its neighbouring sensors, and the packet-loss values. Furthermore, data aggregation is employed in each CH to reduce the amount of data transmitted…
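The two core steps, fuzzy C-means clustering and score-based CH election, can be sketched as below. This is an illustrative NumPy implementation of standard FCM, not the paper's VFCA; the election score and its weights are assumptions standing in for the three criteria named in the abstract.

```python
import numpy as np

def fcm(X, c, m=2.0, iters=100, seed=0):
    """Standard fuzzy C-means: returns centres and membership matrix U."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(iters):
        W = U ** m
        centres = (W.T @ X) / W.sum(axis=0)[:, None]          # weighted means
        d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2) + 1e-12
        U = d ** (-2.0 / (m - 1.0))                           # membership update
        U /= U.sum(axis=1, keepdims=True)
    return centres, U

def elect_ch(nodes, energy, centre, packet_loss):
    """Pick a cluster head: high energy, short distance, low packet loss.
    The weights (0.5, 2.0) are illustrative assumptions."""
    dist = np.linalg.norm(nodes - centre, axis=1)
    score = energy - 0.5 * dist - 2.0 * packet_loss
    return int(np.argmax(score))
```

Hard cluster assignments, when needed, are obtained with `U.argmax(axis=1)`.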
In this paper, the time spent on and the frequency of using social network sites (SNS) through Android applications are investigated. In this approach, we seek to raise awareness of, and limit but not eliminate, the repeated use of SNS by introducing AndroidTrack, an Android application designed to monitor usage and support valid experimental studies of the impact of social media on Iraqi users. Data generated by the app are aggregated and updated periodically in a Google Firebase Realtime Database. A statistical factor analysis (FA) of the users' interactions is presented as a result.
In this paper, we investigate the automatic recognition of emotion in text. We perform experiments with a new classification method based on the PPM character-based text compression scheme. These experiments involve both coarse-grained classification (whether a text is emotional or not) and fine-grained classification, such as recognising Ekman's six basic emotions (anger, disgust, fear, happiness, sadness, surprise). Experimental results on three datasets show that the new method significantly outperforms traditional word-based text classification methods. The results show that the PPM compression-based classification method is able to distinguish with high accuracy between emotional and non-emotional text, and between texts involving…
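Compression-based classification works by assigning a document to the class whose training text compresses it best. A minimal sketch is shown below; since no PPM coder ships with the Python standard library, zlib's deflate is used as a stand-in for the compression model, and the tiny corpora are invented for illustration.

```python
import zlib

def clen(s: str) -> int:
    """Compressed length of a string, in bytes."""
    return len(zlib.compress(s.encode("utf-8"), 9))

def classify(doc: str, corpora: dict) -> str:
    """Return the label whose corpus yields the smallest extra
    compressed length when the document is appended to it: an
    approximation of minimum cross-entropy classification."""
    return min(corpora,
               key=lambda lab: clen(corpora[lab] + " " + doc) - clen(corpora[lab]))
```

A PPM coder would replace `clen` with the code length under an adaptive character-level context model, but the decision rule stays the same.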
A seemingly unrelated regression (SUR) model is a special case of multivariate models in which the error terms of the equations are contemporaneously correlated. The generalized least squares (GLS) estimator is efficient because it takes the covariance structure of the errors into account, but it is also very sensitive to outliers. Robust SUR estimators can deal with outliers. We propose two robust methods for calculating the estimator: S-estimation and FastSUR. We find that they significantly improve the quality of the SUR model estimates. In addition, the results show that the FastSUR method outperforms the S method in dealing with outliers in the data set, as it has lower MSE and RMSE and higher R-squared and adjusted R-squared…
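For reference, the classical (non-robust) two-step feasible GLS estimator that the robust methods improve upon can be sketched as follows: equation-by-equation OLS first estimates the contemporaneous error covariance, and the stacked system is then solved by GLS. This is a textbook sketch, not the paper's robust procedure.

```python
import numpy as np

def sur_fgls(Xs, ys):
    """Two-step feasible GLS for a SUR system.
    Xs: list of (n x k_i) design matrices; ys: list of length-n responses."""
    n, g = len(ys[0]), len(ys)
    # Step 1: per-equation OLS residuals estimate the error covariance Sigma.
    ols = [np.linalg.lstsq(X, y, rcond=None)[0] for X, y in zip(Xs, ys)]
    E = np.column_stack([y - X @ b for X, y, b in zip(Xs, ys, ols)])
    Sigma = E.T @ E / n
    # Step 2: GLS on the stacked system with Omega = Sigma (kron) I_n.
    ks = [X.shape[1] for X in Xs]
    Xb = np.zeros((g * n, sum(ks)))
    r = c = 0
    for X, k in zip(Xs, ks):
        Xb[r:r + n, c:c + k] = X      # block-diagonal stacked design
        r += n
        c += k
    Oi = np.kron(np.linalg.inv(Sigma), np.eye(n))
    y = np.concatenate(ys)
    return np.linalg.solve(Xb.T @ Oi @ Xb, Xb.T @ Oi @ y)
```

The robust variants replace the OLS step (and the covariance estimate) with outlier-resistant counterparts while keeping this overall structure.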
This research studies models for paired (panel) data with mixed random parameters, which contain two types of parameters, one random and the other fixed. The random parameter arises from differences in the marginal slopes of the cross-sections, while the fixed parameter arises from differences in the fixed intercepts; the random errors of each section exhibit heteroscedasticity as well as first-order serial correlation. The main objective of this research is to use efficient methods suited to paired data with small samples, and to achieve this goal, the feasible generalized least squares…
The reaction of LAs-Cl8: [2,2-(1-(3,4-bis(carboxylicdichloromethoxy)-5-oxo-2,5-dihydrofuran-2-yl)ethane-1,2-diyl)bis(2,2-dichloroacetic acid)] with sodium azide in ethanol with a few drops of distilled water has been investigated. The new product L-AZ: (3Z,5Z,8Z)-2-azido-8-[azido(3Z,5Z)-2-azido-2,6-bis(azidocarbonyl)-8,9-dihydro-2H-1,7-dioxa-3,4,5-triazonine-9-yl]methyl]-9-[(1-azido-1-hydroxy)methyl]-2H-1,7-dioxa-3,4,5-triazonine-2,6-dicarbonylazide was isolated and characterized by elemental analysis (C.H.N), 1H-NMR, mass spectrometry and Fourier-transform infrared spectroscopy (FT-IR). The reaction of L-AZ with the metal ions M+n: [VO(II), Cr(III), Mn(II), Co(II), Ni(II), Cu(II), Zn(II), Cd(II) and Hg(II)] has been investigated…
This paper presents a numerical scheme for solving nonlinear time-fractional differential equations in the Caputo sense. The method relies on the Laplace transform combined with the modified Adomian decomposition method (LMADM), compared with the Laplace transform combined with the standard Adomian method (LADM). For comparison purposes, we applied both LMADM and LADM to nonlinear time-fractional differential equations to identify their differences and similarities. Finally, we provide two examples of nonlinear time-fractional differential equations, which show that the current scheme converges with high accuracy using only a small number of terms.
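To make the Adomian machinery concrete, the sketch below applies the standard (integer-order, non-Laplace) Adomian decomposition to the toy problem y' = y², y(0) = 1, whose exact solution is 1/(1 - t). This example and its nonlinearity are assumptions chosen for illustration; it demonstrates only the Adomian polynomial recursion, not the paper's Caputo-fractional LMADM scheme.

```python
import sympy as sp

t, lam = sp.symbols('t lam')
N_TERMS = 5

# y' = y**2, y(0) = 1.  Adomian splits y = sum(y_n) and expands the
# nonlinearity N(y) = y**2 into Adomian polynomials A_n.
y = [sp.Integer(1)]                      # y_0 = y(0)
for n in range(1, N_TERMS):
    s = sum(lam**k * y[k] for k in range(n))
    # A_{n-1} = (1/(n-1)!) d^{n-1}/dlam^{n-1} [ N(sum lam^k y_k) ] at lam=0
    A = sp.diff(s**2, lam, n - 1).subs(lam, 0) / sp.factorial(n - 1)
    y.append(sp.integrate(A, (t, 0, t)))  # y_n = integral of A_{n-1}

approx = sp.expand(sum(y))
print(approx)  # 1 + t + t**2 + t**3 + t**4: partial sum of 1/(1 - t)
```

Each term reproduces one term of the Taylor series of the exact solution, which is the sense in which the decomposition converges; the fractional scheme replaces the plain integral with a Caputo-compatible Laplace inversion.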
Data compression offers an attractive approach to reducing communication costs by using the available bandwidth effectively, so it makes sense to pursue research on algorithms that use the available network most effectively. It is also important to consider the security aspect, since the data being transmitted are vulnerable to attacks. The basic aim of this work is to develop a module that combines compression and encryption of the same data, performing the two operations simultaneously. This is achieved by embedding encryption into compression algorithms, since cryptographic ciphers and entropy coders bear a certain resemblance in the sense of secrecy. First, in the secure compression module, the given text is p…
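The compress-then-encrypt ordering (compression must come first, since ciphertext is incompressible) can be sketched as below. This is a didactic stand-in for the paper's combined module, using zlib plus a simple hash-based XOR keystream; the keystream construction is an illustration only and is not a secure cipher for production use, where an authenticated mode such as AES-GCM belongs instead.

```python
import zlib
import hashlib

def keystream(key: bytes, n: bytes) -> bytes:
    """n bytes of keystream from SHA-256 in counter mode (illustrative only)."""
    out = bytearray()
    ctr = 0
    while len(out) < n:
        out += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return bytes(out[:n])

def compress_encrypt(data: bytes, key: bytes) -> bytes:
    c = zlib.compress(data)                      # entropy coding first
    ks = keystream(key, len(c))
    return bytes(a ^ b for a, b in zip(c, ks))   # then keyed XOR

def decrypt_decompress(blob: bytes, key: bytes) -> bytes:
    ks = keystream(key, len(blob))
    c = bytes(a ^ b for a, b in zip(blob, ks))
    return zlib.decompress(c)
```

For redundant input the output is both smaller than the plaintext and unreadable without the key; decryption with the wrong key yields bytes that fail zlib's integrity checks.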