An approach for hiding information is proposed for securing information using the Slantlet transform and T-codes. Like the wavelet transform, the Slantlet transform provides better signal compression and better time localization than conventional transforms such as the discrete cosine transform (DCT). The proposed method provides efficient security because the original secret image is encrypted before embedding, building a robust system that no attacker can defeat. Well-known fidelity measures such as PSNR and AR were used to measure the quality of the stego-image and of the extracted image. The results show that the stego-image is closely related to the cover image, with a Peak Signal-to-Noise Ratio (PSNR) of about 55 dB. The recovered secret image is extracted completely (100%) if the stego-image suffers no attack. These methods provide good hiding capacity and image quality. Several types of attacks (compression, added noise, and cropping) were applied to the proposed methods to measure their robustness. The proposed algorithm was implemented in MATLAB version 7.9 under the Microsoft Windows 7 operating system.
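The Slantlet transform itself is not available in common libraries, but the fidelity measure the abstract relies on, PSNR, is standard. A minimal sketch of how a cover image and its stego version might be compared; the images here are synthetic stand-ins, and the bit-flipping "embedding" is purely illustrative:

```python
import numpy as np

def psnr(cover: np.ndarray, stego: np.ndarray, max_val: float = 255.0) -> float:
    """Peak Signal-to-Noise Ratio (dB) between two images of equal shape."""
    mse = np.mean((cover.astype(np.float64) - stego.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(max_val ** 2 / mse)

# Illustrative usage with synthetic data; real use would load the
# cover and stego images from files.
cover = np.random.randint(0, 256, (256, 256), dtype=np.uint8)
stego = cover.copy()
stego[::8, ::8] ^= 1  # flip a few least-significant bits, as embedding might
print(f"PSNR: {psnr(cover, stego):.2f} dB")  # high PSNR -> stego close to cover
```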
BACKGROUND: The development of coronary collaterals has long been considered an alternate (that is, collateral) source of blood supply to an area of the myocardium threatened with vascular ischemia or insufficiency. Hence, coronary collaterals are beneficial, but they can also promote harmful (adverse) effects, for instance the coronary steal effect during the myocardial hyperemia phase and restenosis following coronary angioplasty.
The traditional centralized network-management approach presents severe efficiency and scalability limitations in large-scale networks. Data collection and analysis typically involve huge transfers of management data to the manager, which consume considerable network throughput and create bottlenecks at the manager side. These problems are addressed here using agent technology to distribute the management functionality over the network elements. The proposed system consists of a server agent working together with client agents to monitor the log-on and log-off activity of client computers and which user is working on each machine. A file-system-watcher mechanism is used to indicate any change in files. The results were presented.
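The abstract's file-system-watcher mechanism suggests the .NET FileSystemWatcher class; the idea is easy to illustrate in any language, though. A minimal polling sketch under assumed conventions (the watched directory, interval, and reporting format are all placeholders, not the paper's implementation):

```python
import os
import time

def snapshot(path: str) -> dict:
    """Map each regular file in `path` to its last-modification time."""
    return {
        name: os.stat(os.path.join(path, name)).st_mtime
        for name in os.listdir(path)
        if os.path.isfile(os.path.join(path, name))
    }

def watch(path: str, interval: float = 2.0) -> None:
    """Poll `path` and report created, deleted, and modified files."""
    before = snapshot(path)
    while True:
        time.sleep(interval)
        after = snapshot(path)
        for name in after.keys() - before.keys():
            print(f"created:  {name}")
        for name in before.keys() - after.keys():
            print(f"deleted:  {name}")
        for name in after.keys() & before.keys():
            if after[name] != before[name]:
                print(f"modified: {name}")
        before = after

# watch("/path/to/monitor")  # hypothetical directory
```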
The investigation of signature validation is crucial to the field of personal authentication. Biometrics-based systems have been developed to support information-security features. A person's signature, an essential biometric trait of a human being, can be used to verify their identity. In this study, a mechanism for automatically verifying signatures is suggested. The study highlights the offline properties of handwritten signatures and aims to verify whether a handwritten signature is genuine or forged using computer-based machine-learning techniques. The main goal of developing such systems is to verify people through the validity of their signatures. In this research, images of a group of handwritten signatures were used.
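The abstract does not name the features or classifier used, so the following is only a sketch of one common pipeline for offline signature verification: HOG descriptors from scikit-image fed to an SVM from scikit-learn. The images and genuine/forged labels here are synthetic placeholders:

```python
import numpy as np
from skimage.feature import hog
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

# Synthetic stand-ins for grayscale signature images (real data would be
# loaded from files); label 1 = genuine, 0 = forged.
rng = np.random.default_rng(0)
images = rng.random((40, 64, 128))
labels = rng.integers(0, 2, 40)

# HOG descriptors capture stroke-orientation statistics, a common
# hand-crafted feature for offline signatures.
features = np.array([hog(img, pixels_per_cell=(8, 8)) for img in images])

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.25, random_state=0)

clf = SVC(kernel="rbf").fit(X_train, y_train)
print(f"test accuracy: {clf.score(X_test, y_test):.2f}")
```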
Recently, the theory of Complex Networks has given modern insight into a variety of applications in our lives. Complex Networks turn complex phenomena into graph-based models consisting of nodes and the edges connecting them. This representation can be analyzed using network metrics such as node degree, clustering coefficient, path length, closeness, betweenness, density, and diameter, to mention a few. The topology of the complex interconnections of power grids is considered one of the challenges faced in understanding and analyzing them; therefore, some countries use Complex Networks concepts to model their power-grid networks. In this work, the Iraqi Power Grid network (IPG) has been modeled, visualized, and analyzed.
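The IPG topology itself is not reproduced here, but the metrics the abstract lists are standard. A minimal sketch on a stand-in graph using networkx; the edge list is an assumption, not the real grid:

```python
import networkx as nx

# Stand-in for a power-grid topology; the real IPG edge list would be
# loaded from data. Nodes are stations, edges are transmission lines.
G = nx.Graph([(1, 2), (2, 3), (3, 4), (4, 1), (2, 4), (4, 5)])

print("degrees:               ", dict(G.degree()))
print("avg. clustering coeff.:", nx.average_clustering(G))
print("avg. shortest path:    ", nx.average_shortest_path_length(G))
print("closeness:             ", nx.closeness_centrality(G))
print("betweenness:           ", nx.betweenness_centrality(G))
print("density:               ", nx.density(G))
print("diameter:              ", nx.diameter(G))
```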
The main aim of image compression is to reduce an image's size so that it can be transmitted and stored; many methods have therefore appeared to compress images, one of which is the Multilayer Perceptron (MLP), an artificial neural network trained with the back-propagation algorithm. If the algorithm relied on the number of neurons in the hidden layer alone, that would not be enough to reach the desired results; the standards on which the compression process depends must also be taken into consideration to obtain the best results. In our research we trained on a group of TIFF images of size 256×256 and compressed each of them using the MLP.
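The paper's exact network layout is not given beyond a back-propagation-trained MLP, so this is only a sketch of the bottleneck idea using scikit-learn: flattened 8×8 image blocks are reconstructed through a hidden layer smaller than the input, whose activations serve as the compressed code. Block size, neuron count, and the random image are assumptions:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Stand-in for a 256x256 grayscale TIFF image (real use would load one).
rng = np.random.default_rng(0)
image = rng.random((256, 256))

# Split the image into flattened 8x8 blocks: each block is one sample.
blocks = (image.reshape(32, 8, 32, 8)
               .transpose(0, 2, 1, 3)
               .reshape(-1, 64))

# Autoencoder-style MLP: the hidden layer (16 neurons) is smaller than the
# 64-pixel input, so its activations act as the compressed representation.
mlp = MLPRegressor(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
mlp.fit(blocks, blocks)  # back-propagation learns to reconstruct the input

reconstructed = mlp.predict(blocks)
print(f"reconstruction MSE: {np.mean((blocks - reconstructed) ** 2):.5f}")
```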
Image Fusion Using A Convolutional Neural Network
This paper explores VANET topics: architecture, characteristics, security, routing protocols, applications, simulators, and 5G integration. We update, edit, and summarize some of the published data as we analyze each concept. For ease of comprehension and clarity, we present part of the data as tables and figures. This survey also raises issues for potential future research, such as how to integrate VANETs with a 5G cellular network and how to use trust mechanisms to enhance security, scalability, effectiveness, and other VANET features and services. In short, this review may aid academics and developers in choosing, from a single document, the key VANET characteristics for their objectives.
In this research, the Artificial Neural Network (ANN) technique was applied to study the filtration process in water treatment. Eight models were developed and tested using data from a pilot filtration plant working under different process design criteria: influent turbidity, bed depth, grain size, filtration rate, and running time (length of the filtration run), with effluent turbidity and head losses recorded. The ANN models were constructed to predict different performance criteria of the filtration process: effluent turbidity, head losses, and running time. The results indicate that it is quite possible to use artificial neural networks to predict effluent turbidity, head losses, and running time in the filtration process with good accuracy.
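The pilot-plant data are not reproduced here, so the following is only a sketch of the modeling idea: an MLP regressor predicting effluent turbidity from the five inputs the abstract names. The data, target relationship, and network size are all placeholders:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Synthetic stand-ins for pilot-plant records; columns follow the abstract:
# influent turbidity, bed depth, grain size, filtration rate, running time.
rng = np.random.default_rng(0)
X = rng.random((200, 5))
# Placeholder target (effluent turbidity); real values come from the plant.
y = 0.3 * X[:, 0] - 0.1 * X[:, 1] + 0.05 * rng.standard_normal(200)

model = make_pipeline(
    StandardScaler(),  # scale inputs so back-propagation trains stably
    MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0),
)
model.fit(X, y)
print("predicted effluent turbidity:", model.predict(X[:3]))
```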