Information hiding strategies have recently gained popularity in a variety of fields. Digital audio, video, and images are increasingly being labelled with distinct but undetectable marks that may contain a hidden copyright notice or serial number, or even directly help to prevent unauthorized duplication. This approach is extended to medical images by hiding secret information in them using the structure of a different file format. The hidden information may be related to the patient. In this paper, a method for hiding secret information in DICOM images is proposed based on the Discrete Wavelet Transform (DWT). First, all slices of a 3D image are segmented into blocks of a specific size and the host image is selected according to a generated key; second, the block number and slice number are selected; third, the low-high (LH) band is used for embedding after adding the generated number; fourth, the Hessenberg transform is applied to the blocks that partition the LH band into a specific size. The secret information (image or text) is a binary value. It is embedded by setting the positive value on the diagonal to an odd value if the secret bit is one and to an even value if the secret bit is zero. Several tests were applied, such as mean square error (MSE), peak signal-to-noise ratio (PSNR), and structural similarity index measure (SSIM). Robustness analyses such as noise addition, scaling, and rotation were also applied to test efficiency. The results of the tests showed the strength of the proposed method.
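The parity-based embedding rule described above can be sketched as follows. This is an illustrative sketch only, assuming a simple round-and-nudge rule on a positive diagonal coefficient; the paper's exact rounding and restoration details may differ.

```python
def embed_bit(diag_value: float, secret_bit: int) -> float:
    """Embed one secret bit by forcing the parity of a positive
    diagonal coefficient: odd for bit 1, even for bit 0."""
    v = int(round(abs(diag_value)))
    if v % 2 != secret_bit:
        v += 1  # nudge the coefficient to the required parity
    return float(v)

def extract_bit(diag_value: float) -> int:
    """Recover the hidden bit from the coefficient's parity."""
    return int(round(abs(diag_value))) % 2
```

Extraction then needs no side information beyond the block and slice selection key, since the bit is carried entirely by the coefficient's parity.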
The smart city concept has attracted high research attention in recent years within diverse application domains, such as crime suspect identification, border security, transportation, aerospace, and so on. Specific focus has been on increased automation using data-driven approaches, while leveraging remote sensing and real-time streaming of heterogeneous data from various resources, including unmanned aerial vehicles, surveillance cameras, and low-earth-orbit satellites. One of the core challenges in the exploitation of such high-temporal data streams, specifically videos, is the trade-off between the quality of video streaming and limited transmission bandwidth. An optimal compromise is needed between video quality and subsequently, rec
This paper uses Artificial Intelligence (AI) based algorithm analysis to classify breast cancer deoxyribonucleic acid (DNA). The main idea is to focus on the application of machine and deep learning techniques. Furthermore, a genetic algorithm is used to diagnose gene expression to reduce the number of misclassified cancers. After patients' genetic data are entered, processing operations that require filling the missing values using different techniques are used. The best data for the classification process are chosen by combining each technique using the genetic algorithm and comparing them in terms of accuracy.
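The genetic-algorithm selection step mentioned above can be illustrated with a toy feature-selection GA, where each individual is a bit-mask over features and the fitness function stands in for classifier accuracy. This is a generic sketch under those assumptions, not the paper's exact GA; the operators (elitist selection, one-point crossover, point mutation) and parameters are hypothetical.

```python
import random

def ga_feature_select(fitness, n_features, pop_size=20, generations=30, seed=0):
    """Toy genetic algorithm for feature selection: individuals are
    bit-masks over features; `fitness` scores a mask (e.g. validation
    accuracy of a classifier trained on the selected columns)."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_features)]
           for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        parents = scored[:pop_size // 2]          # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_features)    # one-point crossover
            child = a[:cut] + b[cut:]
            child[rng.randrange(n_features)] ^= 1 # point mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)
```

In practice the fitness call would wrap a cross-validated classifier, so the GA trades extra training runs for a smaller, better-performing gene subset.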
A distributed denial-of-service (DDoS) attack is a threat to network security that aims to exhaust networks with malicious traffic. Although several techniques have been designed for DDoS attack detection, the intrusion detection system (IDS) plays a great role in protecting the network system and has the ability to collect and analyze data from various network sources to discover any unauthorized access. The goal of IDS is to detect malicious traffic and defend the system against any fraudulent activity or illegal traffic. Therefore, IDS monitors outgoing and incoming network traffic. This paper presents an intrusion detection system for DDoS attacks that has the ability to detect the attack intelligently, dynami
Dust storm phenomena take place in barren and dry regions all over the world. They may be caused by intense ground winds which excite the dust and sand from soft, arid land surfaces, causing it to rise into the air. These phenomena may cause harmful influences upon health, climate, infrastructure, and transportation. GIS and remote sensing have played a key role in studying dust detection. This study was conducted in Iraq with the objective of validating dust detection. These techniques have been used to derive dust indices using the Normalized Difference Dust Index (NDDI) and the Middle East Dust Index (MEDI), which are based on images from MODIS and in-situ observation based on hourly wi
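The NDDI mentioned above is commonly computed as a normalized difference between a MODIS shortwave-infrared reflectance (band 7, ~2.13 µm) and a blue reflectance (band 3, ~0.469 µm); a minimal sketch, assuming that band convention:

```python
import numpy as np

def nddi(swir_b7: np.ndarray, blue_b3: np.ndarray) -> np.ndarray:
    """Normalized Difference Dust Index from MODIS reflectances:
    NDDI = (B7 - B3) / (B7 + B3). Higher values suggest airborne dust,
    since dust raises SWIR reflectance relative to the blue band."""
    num = swir_b7 - blue_b3
    den = swir_b7 + blue_b3
    # guard against division by zero over dark pixels
    return np.where(den != 0, num / den, 0.0)
```

A per-scene threshold on the resulting index is then typically compared against the in-situ observations for validation.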
Storing, transferring, and processing high-dimensional electroencephalogram (EEG) signals is a critical challenge. The goal of EEG compression is to remove redundant data in EEG signals. Medical signals like EEG must be of high quality for medical diagnosis. This paper uses a compression system with near-zero Mean Squared Error (MSE) based on the Discrete Cosine Transform (DCT) and double shift coding for fast and efficient EEG data compression. This paper investigates and compares the use or non-use of delta modulation, which is applied to the transformed and quantized input signal. Double shift coding is applied after mapping the output to positive as a final step. The system performance is tested using EEG data files from the C
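The delta-modulation and map-to-positive steps described above can be sketched as follows. This is an illustrative sketch only: the zigzag signed-to-unsigned mapping shown here is an assumption standing in for the paper's "mapping to positive" step, and the double shift coder itself is omitted.

```python
import numpy as np

def delta_encode(x: np.ndarray) -> np.ndarray:
    """Delta modulation of a quantized signal: keep the first sample,
    then store successive differences (typically small for EEG)."""
    d = np.empty_like(x)
    d[0] = x[0]
    d[1:] = x[1:] - x[:-1]
    return d

def map_to_positive(d: np.ndarray) -> np.ndarray:
    """Zigzag mapping of signed deltas to non-negative integers
    (0, -1, 1, -2, 2, ... -> 0, 1, 2, 3, 4, ...), so a shift-based
    entropy coder only ever sees non-negative symbols."""
    return np.where(d >= 0, 2 * d, -2 * d - 1)
```

Because neighbouring EEG samples are strongly correlated, the deltas concentrate near zero, which is exactly the symbol distribution a shift coder exploits.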
The gravity and magnetic data of the Tikrit-Kirkuk area in central Iraq were considered to study the tectonic situation in the area. The residual anomalies were separated from the regional ones using the space-windows method with windows of about 24, 12, and 10 km to delineate the source level of the residual anomalies. The Total Horizontal Derivative (THD) was used to identify the fault trends in the basement and sedimentary rocks depending upon the gravity and magnetic data. The identified faults in the study area show (NW-SE), less common (NE-SW), and rare (N-S) trends. Some of these faults extend from the basement to the uppermost layer of the sedimentary rocks. It was found that the depth of some gravity and magnetic sources ranges 12-13 km, which confirm th
It is noticeable that the initialization of architectural parameters has a great impact on the whole learnability stream, so that knowing the mathematical properties of a dataset allows a neural network architecture to be given better expressivity and capacity. In this paper, five random samples of the Volve field dataset were taken. A training set was then specified and the persistent homology of the dataset was calculated to show the impact of data complexity on the selection of a multilayer perceptron regressor (MLPR) architecture. The proposed method provides a well-rounded strategy to compute data complexity. Our method is a compound algorithm composed of the t-SNE method, the alpha-complexity algorithm, and a persistence barcod
... Show MoreIn this paper a hybrid system was designed for securing transformed or stored text messages(Arabic and english) by embedding the message in a colored image as a cover file depending on LSB (Least Significant Bit) algorithm in a dispersed way and employing Hill data encryption algorithm for encrypt message before being hidden, A key of 3x3 was used for encryption with inverse for decryption, The system scores a good result for PSNR rate ( 75-86) that differentiates according to length of message and image resolution
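The 3x3 Hill encryption step can be sketched as below. The key matrix shown is a textbook example, not the paper's key; any 3x3 matrix whose determinant is invertible modulo 26 works, and decryption uses the matrix inverse modulo 26.

```python
import numpy as np

# Hypothetical key; det(KEY) must be invertible mod 26.
KEY = np.array([[6, 24, 1],
                [13, 16, 10],
                [20, 17, 15]])

def hill_encrypt(plain: str, key: np.ndarray) -> str:
    """Hill cipher over A-Z: split the text into 3-letter blocks and
    multiply each block vector by the key matrix modulo 26."""
    nums = [ord(c) - 65 for c in plain.upper() if c.isalpha()]
    while len(nums) % 3:
        nums.append(ord('X') - 65)  # pad the final block with 'X'
    out = []
    for i in range(0, len(nums), 3):
        block = np.array(nums[i:i + 3])
        out.extend((key @ block) % 26)
    return ''.join(chr(65 + int(n)) for n in out)
```

The ciphertext, converted to bits, is what the dispersed LSB step then scatters across the cover image's least significant bits.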
The digital revolution has greatly affected the methods through which we communicate, starting from the basic concepts of internet technology and web content, in addition to the important issues that concern the culture of digital media, internet governance, and the variation in the digital age in general, and graphic and internal design in particular.
This research addresses an important topic that goes along with scientific development in the field of digital design, especially internal and graphic design. This study consists of two sections: the first includes the problem of the study and the need for it. Starting from the problem of the research, there is no clear perception of the formal characte