Generally, sending secret information over a transmission channel or any other carrier medium is not secure; for this reason, information-hiding techniques are needed, and steganography must take place before transmission. In this paper, a developed particle swarm optimization algorithm (Dev.-PSO) is used to embed a secret message at optimal positions of the cover image in the spatial domain using Least Significant Bit (LSB) substitution. The main aim of the Dev.-PSO algorithm is to determine optimal paths to the required goals in the specified search space; using Dev.-PSO produces these paths efficiently and quickly. A population of agents is used to locate the required goals in the search space when solving the problem. The Dev.-PSO algorithm is applied to three different images, and the Peak Signal-to-Noise Ratio (PSNR) is computed for each. The PSNR of stego-A obtained from the blue color sub-band equals 44.87 dB, stego-B equals 44.45 dB, and stego-C equals 43.97 dB, while the MSE values obtained from the same color sub-band are 0.00989 for stego-A, 0.01869 for stego-B, and 0.02041 for stego-C. Furthermore, the proposed method preserves the quality of the stego image before and after the hiding stage, even under the intentional attacks used in this paper, such as Gaussian noise and salt & pepper noise.
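The abstract does not give the Dev.-PSO details, but the LSB-substitution embedding and the PSNR/MSE figures it reports can be illustrated with a minimal sketch. The function names and the fixed raster-order embedding positions below are assumptions for illustration only; the actual method selects embedding positions with Dev.-PSO.

```python
import numpy as np

def embed_lsb(cover, bits):
    """Embed a bit sequence into the least significant bits of a channel.

    `cover` is a 2-D uint8 array (e.g. the blue sub-band); `bits` is an
    iterable of 0/1 values. Positions are taken in raster order here,
    whereas the paper selects them with the Dev.-PSO algorithm.
    """
    stego = cover.copy().ravel()
    for i, b in enumerate(bits):
        stego[i] = (stego[i] & 0xFE) | b   # replace only the LSB
    return stego.reshape(cover.shape)

def mse_psnr(cover, stego):
    """Mean squared error and peak signal-to-noise ratio for 8-bit images."""
    mse = np.mean((cover.astype(np.float64) - stego.astype(np.float64)) ** 2)
    psnr = 10 * np.log10(255.0 ** 2 / mse) if mse > 0 else float("inf")
    return mse, psnr

# toy usage: hide the bits of one byte in a random 8x8 "blue sub-band"
cover = np.random.randint(0, 256, (8, 8), dtype=np.uint8)
message_bits = [int(b) for b in format(ord("A"), "08b")]
stego = embed_lsb(cover, message_bits)
print(mse_psnr(cover, stego))
```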
Semantic segmentation realization and understanding is a stringent task not just for computer vision but also in earth-science research. Semantic segmentation decomposes compound architectures into single elements: the most common objects in civil outdoor or indoor scenes must be classified and then enriched with the semantic meaning of each object. It is a method for labeling and clustering point clouds automatically. Three-dimensional natural-scene classification requires a point cloud dataset as the input data representation, and many challenges arise when working with 3D data, such as the small number, resolution, and accuracy of three-dimensional datasets. Deep learning is now the po
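As a concrete illustration of the input format the abstract refers to, a labeled point cloud can be held as an N×3 array of coordinates plus one semantic label per point. The array sizes and class names below are assumptions, not taken from any particular benchmark.

```python
import numpy as np

# a minimal sketch of a semantically labeled point cloud:
# each point has xyz coordinates and one class index
CLASS_NAMES = ["ground", "building", "vegetation", "other"]  # assumed classes

num_points = 1000
points = np.random.rand(num_points, 3).astype(np.float32)    # (N, 3) xyz coordinates
labels = np.random.randint(0, len(CLASS_NAMES), num_points)  # (N,) per-point label

# semantic segmentation means predicting `labels` from `points`;
# here we only group the points by their (ground-truth) class
for c, name in enumerate(CLASS_NAMES):
    print(name, int((labels == c).sum()), "points")
```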
In this study, a structural damage identification method based on changes in the dynamic characteristics (frequencies) of the structure is examined. The stiffness and mass matrices of the curved (in-plane and out-of-plane vibration) beam elements are formulated using Hamilton's principle. Each node possesses seven degrees of freedom, including the warping degree of freedom. The curved beam element is derived based on Kang and Yoo's thin-walled curved beam theory of 1994. A computer program was developed to carry out free vibration analyses of the curved beam as well as the straight beam, and the resulting frequencies are compared with those of other researchers using the general-purpose program MATLAB. Fuzzy logic syste
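The free-vibration frequencies that the damage-identification method tracks come from the generalized eigenvalue problem K φ = ω² M φ assembled from the element stiffness and mass matrices. The 2-DOF spring-mass matrices below are only placeholders standing in for the 7-DOF-per-node curved beam elements of the paper; the sketch shows how a stiffness loss shifts the natural frequencies.

```python
import numpy as np
from scipy.linalg import eigh

def natural_frequencies(K, M):
    """Solve K*phi = omega^2 * M*phi and return frequencies in Hz."""
    eigvals, _ = eigh(K, M)                    # generalized symmetric eigenproblem
    omegas = np.sqrt(np.clip(eigvals, 0.0, None))
    return omegas / (2.0 * np.pi)

# placeholder 2-DOF system (the paper's elements carry 7 DOF per node,
# including warping, and are assembled from Hamilton's principle)
k, m = 1.0e6, 10.0
K = np.array([[2 * k, -k], [-k, k]])
M = np.diag([m, m])

healthy = natural_frequencies(K, M)
damaged = natural_frequencies(0.9 * K, M)      # a stiffness loss lowers the frequencies
print("frequency shift (Hz):", healthy - damaged)
```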
The objective of this study is to apply an Artificial Neural Network (ANN) to the heat transfer analysis of shell-and-tube heat exchangers widely used in power plants and refineries. Practical data were obtained from an industrial heat exchanger operating in the power generation department of the Dura refinery. The commonly used Back Propagation (BP) algorithm was used to train and test the networks, with the data divided into three samples (training, validation, and testing data) to bring the model closer to the actual case. Inputs of the neural network include inlet water temperature, inlet air temperature, and mass flow rate of air. Two outputs (exit water temperature to the cooling tower and exit air temperature to the second stage of the air compressor) were taken from the ANN.
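A minimal sketch of such a network is shown below: three inputs, two outputs, trained with backpropagation via scikit-learn's MLPRegressor. The synthetic data, value ranges, and network size are assumptions; the study uses measured Dura-refinery data and its own train/validation/test split.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# synthetic stand-in for the measured data:
# inputs  = [inlet water temp, inlet air temp, air mass flow rate]
# outputs = [exit water temp, exit air temp]
X = rng.uniform([20, 25, 1.0], [40, 45, 5.0], size=(500, 3))
y = np.column_stack([
    X[:, 0] + 0.8 * X[:, 2] + rng.normal(0, 0.2, 500),   # assumed exit water temp relation
    X[:, 1] - 0.5 * X[:, 2] + rng.normal(0, 0.2, 500),   # assumed exit air temp relation
])

# hold out validation and test sets, mirroring the paper's three-sample split
X_train, X_tmp, y_train, y_tmp = train_test_split(X, y, test_size=0.3, random_state=0)
X_val, X_test, y_val, y_test = train_test_split(X_tmp, y_tmp, test_size=0.5, random_state=0)

model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
model.fit(X_train, y_train)                 # trained with backpropagation
print("validation R^2:", model.score(X_val, y_val))
print("test R^2:", model.score(X_test, y_test))
```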
In this paper, we present a multiple-bit error correction coding scheme based on an extended Hamming product code combined with type-II HARQ using shared resources for on-chip interconnect. The shared resources reduce the hardware complexity of the encoder and decoder compared to the existing three-stage iterative decoding method for on-chip interconnects. The proposed decoding method achieves 20% and 28% reductions in area and power consumption, respectively, with only a small increase in decoder delay compared to the existing three-stage iterative decoding scheme for multiple-bit error correction. The proposed code also achieves an excellent improvement in residual flit error rate and up to 58% of total power consumption compared to the other err
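The extended Hamming product code the scheme builds on protects data with an extended Hamming code (SEC-DED) along rows and columns of a data matrix. Below is a minimal software sketch of the extended Hamming(8,4) building block only; it is an illustration of the code family, not the paper's hardware encoder/decoder or its HARQ resource sharing, and the flit partitioning into nibbles is assumed.

```python
def hamming_8_4_encode(d):
    """Encode 4 data bits with extended Hamming(8,4) (SEC-DED).

    d is a list [d1, d2, d3, d4] of 0/1 values; the returned codeword is
    [p1, p2, d1, p3, d2, d3, d4, p_overall].
    """
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    code7 = [p1, p2, d1, p3, d2, d3, d4]
    overall = 0
    for bit in code7:
        overall ^= bit        # extra parity bit enables double-error detection
    return code7 + [overall]

# one row of a product code: each 4-bit nibble of a flit gets its own codeword
flit_nibbles = [[1, 0, 1, 1], [0, 1, 1, 0]]
print([hamming_8_4_encode(n) for n in flit_nibbles])
```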
Audio classification is the process of classifying different audio types according to their contents. It is implemented in a large variety of real-world problems, and all classification applications treat the target subjects as a specific type of audio; hence, there is a variety of audio types, and every type has to be treated carefully according to its significant properties. Feature extraction is an important process for audio classification. This work introduces several sets of features according to the type; two types of audio (datasets) were studied. Two different feature sets are proposed: (i) a first-order gradient feature vector, and (ii) a local roughness feature vector. The experiments showed that the results are competitive to
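The exact feature definitions are not spelled out in the excerpt; a plausible reading of a "first-order gradient feature vector" is a set of summary statistics of the first difference of the signal, sketched below under that assumption.

```python
import numpy as np

def first_order_gradient_features(signal):
    """Summary statistics of the first-order difference of an audio signal.

    A hedged reading of the paper's "first-order gradient feature vector";
    the exact definition is not given in the excerpt.
    """
    grad = np.diff(signal.astype(np.float64))
    return np.array([grad.mean(), grad.std(),
                     np.abs(grad).mean(), np.abs(grad).max()])

# toy usage on a synthetic 1-second, 8 kHz tone with noise
sr = 8000
t = np.arange(sr) / sr
audio = np.sin(2 * np.pi * 440 * t) + 0.05 * np.random.randn(sr)
print(first_order_gradient_features(audio))
```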
A watermarking operation can be defined as the process of embedding special, wanted, and reversible information in important secure files to protect the ownership or information of the cover file, here based on the proposed singular value decomposition (SVD) watermark. The proposed digital watermark method has a very large domain for constructing the final number, which protects the watermark from conflict. The cover file is the important image that needs to be protected. The hidden watermark is a unique number extracted from the cover file by performing the proposed related and successive operations, starting by dividing the original image into four parts of unequal size. Each of these four parts is treated as a separate matrix, and applying SVD
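The excerpt stops before the SVD step, but the operation it leads up to, splitting the cover image into parts and taking singular values of each part, can be sketched as follows. The equal four-way split and the use of only the largest singular value are simplifications for illustration; the paper divides the image into four parts of unequal size and has its own rule for building the final watermark number.

```python
import numpy as np

def block_singular_values(image):
    """Split an image into four quadrants and return each block's largest
    singular value (a simple stand-in for the paper's unequal split and
    its rule for forming the final watermark number)."""
    h, w = image.shape
    blocks = [image[:h // 2, :w // 2], image[:h // 2, w // 2:],
              image[h // 2:, :w // 2], image[h // 2:, w // 2:]]
    return [float(np.linalg.svd(b.astype(np.float64), compute_uv=False)[0])
            for b in blocks]

# toy cover image
cover = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
watermark_components = block_singular_values(cover)
print(watermark_components)
```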
The purpose of this research is to define the main factors influencing the decisions of a management system on sensitive data in the cloud. A framework is proposed to enhance management information system decisions on sensitive information in a cloud environment. Structured interviews with several security experts working on cloud computing security were conducted to investigate the main objective of the framework and the suitability of the instrument, and a pilot study was conducted to test the instrument. The validity and reliability test results show that the study can be expanded and lead to final framework validation. This framework uses multiple levels related to Authorization, Authentication, Classification and identity anonymity, and save and verify, to enhance management