The massive growth and spread of digital images, together with easy-to-use editing software, has led to widespread unauthorized use. Digital watermarking has therefore been developed as a means of image authentication to address these issues. In this paper we present a method built around an embedding stage and an extraction stage. Our approach combines the Discrete Wavelet Transform (DWT) with the Discrete Cosine Transform (DCT), exploiting the fact that combining the two transforms reduces the drawbacks each one alone introduces in the quality of the recovered watermark or of the watermarked image, which results in an effective rounding method. Embedding is achieved by modifying the wavelet coefficients of a selected DWT sub-band (HL or HH) and then applying the DCT to that sub-band's coefficients. The method targets the invisibility of the embedded watermark bits and the quality of the watermarked image, as well as the subjective quality of the watermark recovered after the extraction stage. The proposed method was evaluated using standard image-quality metrics, illustrated in the results, and it was found to provide good objective quality: the watermark is extracted successfully and the quality of the recovered watermark survives.
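The coefficient-modification step described above can be illustrated, independently of the specific DWT/DCT pipeline, with quantization index modulation (QIM), a standard way of hiding a bit in a transform coefficient; the step size `delta` and the coefficient values below are illustrative assumptions, not the paper's parameters.

```python
def embed_bit(coeff, bit, delta=8.0):
    """Embed one bit in a transform coefficient by snapping it onto one of
    two interleaved quantization lattices (quantization index modulation)."""
    q = round(coeff / delta)
    # force the parity of the quantizer index to match the bit
    if q % 2 != bit:
        q += 1 if coeff / delta - q > 0 else -1
    return q * delta

def extract_bit(coeff, delta=8.0):
    """Recover the bit from the parity of the nearest quantizer index."""
    return int(round(coeff / delta)) % 2

coeffs = [12.3, -7.1, 25.6, 3.4]   # stand-ins for DCT(DWT sub-band) coefficients
bits = [1, 0, 1, 1]
marked = [embed_bit(c, b) for c, b in zip(coeffs, bits)]
recovered = [extract_bit(c) for c in marked]
```

The scheme survives any distortion smaller than `delta / 2` per coefficient, which is why the step size trades invisibility against robustness.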
In this work we are interested in a general solution for the calculation of the image of a single bar under partially coherent illumination. The solution is based on Hopkins' theory of image formation in optical instruments, in which it was shown that, for all practical cases, the illumination of the object may be treated as due to a self-luminous source placed at the exit pupil of the condenser. The diffraction integral describing the intensity distribution in the image of a single bar, taken as an object of half-width U0 = 8 with circular aperture geometry, is examined; by a suitable choice of the coherence parameter (S = 0.25, 1.0, 4.0) it can be fitted to the distribution observed in various types of mi
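The diffraction integral referred to above has, in Hopkins' theory, the standard one-dimensional partially coherent imaging form (conventional notation, not necessarily the paper's):

```latex
I(u) \;=\; \iint J_0(x_1 - x_2)\, K(u - x_1)\, K^{*}(u - x_2)\, F(x_1)\, F^{*}(x_2)\,\mathrm{d}x_1\,\mathrm{d}x_2
```

where \(J_0\) is the mutual intensity of the illumination set by the effective source at the condenser exit pupil, \(K\) is the amplitude point-spread function of the imaging system, and \(F\) is the complex transmission of the object (here a bar of half-width \(U_0\)). The coherence parameter \(S\) is the ratio of condenser to objective numerical apertures: \(S \to 0\) gives fully coherent and \(S \to \infty\) fully incoherent imaging, with the quoted values spanning the practically important intermediate regime.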
Image compression is one of the data compression types applied to digital images in order to reduce their high cost of storage and/or transmission. Image compression algorithms can exploit the visual sensitivity and statistical properties of image data to deliver results superior to those of generic data compression schemes used for other digital data. In the first approach, the input image is divided into blocks of 16 x 16, 32 x 32, or 64 x 64 pixels. The blocks are first flattened into a string and then encoded with a lossless entropy-coding algorithm known as arithmetic coding. Pixel values that occur more often are coded in fewer bits than pixel values that occur less often.
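As a toy illustration of the entropy-coding step, interval subdivision in proportion to symbol frequency, a floating-point arithmetic coder for short strings might look like the sketch below; production coders use renormalized integer arithmetic, and the block-scanning order is not shown. The sample message stands in for a flattened pixel block.

```python
from collections import Counter

def build_model(data):
    """Assign each symbol a probability interval, ordered for determinism."""
    freq = Counter(data)
    total = sum(freq.values())
    intervals, low = {}, 0.0
    for sym in sorted(freq):
        p = freq[sym] / total
        intervals[sym] = (low, low + p)
        low += p
    return intervals

def encode(data, intervals):
    """Narrow [lo, hi) by each symbol's interval; return a number inside it."""
    lo, hi = 0.0, 1.0
    for sym in data:
        span = hi - lo
        s_lo, s_hi = intervals[sym]
        lo, hi = lo + span * s_lo, lo + span * s_hi
    return (lo + hi) / 2

def decode(code, intervals, length):
    """Replay the subdivision, picking the symbol whose interval holds the code."""
    out, lo, hi = [], 0.0, 1.0
    for _ in range(length):
        span = hi - lo
        value = (code - lo) / span
        for sym, (s_lo, s_hi) in intervals.items():
            if s_lo <= value < s_hi:
                out.append(sym)
                lo, hi = lo + span * s_lo, lo + span * s_hi
                break
    return "".join(out)

message = "ABRACADABRA"
model = build_model(message)
code = encode(message, model)
assert decode(code, model, len(message)) == message
```

Frequent symbols shrink the interval less, so they consume fewer bits of the final code value, which is exactly the property the abstract relies on.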
The aim of our study is to reveal the effect of steel reinforcement details, tensile steel reinforcement ratio, compression steel reinforcement ratio, reinforcing bar size, and corner joint shape on the strength Fc' of reinforced concrete, and to examine in depth the most accurate detailing and concrete connections governing the behavior and resistance of reinforced concrete corner joints. Based on the available studies and sources, in addition to our own study, we concluded that each of these factors plays a clear role in the behavior and resistance of the reinforced concrete corner joint under negative moment and yield stress. A study of the types of faults that reinforced angle joints can exhibit yields details and conditions of c
Flexible joint robot (FJR) manipulators offer many attractive features over rigid manipulators, including light weight, safe operation, and high power efficiency. However, tracking control of the FJR is challenging due to its inherent problems, such as underactuation, coupling, nonlinearities, uncertainties, and unknown external disturbances. In this article, a terminal sliding mode control (TSMC) is proposed for the FJR system to guarantee finite-time convergence of the system's output and to achieve total robustness against the lumped disturbance and estimation error. Using two coordinate transformations, the FJR dynamics is turned into a canonical form. A cascaded finite-time sliding mode observer (CFTSMO) is constructed
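The sliding-mode idea underlying the abstract, drive the state onto a sliding surface with a switching term, then slide to the origin, can be sketched on a double integrator; this is conventional SMC with a boundary layer, a deliberately simplified stand-in for the paper's TSMC/CFTSMO design, and the plant, gains, and time step are illustrative assumptions.

```python
def sat(x):
    """Saturation in place of sign() to limit chattering."""
    return max(-1.0, min(1.0, x))

# double-integrator plant x'' = u, regulated to the origin
lam, K, phi = 2.0, 5.0, 0.05   # surface slope, switching gain, boundary-layer width
x, dx, dt = 1.0, 0.0, 1e-3     # initial error, initial rate, Euler step
for _ in range(5000):          # 5 s of semi-implicit Euler integration
    s = dx + lam * x                      # linear sliding surface s = e' + lam*e
    u = -lam * dx - K * sat(s / phi)      # equivalent control + switching term
    dx += u * dt
    x += dx * dt
```

Outside the boundary layer the surface variable decays at the fixed rate `K`; once `|s| < phi` the error follows `e' = -lam*e` and converges exponentially, so after 5 s the state is essentially at the origin.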
The objective of this work is to combine human biometric characteristics with unique attributes of the computer in order to protect computer networks and resource environments through the development of authentication and authorization techniques. On the human biometric side, the best available methods and algorithms were studied, and the conclusion is that the fingerprint is the best, although it has some flaws. The fingerprint algorithm has been improved so that its performance enhances the clarity of the ridge and valley structures of fingerprint images, taking into account the estimation of the orientation and frequency of the nearby ridges. On the computer side, a computer and its components, like a human, have unique
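The ridge-orientation estimate that such enhancement relies on is commonly computed from image gradients via the least-squares (structure-tensor) formula theta = 0.5 * atan2(2*Gxy, Gxx - Gyy); the sketch below applies that standard formula, not necessarily the paper's exact variant, to a synthetic patch of vertical ridges.

```python
from math import atan2, pi, sin

def block_orientation(block):
    """Least-squares dominant-gradient orientation of one image block,
    from central-difference gradients (standard structure-tensor estimate)."""
    h, w = len(block), len(block[0])
    gxx = gyy = gxy = 0.0
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = (block[y][x + 1] - block[y][x - 1]) / 2.0
            gy = (block[y + 1][x] - block[y - 1][x]) / 2.0
            gxx += gx * gx
            gyy += gy * gy
            gxy += gx * gy
    # angle of the dominant gradient direction; ridges run perpendicular to it
    return 0.5 * atan2(2.0 * gxy, gxx - gyy)

# synthetic vertical "ridges": intensity varies along x only
patch = [[sin(2 * pi * x / 8) for x in range(16)] for _ in range(16)]
theta = block_orientation(patch)
```

For this patch all vertical gradients vanish, so the estimate returns 0, i.e. the gradient points along x and the ridges are vertical, as constructed.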
In the last few years, the literature has shown great interest in studying the feasibility of using memristive devices for computing. Memristive devices matter for the structure, dynamics, and functionality of artificial neural networks (ANNs) because the switching characteristics of their resistance resemble biological learning in synapses and neurons. A memristive architecture consists of a number of metastable switches (MSSs). Although the literature covers a variety of memristive applications for general-purpose computation, the effect of the low or high conductance of each MSS has remained unclear. This paper focuses on finding a practical criterion to calculate the conductance of each MSS rather t
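In the metastable-switch picture, device conductance is simply the sum over MSSs, each sitting in a low- or high-conductance state and hopping between them stochastically. A minimal sketch, with illustrative values for the per-switch conductances and switching probabilities (not taken from the paper):

```python
import random

def memristor_conductance(n_on, n_total, g_on=1e-4, g_off=1e-6):
    """Total conductance of a device of n_total metastable switches,
    n_on of them in the high-conductance state (values illustrative)."""
    return n_on * g_on + (n_total - n_on) * g_off

def step(n_on, n_total, p_up, p_down, rng):
    """One stochastic update: each OFF switch turns on with probability
    p_up, each ON switch turns off with probability p_down."""
    turned_on = sum(rng.random() < p_up for _ in range(n_total - n_on))
    turned_off = sum(rng.random() < p_down for _ in range(n_on))
    return n_on + turned_on - turned_off

rng = random.Random(0)
n_on, N = 0, 1000
for _ in range(200):   # relaxes toward the p_up/(p_up+p_down) fraction ON
    n_on = step(n_on, N, p_up=0.05, p_down=0.05, rng=rng)
g = memristor_conductance(n_on, N)
```

With symmetric switching probabilities the device settles near half its switches ON, so the aggregate conductance sits midway between the all-OFF and all-ON extremes, which is the quantity whose low/high split the abstract is concerned with.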
The most significant task in oil exploration is determining the reservoir facies, which are based mostly on the primary features of the rocks. Porosity, water saturation, and shale volume, as well as the sonic and bulk density logs, are the input data used in Interactive Petrophysics software to compute rock facies. These data are used to create 15 clusters and four groups of rock facies. Furthermore, accurate matching between core and well-log data is established by a neural network technique. In the current study, to evaluate the applicability of the cluster analysis approach, the rock facies results derived from cluster analysis of 29 wells were utilized to redistribute the petrophysical properties for six units of the Mishri
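The clustering step described above, grouping log responses into facies clusters, can be sketched with a plain k-means on two of the named inputs; the synthetic porosity/shale-volume points, k = 2, and deterministic seeding are assumptions for illustration only.

```python
def kmeans(points, k, iters=20):
    """Plain Lloyd's k-means on 2-D feature vectors (e.g. porosity, Vshale).
    Centers are seeded with the first and last points for determinism."""
    centers = [points[0], points[-1]] if k == 2 else points[:k]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda c: (p[0] - centers[c][0]) ** 2
                                + (p[1] - centers[c][1]) ** 2)
            groups[j].append(p)
        centers = [(sum(p[0] for p in g) / len(g), sum(p[1] for p in g) / len(g))
                   if g else centers[i] for i, g in enumerate(groups)]
    return centers, groups

# two synthetic "facies": low-porosity shaly vs high-porosity clean points
pts = [(0.05 + 0.01 * i % 0.03, 0.60 + 0.01 * i % 0.05) for i in range(20)]
pts += [(0.25 + 0.01 * i % 0.03, 0.10 + 0.01 * i % 0.05) for i in range(20)]
centers, groups = kmeans(pts, 2)
```

In practice each cluster's mean log response is then interpreted as a facies signature and mapped back onto the wells, which is the "redistribution" the abstract refers to.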
Recommender systems are tools for making sense of the huge amount of data available in the internet world. Collaborative filtering (CF) is one of the knowledge discovery methods used most successfully in recommendation systems. Memory-based collaborative filtering emphasizes using facts about present users to predict new items for the target user. Similarity measures are the core operations in collaborative filtering, and prediction accuracy depends mostly on the similarity calculations. In this study, a combination of weighted parameters and traditional similarity measures is used to calculate the relationships among users over the MovieLens data set rating matrix. The advantages and disadvantages of each measure are identified. From the study, a n
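The two core operations the abstract names, similarity between users over co-rated items and a similarity-weighted prediction, can be sketched as follows; the tiny rating dictionary is a made-up stand-in for the MovieLens matrix, and cosine similarity is just one of the traditional measures studied.

```python
from math import sqrt

def cosine_sim(ra, rb):
    """Cosine similarity computed over the items both users rated."""
    common = set(ra) & set(rb)
    if not common:
        return 0.0
    num = sum(ra[i] * rb[i] for i in common)
    den = (sqrt(sum(ra[i] ** 2 for i in common))
           * sqrt(sum(rb[i] ** 2 for i in common)))
    return num / den

def predict(target, others, item):
    """Similarity-weighted average of the neighbours' ratings for `item`."""
    pairs = [(cosine_sim(target, r), r[item]) for r in others if item in r]
    wsum = sum(abs(s) for s, _ in pairs)
    return sum(s * v for s, v in pairs) / wsum if wsum else 0.0

ratings = {
    "u1": {"m1": 5, "m2": 3, "m3": 4},
    "u2": {"m1": 4, "m2": 3, "m3": 5},
    "u3": {"m1": 1, "m2": 5, "m3": 2},
}
p = predict(ratings["u1"], [ratings["u2"], ratings["u3"]], "m3")
```

Because u2's tastes align with u1's far more than u3's do, the prediction is pulled toward u2's rating, illustrating why the choice of similarity measure dominates prediction accuracy.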
Audio classification is the process of classifying different audio types according to their contents. It is applied in a large variety of real-world problems; all classification applications allow the target subjects to be viewed as a specific type of audio, so there is a variety of audio types, and every type has to be treated carefully according to its significant properties. Feature extraction is an important step in audio classification. This work introduces several sets of features chosen according to audio type; two types of audio (data sets) were studied. Two different feature sets are proposed: (i) a first-order gradient feature vector, and (ii) a local roughness feature vector. The experiments showed that the results are competitive to
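A first-order gradient feature vector of the kind named above can be sketched on a raw sample array; the particular summary statistics chosen here (mean absolute gradient, gradient energy, zero-crossing rate of the gradient) are illustrative assumptions, since the abstract does not list the components.

```python
from math import pi, sin

def gradient_features(samples):
    """Summary statistics of the first-order gradient (finite difference)."""
    grad = [b - a for a, b in zip(samples, samples[1:])]
    n = len(grad)
    mean_abs = sum(abs(g) for g in grad) / n
    energy = sum(g * g for g in grad) / n
    zero_cross = sum(1 for a, b in zip(grad, grad[1:]) if a * b < 0) / (n - 1)
    return [mean_abs, energy, zero_cross]

# a synthetic 100 Hz tone sampled at 8 kHz as a stand-in for an audio frame
frame = [sin(2 * pi * 100 * t / 8000) for t in range(800)]
feats = gradient_features(frame)
```

Frames with different spectral content yield different gradient statistics, which is what lets a classifier separate the audio types the abstract studies.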