Data hiding is the process of encoding extra information in an image by making small modifications to its pixels. To be practical, the hidden data must be perceptually invisible yet robust to common signal processing operations. This paper introduces a scheme for hiding a signature image that can be as large as 25% of the host image data and can therefore be used for digital watermarking as well as image/data hiding. The proposed algorithm applies the discrete slantlet transform, an orthogonal discrete wavelet transform with two zero moments and improved time localization, to both the host and the signature image. A scaling factor in the frequency domain controls the quality of the watermarked images. Experimental results of signature image recovery after applying JPEG coding to the watermarked image are included.
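As an illustration of this kind of transform-domain embedding, the minimal sketch below adds a scaled signature approximation into the host's low-frequency band. It is an assumption-laden example: PyWavelets' Haar DWT stands in for the slantlet transform, which is not available in common libraries, and the scaling factor alpha and the function name embed are illustrative rather than the paper's implementation.

import numpy as np
import pywt

def embed(host, signature, alpha=0.05):
    # Decompose host and signature; a standard DWT is used here only as a
    # stand-in for the slantlet transform described in the paper.
    cA, (cH, cV, cD) = pywt.dwt2(host.astype(float), "haar")
    sA, _ = pywt.dwt2(signature.astype(float), "haar")
    # Additively embed the scaled signature approximation into the host's
    # approximation band; alpha trades imperceptibility against robustness.
    cA_marked = cA + alpha * np.resize(sA, cA.shape)
    return pywt.idwt2((cA_marked, (cH, cV, cD)), "haar")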
This paper introduces the use of a neural network as a type of associative memory through the problem of mobile position estimation, in which a mobile station estimates its location from the signal strengths it receives from several surrounding base stations; the neural network can be implemented inside the mobile. The traditional time of arrival (TOA) and received signal strength (RSS) methods are used and compared with two analytical methods, an optimal positioning method and an average positioning method. The training data are ideal, since they can be obtained from the geometry of the CDMA cell topology. The TOA and RSS methods are tested over many cases along a nonlinear path that the MS can move through in that region. The result …
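The sketch below illustrates the general idea under stated assumptions: ideal training samples are generated from an assumed three-base-station geometry with a log-distance path-loss model, and a small multilayer perceptron learns the RSS-to-position mapping. The base-station coordinates, path-loss exponent, and network size are placeholders, not the paper's configuration.

import numpy as np
from sklearn.neural_network import MLPRegressor

bs = np.array([[0.0, 0.0], [1000.0, 0.0], [500.0, 866.0]])  # assumed base-station positions (m)

def rss(pos, tx_dbm=43.0, n=3.5):
    # Received signal strength from each base station (log-distance path loss).
    d = np.linalg.norm(bs - pos, axis=1)
    return tx_dbm - 10.0 * n * np.log10(d + 1.0)

# Ideal training data generated from the cell geometry, as the abstract describes.
rng = np.random.default_rng(0)
positions = rng.uniform([0.0, 0.0], [1000.0, 866.0], size=(5000, 2))
features = np.array([rss(p) for p in positions])

model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500, random_state=0)
model.fit(features, positions)
print(model.predict([rss(np.array([400.0, 300.0]))]))  # estimated (x, y)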
Regression analysis is a cornerstone of statistics and relies mostly on the ordinary least squares method. As is well known, this method requires several conditions in order to operate accurately; when they are not met, its results can be unreliable, and the absence of certain conditions can make it impossible to complete the analysis at all. Among those conditions is the absence of multicollinearity, and we detect that problem among the independent variables using the Farrar-Glauber test. In addition, the data are required to be linear, and since this last condition was not satisfied, recourse was made to the …
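For illustration, a minimal version of the Farrar-Glauber chi-square test for overall multicollinearity is sketched below. It uses the determinant of the regressors' correlation matrix; the function name and data layout are assumptions rather than the study's exact procedure.

import numpy as np
from scipy import stats

def farrar_glauber(X):
    # chi2 = -[n - 1 - (2k + 5)/6] * ln|R|, with k(k-1)/2 degrees of freedom,
    # where R is the correlation matrix of the k regressors and X has n rows.
    n, k = X.shape
    R = np.corrcoef(X, rowvar=False)
    chi2 = -(n - 1 - (2 * k + 5) / 6.0) * np.log(np.linalg.det(R))
    df = k * (k - 1) / 2
    return chi2, stats.chi2.sf(chi2, df)  # test statistic and p-value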
Content-based image retrieval (CBIR) is a technique used to retrieve images from an image database. However, the CBIR process suffers from low accuracy when retrieving images from an extensive image database and from insufficient image privacy. This paper addresses the accuracy issue using deep learning, namely a convolutional neural network (CNN), and provides the necessary privacy for images using the fully homomorphic encryption scheme of Cheon, Kim, Kim, and Song (CKKS). To achieve these aims, a system named RCNN_CKKS has been proposed, consisting of two parts. The first part (offline processing) extracts automated high-level features from a flattening layer of a CNN and then stores these features in a …
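A rough sketch of the two ingredients is shown below: a pretrained CNN with its classification head removed supplies the flattened feature vector, and the TenSEAL library encrypts it under CKKS. ResNet-18 and the encryption parameters are illustrative assumptions and do not reproduce the paper's RCNN_CKKS configuration.

import torch, torchvision
import tenseal as ts

cnn = torchvision.models.resnet18(weights="IMAGENET1K_V1")
cnn.fc = torch.nn.Identity()            # keep the flattened feature vector
cnn.eval()

img = torch.rand(1, 3, 224, 224)        # placeholder for a preprocessed image
with torch.no_grad():
    features = cnn(img).squeeze().tolist()   # 512-dimensional feature vector

ctx = ts.context(ts.SCHEME_TYPE.CKKS, poly_modulus_degree=8192,
                 coeff_mod_bit_sizes=[60, 40, 40, 60])
ctx.global_scale = 2 ** 40
ctx.generate_galois_keys()
enc_features = ts.ckks_vector(ctx, features)  # features stored/compared under encryption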
An oil spill is a leakage from pipelines, vessels, oil rigs, or tankers that releases petroleum products into the marine environment or onto land, whether occurring naturally or through human action, and it results in severe damage and financial loss. Satellite imagery is one of the most powerful tools currently used to capture vital information about the Earth's surface, but the complexity and sheer volume of the data make it challenging and time-consuming for humans to process. With the advancement of deep learning techniques, these processes can now be automated to extract vital information from real-time satellite images. This paper applies three deep-learning algorithms to satellite image classification …
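The sketch below sets up transfer learning for a binary spill/no-spill classifier on satellite patches; the MobileNetV2 backbone, hyperparameters, and random placeholder tensors are assumptions and do not reproduce the three algorithms compared in the paper.

import torch, torchvision

model = torchvision.models.mobilenet_v2(weights="IMAGENET1K_V1")
model.classifier[1] = torch.nn.Linear(model.last_channel, 2)  # spill / no spill

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = torch.nn.CrossEntropyLoss()

batch = torch.rand(8, 3, 224, 224)           # placeholder satellite patches
labels = torch.randint(0, 2, (8,))           # placeholder labels
optimizer.zero_grad()
loss = loss_fn(model(batch), labels)         # one fine-tuning step
loss.backward()
optimizer.step()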
The need for an efficient method to find the most appropriate document for a particular search query has become crucial due to the exponential growth in the number of papers readily available on the web. The vector space model (VSM), a well-established model in information retrieval, represents documents as vectors in space and assigns term weights via the popular term frequency-inverse document frequency (TF-IDF) scheme. In this research, a method is proposed to retrieve the most relevant documents by representing documents and queries as vectors of average term frequency-inverse sentence frequency (TF-ISF) weights instead of representing them as vectors of TF-IDF weights …
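A small sketch of such sentence-level weighting follows, under assumptions about tokenization, normalization, and averaging that the paper may handle differently.

import math
from collections import Counter

def tf_isf(document_sentences):
    # TF-ISF: term frequency times inverse sentence frequency,
    # isf(t) = log(S / s_t), where S is the number of sentences in the
    # document and s_t the number of sentences containing term t.
    terms = [w for s in document_sentences for w in s.lower().split()]
    tf = Counter(terms)
    S = len(document_sentences)
    weights = {}
    for t in tf:
        s_t = sum(1 for s in document_sentences if t in s.lower().split())
        weights[t] = (tf[t] / len(terms)) * math.log(S / s_t)
    return weights

print(tf_isf(["Information retrieval ranks documents.",
              "The vector space model weights terms."]))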
A new method based on the Touchard polynomials (TPs) was presented for the numerical solution of the linear Fredholm integro-differential equation (FIDE) of the first order and second kind subject to a given condition. The derivatives and integrals of the TPs were obtained in a simple form. A convergence analysis of the presented method was given, and its applicability was demonstrated by several numerical examples. The results obtained with this method are compared with other known results.
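The sketch below only builds the Touchard basis, T_n(x) = Σ_k S(n, k) x^k with S(n, k) the Stirling numbers of the second kind, and shows that its derivatives and integrals are obtained symbolically; it is not the paper's full FIDE solver.

import sympy as sp
from sympy.functions.combinatorial.numbers import stirling

x = sp.Symbol("x")

def touchard(n):
    # Touchard polynomial of degree n as a sum over Stirling numbers of the second kind.
    return sum(stirling(n, k, kind=2) * x**k for k in range(n + 1))

T3 = touchard(3)                                   # x**3 + 3*x**2 + x
print(sp.expand(T3), sp.diff(T3, x), sp.integrate(T3, x))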
In this paper, we describe the cases of marriage and divorce in the city of Baghdad on both the Rusafa and Karkh sides. The data in this research were collected from the Supreme Judicial Council. The cubic spline interpolation method was used to estimate the function that passes through the given points, and the extrapolation method was applied to estimate the cases of marriage and divorce for the next year and to compare Rusafa and Karkh, using the MATLAB program.
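A minimal sketch of the same interpolation-then-extrapolation step follows, using SciPy rather than MATLAB and with placeholder yearly counts instead of the Supreme Judicial Council data.

import numpy as np
from scipy.interpolate import CubicSpline

years = np.array([2016, 2017, 2018, 2019, 2020])
cases = np.array([5100, 5320, 5450, 5600, 5750])   # hypothetical counts

spline = CubicSpline(years, cases)     # passes exactly through the given points
print(spline(2018.5))                  # interpolated value within the data range
print(spline(2021))                    # extrapolated estimate for the next year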
Oxidation of sulfur compounds in fuel followed by an adsorption process was studied using two modes of operation, batch mode and continuous mode (fixed bed). In the batch experiments, oxidation of kerosene with a sulfur content of 2360 ppm was carried out to study the effect of the amount of hydrogen peroxide (2.5, 4, 6, and 10 ml) at different temperatures (40, 60, and 70 °C). The effect of the amount of acetic acid was also studied at the optimal conditions of the oxidation step (4 ml H2O2 and 60 °C). In addition, the role of acetic acid at different temperatures (40, 60, and 70 °C) with 4 ml H2O2, and the effect of reaction time (5, 30, 60, 120, and 300 minutes) at 40 and 60 °C with 4 ml H2O2 and 1 ml HAc, were investigated …
In this study, plain concrete simply supported beams subjected to two-point loading were analyzed in flexure. The numerical model of the beam was constructed as a meso-scale representation of concrete as a two-phase material (aggregate and mortar). The fracture process of the concrete beams under loading was investigated in the laboratory as well as with the numerical models. The Extended Finite Element Method (XFEM) was employed to treat the discontinuities that appear during the fracture process in concrete. The finite element method with the standard/explicit feature was utilized for the numerical analysis. Aggregate particles were assumed to be of elliptic shape. Other properties such as grading and sizes of the aggregate …
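A toy sketch of one ingredient of such a meso-scale model, placing randomly sized and oriented elliptical aggregate particles inside a beam section, is given below; the dimensions and grading limits are placeholders, and no overlap checking is performed, unlike in a real model.

import numpy as np

rng = np.random.default_rng(1)
beam_length, beam_depth = 500.0, 100.0          # mm, hypothetical section
n_particles = 40

particles = []
for _ in range(n_particles):
    a = rng.uniform(4.0, 10.0)                  # semi-major axis from an assumed grading
    b = a * rng.uniform(0.5, 0.9)               # semi-minor axis (aspect ratio)
    x = rng.uniform(a, beam_length - a)         # centre kept inside the section
    y = rng.uniform(a, beam_depth - a)
    theta = rng.uniform(0.0, np.pi)             # random orientation
    particles.append((x, y, a, b, theta))

print(particles[0])  # (centre x, centre y, a, b, orientation)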