Chromium oxide nanostructures (Cr2O3 NPs) with a rhombohedral structure were successfully prepared by the spray pyrolysis technique using an aqueous solution of chromium(III) chloride (CrCl3) as the precursor. The films were deposited on glass substrates heated to 450 °C. X-ray diffraction (XRD) shows the polycrystalline nature of the samples. The calculated lattice constants of the grown Cr2O3 nanostructures are a = b = 4.959 Å and c = 13.594 Å, and an average crystallite size of 46.3-55.6 nm was calculated from the diffraction peaks. FTIR spectral analysis revealed the characteristic vibrations of Cr-O: two sharp peaks at 630 and 578 cm-1, attributed to Cr-O stretching modes, are clear evidence of the presence of crystalline Cr2O3. The energy band gap of the chromium oxide nanostructures, measured with a UV-VIS-NIR optical spectrophotometer, is 3.4 eV. Scanning electron microscopy (SEM) images show a large amount of nanostructured material with an average crystallite size of 46.3-55.6 nm, which indicates that our synthesis process is a successful method for preparing Cr2O3 nanoparticles.
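As a minimal illustration of how a crystallite size in this range is obtained from an XRD peak, the sketch below applies the Scherrer equation; the Cu Kα wavelength, shape factor K = 0.9, and the peak position/FWHM values are illustrative assumptions, not data reported in the paper.

```python
import numpy as np

def scherrer_size(two_theta_deg, fwhm_deg, wavelength_nm=0.15406, K=0.9):
    """Estimate crystallite size (nm) from an XRD peak via the Scherrer equation:
    D = K * lambda / (beta * cos(theta)), with the FWHM beta in radians.
    Cu K-alpha wavelength and K = 0.9 are common assumptions."""
    theta = np.radians(two_theta_deg / 2.0)
    beta = np.radians(fwhm_deg)
    return K * wavelength_nm / (beta * np.cos(theta))

# Illustrative peak values only (not measured data from this work):
print(scherrer_size(two_theta_deg=33.6, fwhm_deg=0.17))  # roughly tens of nm
```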
Many designs have been suggested for unipolar magnetic lenses based on changing the width of the inner bore while fixing the other geometrical parameters of the lens, in order to improve its performance. The study of each design included the calculation of its axial magnetic field, the magnetization of the lens, and the magnetic flux density using the Finite Element Method (FEM) as implemented in the Magnetic Electron Lenses Operation (MELOP) program, version 1, at three different values of current density (6, 4, and 2 A/mm2). As a result, the clearest values and behaviors were obtained at a current density of 2 A/mm2; it was found that the best magnetizing properties, the high
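Purely to illustrate what an axial flux-density profile of a magnetic lens looks like, the following sketch evaluates Glaser's analytical bell-shaped field model; this is a simple stand-in, not the FEM/MELOP computation used in the study, and the peak field and half-width values are assumed.

```python
import numpy as np

def glaser_axial_field(z_mm, B0_T=0.5, a_mm=5.0):
    """Glaser bell-shaped model of the axial flux density of a magnetic lens:
    B(z) = B0 / (1 + (z/a)^2), with peak field B0 and half-width a.
    B0 and a are illustrative assumptions, not values from the paper."""
    return B0_T / (1.0 + (z_mm / a_mm) ** 2)

z = np.linspace(-20.0, 20.0, 401)   # axial positions in mm
Bz = glaser_axial_field(z)
print(f"peak B = {Bz.max():.3f} T at z = {z[np.argmax(Bz)]:.1f} mm")
```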
The Adaptive Optics (AO) technique has been developed to correct for atmospheric seeing. The purpose of this study is to use MATLAB to investigate the performance of an AO system with one of the most recent AO simulation tools, Object-Oriented MATLAB Adaptive Optics (OOMAO). This was achieved by studying the variables that affect the image-quality correction, such as the observation wavelength bands, atmospheric parameters, telescope parameters, deformable mirror parameters, wavefront sensor parameters, and noise parameters. The results present a detailed analysis of the factors that influence the image correction process, as well as the impact of the AO components on that process.
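The simulations in the study are written in MATLAB with OOMAO; purely to illustrate why the observation wavelength and atmospheric parameters matter, the sketch below shows the standard scaling of the Fried parameter with wavelength and the resulting seeing-limited versus diffraction-limited resolution. The r0 value at 500 nm and the telescope diameter are assumed, and this is not the OOMAO pipeline itself.

```python
import numpy as np

def fried_parameter(wavelength_m, r0_500nm_m=0.10):
    """Fried parameter scales as r0 ~ lambda^(6/5); r0 at 500 nm is an assumed seeing value."""
    return r0_500nm_m * (wavelength_m / 500e-9) ** (6.0 / 5.0)

def fwhm_rad(wavelength_m, aperture_m):
    """Approximate angular FWHM ~ lambda / aperture (seeing- or diffraction-limited)."""
    return wavelength_m / aperture_m

D = 1.0  # assumed telescope diameter in metres
for band, lam in [("V", 550e-9), ("J", 1.22e-6), ("K", 2.2e-6)]:
    r0 = fried_parameter(lam)
    seeing = np.degrees(fwhm_rad(lam, r0)) * 3600       # arcsec, seeing-limited
    diffraction = np.degrees(fwhm_rad(lam, D)) * 3600   # arcsec, diffraction-limited
    print(f"{band}: r0 = {r0*100:.1f} cm, seeing ~ {seeing:.2f} arcsec, "
          f"diffraction ~ {diffraction:.3f} arcsec")
```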
Image segmentation can be defined as partitioning a digital image into meaningful regions, called segments, whose pixels share certain attributes that differ from the pixels constituting the other parts of the image. In this paper, the researcher followed two phases of image processing. In the first phase, the images were pre-processed before segmentation using the statistical confidence intervals for estimating unknown observations proposed by Acho & Buenestado in 2018. In the second phase, the images were segmented using Bernsen's thresholding technique. The researcher drew the conclusion that in case of utilizing
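A minimal sketch of Bernsen's local thresholding rule is given below (threshold = mid-point of the local minimum and maximum within a window, with a contrast check); the window size and contrast limit are assumed values, and the confidence-interval pre-processing step of Acho & Buenestado is not reproduced here.

```python
import numpy as np
from scipy.ndimage import maximum_filter, minimum_filter

def bernsen_threshold(gray, window=15, contrast_limit=15, background=255):
    """Bernsen's local thresholding.

    For each pixel, T = (local_max + local_min) / 2 over a window x window
    neighbourhood. Low-contrast neighbourhoods (max - min < contrast_limit)
    are treated as uniform background. Window and contrast_limit are assumptions."""
    gray = gray.astype(np.float32)
    local_max = maximum_filter(gray, size=window)
    local_min = minimum_filter(gray, size=window)
    threshold = (local_max + local_min) / 2.0
    contrast = local_max - local_min

    binary = np.where(gray >= threshold, 255, 0).astype(np.uint8)
    binary[contrast < contrast_limit] = background  # low contrast: assume background
    return binary

# Usage with an illustrative random image:
img = (np.random.rand(64, 64) * 255).astype(np.uint8)
print(bernsen_threshold(img).shape)
```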
Fractal image compression offers some desirable properties, such as fast decoding and very good rate-distortion curves, but it suffers from a long encoding time. Fractal image compression requires partitioning the image into range blocks. In this work, we introduce an improved partitioning process based on a merge approach, since some ranges are connected to others. This paper presents a method to reduce the encoding time of the technique by reducing the number of range blocks based on statistical measures computed between them. Experimental results on standard images show that the proposed method decreases the encoding time while keeping the visual quality acceptable.
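As a sketch of the kind of statistical screening described above, the code below groups range blocks whose mean and variance are close, so that only one representative per group needs a full domain search; the block size and tolerances are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

def block_stats(image, block=8):
    """Split an image into non-overlapping block x block ranges and
    return their (mean, variance) statistics."""
    h, w = image.shape
    stats = []
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            r = image[y:y + block, x:x + block].astype(np.float64)
            stats.append((r.mean(), r.var()))
    return np.array(stats)

def group_similar_ranges(stats, mean_tol=2.0, var_tol=4.0):
    """Greedily group range blocks with similar mean/variance so that only one
    representative per group is fully encoded (tolerances are assumptions)."""
    groups, assigned = [], np.full(len(stats), -1)
    for i, (m, v) in enumerate(stats):
        if assigned[i] >= 0:
            continue
        members = np.where(
            (assigned < 0)
            & (np.abs(stats[:, 0] - m) < mean_tol)
            & (np.abs(stats[:, 1] - v) < var_tol)
        )[0]
        assigned[members] = len(groups)
        groups.append(members.tolist())
    return groups

img = (np.random.rand(64, 64) * 255).astype(np.uint8)
stats = block_stats(img)
print(len(group_similar_ranges(stats)), "groups from", len(stats), "range blocks")
```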
Fractal image compression depends on representing an image using affine transformations. The main concern for researchers in the field of fractal image compression (FIC) is to decrease the encoding time needed to compress image data. The basic idea is that each portion of the image is similar to other portions of the same image. Many models have been developed for this process. Fractals were initially observed and handled using the Iterated Function System (IFS), which is used for encoding images. In this paper, fractal image compression and its variants are reviewed along with other techniques. A summarized review of contributions is presented to determine the fulfillment of fractal ima
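A minimal sketch of the core affine mapping in IFS-based encoding follows: a range block R is approximated by s·D + o, where D is a down-sampled domain block and the contrast s and brightness o are fitted by least squares. The block sizes and the single pairing shown are simplified assumptions, not a full encoder.

```python
import numpy as np

def downsample2(block):
    """Average 2x2 pixels so a 2Nx2N domain block matches an NxN range block."""
    return block.reshape(block.shape[0] // 2, 2, block.shape[1] // 2, 2).mean(axis=(1, 3))

def fit_affine(domain, rng):
    """Least-squares fit of rng ~ s * domain + o (contrast s, brightness o)."""
    d, r = domain.ravel(), rng.ravel()
    n = d.size
    denom = n * (d @ d) - d.sum() ** 2
    s = 0.0 if denom == 0 else (n * (d @ r) - d.sum() * r.sum()) / denom
    o = (r.sum() - s * d.sum()) / n
    err = float(((s * d + o - r) ** 2).mean())
    return s, o, err

# Illustrative 8x8 range block and 16x16 domain block:
rng_block = np.random.rand(8, 8)
dom_block = np.random.rand(16, 16)
s, o, err = fit_affine(downsample2(dom_block), rng_block)
print(f"s = {s:.3f}, o = {o:.3f}, mse = {err:.4f}")
```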
Information security is a crucial factor when communicating sensitive information between two parties. Steganography is one of the techniques most commonly used for this purpose. This paper aims to enhance the capacity and robustness of information hiding by compressing image data to a small size while maintaining high quality, so that the secret information remains invisible and only the sender and recipient can recognize the transmission. Three techniques are employed to conceal color and gray images: the Wavelet Color Process Technique (WCPT), the Wavelet Gray Process Technique (WGPT), and the Hybrid Gray Process Technique (HGPT). A comparison between the first and second techniques according to quality metrics, Root-Mean-Square Error (RMSE), Compression-
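Below is a minimal sketch of two ingredients named above: a wavelet-based compression step (decompose, keep only the largest coefficients, reconstruct) and the RMSE quality metric. It uses PyWavelets as a stand-in, the wavelet, level, and keep-ratio are assumed, and it is not the WCPT/WGPT/HGPT pipeline itself.

```python
import numpy as np
import pywt

def wavelet_compress(gray, wavelet="haar", level=2, keep=0.05):
    """Compress by zeroing all but the largest `keep` fraction of wavelet
    coefficients, then reconstruct. Wavelet, level, and keep are assumptions."""
    coeffs = pywt.wavedec2(gray.astype(np.float64), wavelet, level=level)
    arr, slices = pywt.coeffs_to_array(coeffs)
    thresh = np.quantile(np.abs(arr), 1.0 - keep)
    arr[np.abs(arr) < thresh] = 0.0
    return pywt.waverec2(pywt.array_to_coeffs(arr, slices, output_format="wavedec2"),
                         wavelet)

def rmse(original, reconstructed):
    """Root-Mean-Square Error between the original and reconstructed images."""
    diff = original.astype(np.float64) - reconstructed[:original.shape[0], :original.shape[1]]
    return float(np.sqrt((diff ** 2).mean()))

img = (np.random.rand(128, 128) * 255).astype(np.uint8)
print("RMSE:", rmse(img, wavelet_compress(img)))
```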
Three-dimensional (3D) image and medical image processing, which are considered forms of big-data analysis, have attracted significant attention during the last few years. To this end, efficient 3D object recognition techniques could be beneficial to such image and medical image processing. However, to date, most of the proposed methods for 3D object recognition face major challenges in terms of high computational complexity. This is because the computational complexity and execution time increase as the dimensions of the object increase, which is the case in 3D object recognition. Therefore, finding an efficient method for obtaining high recognition accuracy with low computational complexity is essentia