Polyaniline nanofibers (PAni-NFs) were synthesized at various aniline concentrations (0.12, 0.16, and 0.2 g/l) and reaction times (2 h and 3 h) by a hydrothermal method at 90°C. Characterization was carried out using X-ray diffraction (XRD), Fourier-transform infrared (FTIR) spectroscopy, ultraviolet-visible (UV-Vis) absorption spectroscopy, thermogravimetric analysis (TGA), and field-emission scanning electron microscopy (FE-SEM). The XRD patterns revealed the amorphous nature of all the produced samples, and FE-SEM showed that the polyaniline has a nanofiber-like structure. FTIR spectroscopy confirmed the chemical bonding of the formed PAni, whose typical peaks were observed at 1580, 1300-1240, and 821 cm⁻¹. The TGA results indicated good thermal stability of the polyaniline at temperatures approaching 600°C. The PAni-0.12 g/l sample performed better than the other samples, and its optical parameters showed a decrease in the band gap (Eg).
When developing the general budget strategy, the state's senior management should require clarity in the reasons upon which the ministries rely when preparing expenditure estimates for the coming year, so that the spending is justified. In the absence of targets, a ministry or government unit has no benchmark and is instead guided by the size of the previous year's expenditure plus a percentage increase in the appropriations requested for the coming year, in light of the state's fiscal policy; the requested increase in appropriations is meant to cover the desired expansion of some of its activities and to meet increases in salaries and prices. The ministry must therefore have available criteria that coul
A novel median filter based on the crow optimization algorithm (OMF) is suggested to reduce random salt-and-pepper noise and improve the quality of RGB-colored and gray images. The fundamental idea of the approach is that the crow optimization algorithm first detects noise pixels and then replaces them with an optimum median value according to a criterion of maximizing a fitness function. Finally, the standard measures peak signal-to-noise ratio (PSNR), structural similarity, absolute square error, and mean square error are used to test the performance of the suggested filters (the original and the improved median filter) in removing noise from images. The simulation is carried out in MATLAB R2019b, and the resul
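The detect-then-replace idea behind the filter can be illustrated with a minimal sketch. The crow-search optimization step of the paper is not reproduced here; as a simplifying assumption, noise pixels are detected as extreme values (0 or 255, the signature of salt-and-pepper noise), and each detected pixel is replaced with the median of its local window. The PSNR quality measure named in the abstract is also sketched.

```python
import numpy as np

def remove_salt_pepper(img, window=3):
    """Replace likely salt-and-pepper pixels with the local median.

    Simplified stand-in for the paper's method: detection uses a plain
    extreme-value test (0 or 255) instead of the crow optimization step.
    """
    pad = window // 2
    padded = np.pad(img, pad, mode="edge")
    out = img.copy()
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            if img[i, j] in (0, 255):  # candidate noise pixel
                patch = padded[i:i + window, j:j + window]
                out[i, j] = np.median(patch)
    return out

def psnr(clean, test):
    """Peak signal-to-noise ratio (dB) for 8-bit images."""
    mse = np.mean((clean.astype(float) - test.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)
```

A denoised image should score a higher PSNR against the clean reference than the noisy input does.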
Authentication is the process of determining whether someone or something is, in fact, who or what it is declared to be. As dependence upon computers and computer networks grows, the need for user authentication has increased. A user's claimed identity can be verified by one of several methods; one of the most popular is based on something the user knows, such as a password or Personal Identification Number (PIN). Biometrics is the science and technology of authentication by identifying a living individual's physiological or behavioral attributes. Keystroke authentication is a behavioral access-control method that identifies legitimate users by their typing behavior. The objective of this paper is to provide user
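Typing behavior is commonly summarized by timing features, and a minimal sketch of that extraction is shown below. The event format (key, press time, release time, in milliseconds) is a hypothetical log layout assumed for illustration; the abstract does not specify the paper's actual features.

```python
def keystroke_features(events):
    """Compute the two basic keystroke-dynamics timing features:
    dwell time  -- how long each key is held down, and
    flight time -- the gap between releasing one key and pressing the next.

    `events` is a hypothetical log of (key, press_ms, release_ms) tuples.
    """
    dwell = [release - press for _key, press, release in events]
    flight = [events[i + 1][1] - events[i][2] for i in range(len(events) - 1)]
    return dwell, flight
```

A verification system would compare such feature vectors against a template recorded at enrollment, accepting the user when the deviation stays under a threshold.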
Iris research is focused on developing techniques for identifying and locating relevant biometric features, with accurate segmentation and efficient computation, while lending themselves to compression methods. Most iris segmentation methods are based on complex modelling of traits and characteristics, which in turn reduces the effectiveness of the system when used as a real-time system. This paper introduces a novel parameterized technique for iris segmentation. The method consists of a number of steps, starting with converting the grayscale eye image to a bit-plane representation, selecting the most significant bit planes, and then parameterizing the iris location, resulting in an accurate segmentation of the iris from the origin
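The first step of the pipeline, bit-plane decomposition, can be sketched as follows: an 8-bit grayscale image is split into eight binary planes, where the most significant planes (assumed here to be the ones the method selects) carry the coarse structure of the eye.

```python
import numpy as np

def bit_planes(gray):
    """Decompose an 8-bit grayscale image into its 8 binary bit planes.

    planes[0] is the least significant bit, planes[7] the most significant;
    the high planes retain the coarse intensity structure of the image.
    """
    return [((gray >> b) & 1).astype(np.uint8) for b in range(8)]
```

Selecting, say, planes 6 and 7 then yields a rough binary map of dark regions (pupil/iris) from which the iris location can be parameterized.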
With the increasing use of social media today, many researchers have become interested in topic extraction from Twitter. Tweets are short, unstructured, and messy texts, which makes finding topics in them difficult. Topic modeling algorithms such as Latent Semantic Analysis (LSA) and Latent Dirichlet Allocation (LDA) were originally designed to derive topics from large documents such as articles and books, and they are often less effective when applied to short-text content like Twitter. Fortunately, Twitter has many features that represent the interaction between users, and tweets carry rich user-generated hashtags as keywords. In this paper, we exploit the hashtag feature to improve the topics learned
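One common way to exploit hashtags for topic modeling is hashtag pooling: tweets sharing a hashtag are concatenated into one longer pseudo-document so that LDA sees documents with enough word co-occurrence to estimate topics. Whether this is the exact scheme used in the paper is not stated in the abstract; the sketch below illustrates the pooling idea only.

```python
import re
from collections import defaultdict

def pool_by_hashtag(tweets):
    """Hashtag pooling: merge tweets that share a hashtag into one
    pseudo-document, producing longer inputs for a topic model such as LDA.
    Tweets with no hashtag fall into a "_untagged" pool."""
    pools = defaultdict(list)
    for tweet in tweets:
        tags = re.findall(r"#(\w+)", tweet.lower())
        text = re.sub(r"#\w+", "", tweet).strip()  # body without hashtags
        for tag in tags or ["_untagged"]:
            pools[tag].append(text)
    return {tag: " ".join(texts) for tag, texts in pools.items()}
```

The resulting pseudo-documents can then be fed to any standard LDA implementation in place of the raw tweets.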