Polyaniline/multi-walled carbon nanotube (PANI/MWCNT) nanocomposite thin films were prepared by low-frequency plasma-jet polymerization on glass substrates with pre-deposited aluminum electrodes, forming Al/PANI-MWCNT/Al surface-type capacitive humidity sensors. The gap between the electrodes was about 50 μm, and the MWCNT weight concentration was varied over 0, 1, 2, 3, and 4%. The MWCNTs had diameters in the range of 8-15 nm and lengths of 10-55 μm. The capacitance-humidity relationships of the sensors were investigated at humidity levels from 35 to 90% RH. The electrical measurements showed that the capacitance increased with increasing relative humidity and that the sensitivity of the sensor increased with the MWCNT loading, while both the response time and the recovery time increased with concentration. Changing the MWCNT concentration also changed the energy gap as well as the initial capacitance. The capacitance increases linearly with relative humidity at a MWCNT concentration of 3%, suggesting the possibility of manufacturing a humidity sensor with good specifications at this concentration.
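For orientation, capacitive humidity sensitivity is commonly reported as the slope of the capacitance-humidity curve over the tested range; the abstract does not state its exact definition, so the following standard form is an assumption:

\[
S = \frac{\Delta C}{\Delta RH} = \frac{C_{90\,\%\,\mathrm{RH}} - C_{35\,\%\,\mathrm{RH}}}{90\,\% - 35\,\%}
\]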
Image pattern classification is considered a significant step in image and video processing. Although various image pattern classification algorithms have been proposed that achieve adequate classification, attaining higher accuracy while reducing the computation time remains challenging to date. A robust image pattern classification method is essential to obtain the desired accuracy; such a method should accurately classify image blocks into plain, edge, and texture (PET) classes using an efficient feature extraction mechanism. Moreover, most existing studies evaluate their methods on specific orthogonal moments, which limits the understanding of their potential application to various Discrete Orthogonal Moments (DOMs) …
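The abstract is cut off before defining the moments it uses, but for orientation, a 2-D discrete orthogonal moment of order (p, q) of an N×N image block f(x, y) generally takes the separable form below, where φ_p denotes the chosen discrete orthogonal polynomial basis (e.g., Tchebichef or Krawtchouk):

\[
M_{pq} = \sum_{x=0}^{N-1} \sum_{y=0}^{N-1} \varphi_p(x)\,\varphi_q(y)\, f(x, y)
\]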
A Multiple System Biometric System Based on ECG Data
Abstract: In this paper, a U-shaped probe with a curvature diameter of half a centimeter was implemented using plastic optical fibers. A layer of the outer cladding of the fibers was removed by polishing to a D-shaped cross-section. The sensor was tested by immersing it in sodium chloride solutions whose refractive index varied with concentration, ranging from 1.333 to 1.363. In this design, the transmitted intensity of the sensor decreased as the concentration of the solution increased. Next, the sensor was coated with a thin layer of gold 20 nm thick and tested with the same solutions, which resulted in a shift in wavelength of 5.37 nm and a sensitivity …
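The sensitivity figure is cut off in the abstract, but if the reported 5.37 nm shift corresponds to the full refractive-index span of 1.333 to 1.363 (an assumption, since the pairing is not stated), the implied bulk sensitivity would be:

\[
S = \frac{\Delta\lambda}{\Delta n} \approx \frac{5.37\ \text{nm}}{1.363 - 1.333} = 179\ \text{nm/RIU}
\]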
This paper addresses estimation of the unknown parameters of the generalized Rayleigh distribution model based on singly type-I censored samples. The probability density function of the generalized Rayleigh distribution is defined along with its properties. The maximum likelihood method is used to derive point estimates for all unknown parameters via an iterative procedure, the Newton-Raphson method; confidence interval estimates are then derived based on the Fisher information matrix. Finally, whether the current model (GRD) fits a set of real data is tested, and the survival function and hazard function are computed for these real data.
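A minimal sketch of the estimation pipeline the abstract describes, under assumed notation: generalized Rayleigh CDF F(x) = (1 − e^{−(λx)²})^α with shape α and scale λ, and singly type-I censoring at a fixed time T. The sketch substitutes SciPy's general-purpose minimizer for the paper's Newton-Raphson iteration; all names are illustrative, not the authors'.

```python
# Type-I censored MLE for the generalized Rayleigh distribution (illustrative).
import numpy as np
from scipy.optimize import minimize

def neg_log_lik(params, x_obs, T, n):
    """Negative log-likelihood: d observed failures x_obs <= T,
    n - d units censored at time T."""
    alpha, lam = params
    if alpha <= 0 or lam <= 0:
        return np.inf
    z = 1.0 - np.exp(-(lam * x_obs) ** 2)            # CDF kernel at failures
    log_f = (np.log(2 * alpha) + 2 * np.log(lam) + np.log(x_obs)
             - (lam * x_obs) ** 2 + (alpha - 1) * np.log(z))
    surv_T = 1.0 - (1.0 - np.exp(-(lam * T) ** 2)) ** alpha   # S(T)
    return -(log_f.sum() + (n - len(x_obs)) * np.log(surv_T))

def fit_grd(x_obs, T, n, start=(1.0, 1.0)):
    """Stand-in for the Newton-Raphson iteration used in the paper."""
    res = minimize(neg_log_lik, start, args=(np.asarray(x_obs), T, n),
                   method="Nelder-Mead")
    return res.x                                     # (alpha_hat, lam_hat)

def survival(x, alpha, lam):
    return 1.0 - (1.0 - np.exp(-(lam * x) ** 2)) ** alpha

def hazard(x, alpha, lam):
    z = 1.0 - np.exp(-(lam * x) ** 2)
    pdf = 2 * alpha * lam**2 * x * np.exp(-(lam * x) ** 2) * z ** (alpha - 1)
    return pdf / survival(x, alpha, lam)
```

With the fitted parameters, the survival and hazard functions can then be evaluated on the real data, as the abstract describes.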
In recent years, the performance of Spatial Data Infrastructures for governments and companies has gained ample attention. Different categories of geospatial data, such as digital maps, coordinates, web maps, and aerial and satellite images, are required to realize the geospatial data components of Spatial Data Infrastructures. In general, two distinct types of geospatial data sources exist over the Internet: formal and informal data sources. Despite the growth of informal geospatial data sources, integration between the different free sources has not been achieved effectively; addressing this task is the main contribution of this research. This article addresses the research question of how …
Steganography is the art of hiding the very presence of communication by embedding a secret message into an innocuous-looking cover document, such as a digital image, video, sound file, or other computer file that contains perceptually irrelevant or redundant information, used as a cover or carrier to hide secret messages.
In this paper, a new Least Significant Bit (LSB) nonsequential embedding technique for wave audio files is introduced. To strengthen the immunity of the proposed hiding system, and to remedy some weaknesses inherent in pure implementations of stego-systems, auxiliary processes were suggested and investigated, including a hidden-text jumping process and a stream-ciphering algorithm. Besides, the suggested …
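A minimal sketch of what keyed nonsequential LSB embedding in a WAV file can look like, assuming 16-bit PCM audio: the keyed PRNG that scatters the bit positions plays the role of the jumping process, and XOR-ing the message with a keystream stands in for the stream cipher. This is illustrative, not the authors' exact scheme.

```python
# Illustrative keyed nonsequential LSB embedding in 16-bit PCM WAV audio.
import wave
import random
from array import array

def embed(cover_path: str, stego_path: str, message: bytes, key: int) -> None:
    with wave.open(cover_path, "rb") as wav:
        params = wav.getparams()
        samples = array("h", wav.readframes(params.nframes))  # 16-bit PCM assumed

    rng = random.Random(key)
    ciphered = bytes(b ^ rng.randrange(256) for b in message)  # keystream XOR
    bits = [(byte >> i) & 1 for byte in ciphered for i in range(8)]

    positions = rng.sample(range(len(samples)), len(bits))     # nonsequential jumps
    for pos, bit in zip(positions, bits):
        samples[pos] = (samples[pos] & ~1) | bit               # overwrite sample LSB

    with wave.open(stego_path, "wb") as out:
        out.setparams(params)
        out.writeframes(samples.tobytes())
```

Extraction mirrors this: rerun the same PRNG with the shared key to regenerate the keystream and the jump positions, read the LSBs back in that order, and XOR with the keystream to recover the message.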
Biometrics is widely used in security systems nowadays; each biometric modality can be useful and has distinctive properties that provide uniqueness and ambiguity for security systems, especially in communication and network technologies. This paper is about using biometric features of the fingerprint, called minutiae, to encipher a text message and ensure safe arrival of the data at the receiver end. The classical cryptosystems (Caesar, Vigenère, etc.) have become obsolete methods of encryption because of high-performance machines that exploit repetition of the key in their attacks to break the cipher. Several researchers in cryptography have made efforts to modify and develop the Vigenère cipher by addressing its weaknesses.
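A minimal sketch of the idea, assuming each minutia is reported as an (x, y, angle) triple; the minutia-to-letter mapping below is a hypothetical choice, since the abstract does not specify the paper's actual key-derivation scheme.

```python
# Illustrative: derive a Vigenère key from fingerprint minutiae, then encrypt.
ALPHA = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"

def minutiae_key(minutiae):
    """Fold each (x, y, angle) minutia into one key letter (assumed mapping)."""
    return "".join(ALPHA[(x + y + angle) % 26] for x, y, angle in minutiae)

def vigenere(text, key, decrypt=False):
    sign = -1 if decrypt else 1
    out = []
    for i, ch in enumerate(text.upper()):
        if ch in ALPHA:
            shift = sign * ALPHA.index(key[i % len(key)])
            out.append(ALPHA[(ALPHA.index(ch) + shift) % 26])
        else:
            out.append(ch)                  # pass non-letters through unchanged
    return "".join(out)

key = minutiae_key([(103, 27, 45), (88, 210, 90), (17, 64, 135)])
cipher = vigenere("MEET AT NOON", key)
assert vigenere(cipher, key, decrypt=True) == "MEET AT NOON"
```

Because the key comes from the sender's fingerprint rather than a short memorized word, the key-repetition pattern that classical attacks rely on becomes tied to the biometric template.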
To achieve secure transfer of data from sender to receiver, cryptography is one of the means used for such purposes. However, to raise the level of data security further, DNA was introduced to cryptography as a new term. DNA can easily be used to store and transfer data; it has become an effective vehicle for such aims and is used to implement the computation. A new cryptography system is proposed, consisting of two phases: an encryption phase and a decryption phase. The encryption phase includes six steps, starting by converting the plaintext to its equivalent ASCII values and converting those to binary values. After that, the binary values are converted to DNA characters and then converted to their equivalent complementary DNA …
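The abstract is cut off before fixing the exact encoding, so the sketch below assumes the common two-bit mapping 00→A, 01→C, 10→G, 11→T and Watson-Crick complements (A↔T, C↔G) for the first steps it describes; the paper's actual tables may differ.

```python
# Illustrative first steps of the described pipeline:
# text -> ASCII -> binary -> DNA bases -> complementary strand.
BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}   # assumed 2-bit convention
COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

def text_to_dna(plaintext: str) -> str:
    bits = "".join(f"{ord(ch):08b}" for ch in plaintext)      # ASCII -> binary
    return "".join(BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def complement(strand: str) -> str:
    return "".join(COMPLEMENT[b] for b in strand)

dna = text_to_dna("Hi")        # 'H' = 01001000, 'i' = 01101001
print(dna)                     # "CAGACGGC"
print(complement(dna))         # "GTCTGCCG"
```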