As computer systems and networks are used in almost every aspect of daily life, the security threats to them have increased significantly. Password-based authentication is usually used to verify the legitimate user. However, this method has many weaknesses, such as password sharing, brute-force attacks, dictionary attacks, and guessing. Keystroke dynamics is a well-known and inexpensive behavioral biometric technology that authenticates a user based on an analysis of his or her typing rhythm. Intrusion thus becomes more difficult, because both the password and the typing rhythm must match the stored keystroke pattern. This thesis considers static keystroke dynamics as a transparent layer for user authentication. A Back-Propagation Neural Network (BPNN) and a Probabilistic Neural Network (PNN) are used as classifiers to discriminate between authentic users and impostors. Four keystroke-dynamics features, namely Dwell Time (DT), Flight Time (FT), Up-Up Time (UUT), and a combination of DT and FT, are extracted to verify whether users can be properly authenticated. Two datasets (keystroke-1 and keystroke-2) are used to show the applicability of the proposed keystroke-dynamics user authentication system. The best results, with the lowest false rates and the highest accuracy, were obtained using UUT, compared with DT and FT alone, and were comparable to the combination of DT and FT. This is because UUT is a single feature that implicitly contains both DT and FT, which gives it greater capability to discriminate authentic users from impostors. In addition, authenticating with UUT alone instead of the combination of DT and FT reduces the complexity and computational time of the neural network.
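As an illustration of the three timing features above, the following sketch extracts DT, FT, and UUT from a sequence of key press/release timestamps. It is a minimal example under the common definitions of these features (datasets may measure flight time slightly differently); the event format and function name are hypothetical, not taken from the thesis.

```python
def extract_features(events):
    """events: list of (key, press_time, release_time) tuples, in typing order.

    Returns the three per-keystroke timing features: dwell time (DT),
    flight time (FT) and up-up time (UUT), in the units of the timestamps.
    """
    dwell, flight, up_up = [], [], []
    for i, (_, press, release) in enumerate(events):
        dwell.append(release - press)                 # DT_i  = release_i - press_i
        if i + 1 < len(events):
            _, next_press, next_release = events[i + 1]
            flight.append(next_press - release)       # FT_i  = press_{i+1} - release_i
            up_up.append(next_release - release)      # UUT_i = release_{i+1} - release_i
            # Note: UUT_i = FT_i + DT_{i+1}, so UUT implicitly contains DT and FT.
    return dwell, flight, up_up


# Hypothetical sample: typing "abc" with timestamps in milliseconds.
sample = [("a", 0, 95), ("b", 160, 250), ("c", 330, 410)]
print(extract_features(sample))
```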
In this paper, three techniques for image compression are implemented. The proposed techniques are a three-dimensional (3-D) two-level discrete wavelet transform (DWT), a 3-D two-level discrete multi-wavelet transform (DMWT), and a 3-D two-level hybrid (wavelet-multiwavelet) transform. Daubechies and Haar wavelets are used in the discrete wavelet transform, and critically sampled preprocessing is used in the discrete multi-wavelet transform. The aim is to increase the compression ratio (CR) as the level of the 3-D transformation increases, so the compression ratio is measured at each level. To obtain good compression, image data properties were measured, such as image entropy (He), percent r
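For readers who want to reproduce the level-wise compression measurement, the sketch below applies a two-level 3-D DWT with PyWavelets and estimates a compression ratio by simple coefficient thresholding. It is an illustrative approximation only: the paper's actual coding stage, entropy measure, and test volumes are not reproduced here.

```python
import numpy as np
import pywt

# Stand-in 3-D volume (e.g., a stack of grayscale frames); a real image cube
# would be loaded here instead of random data.
volume = np.random.rand(64, 64, 64).astype(np.float32)

# Two-level 3-D DWT with the Haar wavelet; Daubechies wavelets ('db2', 'db4', ...)
# are used the same way by changing the wavelet name.
coeffs = pywt.wavedecn(volume, wavelet='haar', level=2)

# Flatten the coefficient tree into one array so sparsity can be measured.
arr, slices = pywt.coeffs_to_array(coeffs)

# A simple proxy for the achievable compression ratio: discard coefficients
# below a threshold and count what remains.
threshold = 0.05 * np.abs(arr).max()
retained = np.count_nonzero(np.abs(arr) > threshold)
print("approximate CR at level 2:", volume.size / retained)
```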
Gypsum plaster is an important building material because of the availability of its raw materials. In this research, the effect of various additives on the properties of plaster was studied, including polyvinyl acetate, furfural, and fumed silica at different rates of addition, as well as two types of fibers, carbon fiber and polypropylene fiber, added to the plaster at different volumetric rates. Analysis of the results showed that adding furfural to the plaster at 2.5% is the optimum addition ratio, as it improved the flexural strength by 3.18%.
When using polyvinyl acetate, it was found that an additive ratio of 2% is the optimum ratio of addition to the plaster, because it improved the value of the flexural stre
Risk assessment in build-operate-transfer (BOT) projects is very important for identifying and analyzing risks so that appropriate response decisions can be made. In this paper, the Analytic Hierarchy Process (AHP) technique is used to decide how to respond to the most prominent risks generated in BOT projects. The method compares the criteria for each risk as well as the available alternatives, and uses matrix-based calculations to reach an appropriate response decision for each risk. Ten common risks in BOT contracts, grouped into six main risk headings, are adopted for analysis in this paper. The procedures followed in this paper are the questionnaire method
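The matrix-based AHP calculation mentioned above can be sketched as follows: for each pairwise comparison matrix, the priority weights are taken from the principal eigenvector and the consistency ratio is checked against Saaty's random index. The comparison values in the example are hypothetical placeholders, not the judgments collected by the paper's questionnaire.

```python
import numpy as np

# Saaty's Random Index values, indexed by matrix size 0..10.
RI = [0.0, 0.0, 0.0, 0.58, 0.90, 1.12, 1.24, 1.32, 1.41, 1.45, 1.49]

def ahp_priorities(pairwise):
    """Priority weights and consistency ratio of an AHP pairwise comparison matrix."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    weights = np.abs(eigvecs[:, k].real)
    weights /= weights.sum()                      # normalized priority vector
    lam_max = eigvals[k].real
    ci = (lam_max - n) / (n - 1)                  # consistency index
    cr = ci / RI[n] if RI[n] > 0 else 0.0         # consistency ratio
    return weights, cr

# Hypothetical 3x3 comparison of three response alternatives for one risk.
A = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
weights, cr = ahp_priorities(A)
print("priorities:", weights, "CR:", cr)          # CR < 0.10 is acceptably consistent
```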
An analytical target function has been considered to represent the axial magnetic field distribution of a symmetric double-polepiece magnetic lens. In this article, the synthesis procedure is based on the proposed target function. The effect of the two main optimization parameters on the lens field distribution, the polepiece shape, and the objective focal properties for lenses operated under zero-magnification mode has been studied. The results show that the objective properties evaluated by the inverse design procedure correspond excellently with those of the analysis approach, and the optical properties improve as the field distribution of the electron lens is distributed along a narrow axi
Object tracking is one of the most important topics in the fields of image processing and computer vision. Object tracking is the process of finding interesting moving objects and following them from frame to frame. In this research, an active-model-based object tracking algorithm is introduced. Active models are curves placed in an image domain that can evolve to segment the object of interest. The Adaptive Diffusion Flow Active Model (ADFAM) is one of the most well-known types of active models. It overcomes the drawbacks of previous active models, especially the leakage problem, noise sensitivity, and long narrow holes or concavities. The ADFAM is well known for its very good capabilities in the segmentation process. In this
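To show how an active model evolves a curve to segment an object, the sketch below uses scikit-image's morphological Chan-Vese active contour as a stand-in; the ADFAM formulation itself is not available in scikit-image, so this only illustrates the general idea of a level-set-style segmentation feeding a tracker. The test frame and initialization are synthetic.

```python
import numpy as np
from skimage.segmentation import morphological_chan_vese

# Synthetic frame with one bright object; in the tracking setting this would be
# the current video frame (or a region of interest around the last position).
frame = np.zeros((100, 100), dtype=float)
frame[30:70, 40:80] = 1.0
frame += 0.1 * np.random.rand(100, 100)           # mild noise

# Initial level set: a disk roughly covering the expected object location.
rr, cc = np.mgrid[:100, :100]
init = (rr - 50) ** 2 + (cc - 60) ** 2 < 35 ** 2

# Evolve the contour; the result is a binary mask of the segmented object,
# whose centroid can be followed from frame to frame.
mask = morphological_chan_vese(frame, 150, init_level_set=init, smoothing=2)
centroid = np.argwhere(mask).mean(axis=0)
print("object centroid:", centroid)
```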
Some problems need to be solved in image compression to make the process workable and more efficient. Much work has been done in the field of lossy image compression based on the wavelet transform and the Discrete Cosine Transform (DCT). In this paper, an efficient image compression scheme is proposed, based on a common encoding transform scheme. It consists of the following steps: 1) a bi-orthogonal (tap 9/7) wavelet transform to split the image data into sub-bands, 2) DCT to de-correlate the data, 3) scalar quantization of the combined transform stage's output, followed by mapping to positive values, and 4) LZW encoding to produce the compressed data. The peak signal-to-noise ratio (PSNR), compression ratio (CR), and compression gain (CG) measures were used t
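A rough software sketch of the four-stage pipeline (wavelet, DCT, quantization and mapping, entropy coding) is given below. PyWavelets' 'bior4.4' filter is used for the bi-orthogonal 9/7 wavelet, and zlib stands in for the LZW coder because Python's standard library has no LZW implementation; the quantization step, the image, and the data layout are illustrative assumptions, not the paper's settings.

```python
import numpy as np
import pywt
import zlib
from scipy.fft import dctn

img = np.random.rand(256, 256).astype(np.float32)     # stand-in grayscale image

# 1) bi-orthogonal 9/7 wavelet ('bior4.4' in PyWavelets) splits the image
#    into one approximation and three detail sub-bands.
cA, (cH, cV, cD) = pywt.dwt2(img, 'bior4.4')

# 2) DCT further de-correlates each sub-band.
bands = [dctn(b, norm='ortho') for b in (cA, cH, cV, cD)]

# 3) uniform scalar quantization, then shift so all symbols are non-negative.
step = 0.5
quantized = [np.round(b / step).astype(np.int32) for b in bands]
offset = min(int(q.min()) for q in quantized)
mapped = [(q - offset).astype(np.uint16) for q in quantized]

# 4) entropy coding of the symbol stream; zlib (DEFLATE) is used here only as
#    a stand-in for the LZW stage described in the paper.
payload = b"".join(m.tobytes() for m in mapped)
compressed = zlib.compress(payload)
print("compression ratio ~", img.nbytes / len(compressed))
```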
In this paper, a decoder for a binary BCH code is implemented using a PIC microcontroller for a code length of n = 127 bits with multiple-error-correction capability; results are presented for correcting up to 13 errors. The Berlekamp-Massey decoding algorithm was chosen for its efficiency. The PIC18F45K22 microcontroller was chosen for the implementation and programmed in assembly language to achieve the highest performance. This makes the BCH decoder implementable as a low-cost module that can be used as part of larger systems. The performance evaluation is presented in terms of the total number of instructions and the bit rate.
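To illustrate the structure of the chosen decoding algorithm, the sketch below is a Berlekamp-Massey implementation over GF(2) that finds the shortest LFSR generating a binary sequence. A full BCH decoder for n = 127 works on syndromes over GF(2^7) and adds Chien search and error correction, none of which is shown here; this is only a simplified illustration, not the paper's assembly implementation.

```python
def berlekamp_massey_gf2(s):
    """Shortest LFSR (length L and connection polynomial C) generating the
    binary sequence s. BCH decoding uses the same structure, but with the
    syndromes and arithmetic taken in the extension field GF(2^m)."""
    n = len(s)
    C = [1] + [0] * n          # current connection polynomial
    B = [1] + [0] * n          # copy saved at the last length change
    L, m = 0, 1
    for i in range(n):
        # discrepancy d = s[i] + C[1]*s[i-1] + ... + C[L]*s[i-L]  (mod 2)
        d = s[i]
        for j in range(1, L + 1):
            d ^= C[j] & s[i - j]
        if d == 0:
            m += 1
        elif 2 * L <= i:
            T = C[:]
            for j in range(n + 1 - m):
                C[j + m] ^= B[j]
            L, B, m = i + 1 - L, T, 1
        else:
            for j in range(n + 1 - m):
                C[j + m] ^= B[j]
            m += 1
    return L, C[:L + 1]


# Toy input: 1,1,0 repeated is generated by a length-2 LFSR with
# connection polynomial 1 + x + x^2, so this prints (2, [1, 1, 1]).
print(berlekamp_massey_gf2([1, 1, 0, 1, 1, 0, 1, 1]))
```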
Drilling fluid loss during drilling operations is an undesirable, expensive, and potentially hazardous problem.
The Nasiriyah oil field is one of the Iraqi oil fields that suffers from the lost circulation problem. The Dammam, Um-Radoma, Tayarat, Shiranish, and Hartha formations are known to be the layers where circulation losses occur. Different types of lost circulation materials (LCMs), ranging from granular to flake and fibrous, were previously used to treat this problem.
This study presents the application of rice as a lost circulation material used to mitigate and stop the loss problem when partial or total losses occur.
The experim