Iris research focuses on developing techniques for identifying and locating relevant biometric features, accurate segmentation, and efficient computation, while lending themselves to compression methods. Most iris segmentation methods are based on complex modelling of traits and characteristics, which in turn reduces the effectiveness of the system for real-time use. This paper introduces a novel parameterized technique for iris segmentation. The method proceeds in several steps: converting the grayscale eye image to a bit-plane representation, selecting the most significant bit planes, and then parameterizing the iris location, resulting in an accurate segmentation of the iris from the original image. A lossless Hexadata encoding method is then applied to the data, based on reducing each set of six data items to a single encoded value. The tested results achieved acceptable byte-saving performance for the 21 square iris images of size 256x256 pixels, about 22.4 KB on average with an average decompression time of 0.79 sec, and high byte-saving performance for the 2 non-square iris images of sizes 640x480 and 2048x1536, which reached 76 KB / 2.2 sec and 1630 KB / 4.71 sec respectively. Finally, the proposed techniques were compared with the standard lossless JPEG2000 compression technique, saving about 1.2 KB or more, which implicitly demonstrates the power and efficiency of the suggested lossless biometric techniques.
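A minimal sketch of the bit-plane and six-item grouping ideas described above, assuming an 8-bit grayscale eye image held in a NumPy array; the packing of six binary values into one byte is an illustrative reading of the Hexadata step, not the paper's exact codec:

```python
import numpy as np

def bit_planes(gray: np.ndarray) -> list:
    """Decompose an 8-bit grayscale image into its 8 binary bit planes (LSB first)."""
    return [(gray >> b) & 1 for b in range(8)]

def hexdata_encode(data: np.ndarray) -> np.ndarray:
    """Illustrative lossless grouping: pack each run of six binary values into one byte.

    This mirrors the idea of reducing six data items to a single encoded value;
    the paper's actual Hexadata codec may differ in detail.
    """
    flat = data.ravel()
    pad = (-len(flat)) % 6                       # pad so the length is a multiple of 6
    flat = np.concatenate([flat, np.zeros(pad, dtype=flat.dtype)])
    groups = flat.reshape(-1, 6)
    weights = 2 ** np.arange(6)                  # interpret each group of 6 bits as one value (0..63)
    return (groups * weights).sum(axis=1).astype(np.uint8)

# Example: keep only the two most significant planes before encoding
gray = np.random.randint(0, 256, (256, 256), dtype=np.uint8)   # stand-in for an eye image
planes = bit_planes(gray)
msb_planes = np.stack(planes[6:])                # bit planes 7 and 6 carry most of the structure
encoded = hexdata_encode(msb_planes)
print(encoded.shape)
```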
Polymer electrolytes were prepared using the solution casting technique. The electrolytes were based on a constant PVA/PVP ratio (50:50) with ethylene carbonate (EC) and propylene carbonate (PC) (1:1), different proportions of potassium iodide (KI) (10, 20, 30, 40, 50 wt%), and iodine (I2) at 10 wt% of the salt. Fourier Transform Infrared (FTIR) studies confirmed the complex formation of the polymer blends. Electrical conductivity was measured with an impedance analyzer in the frequency range 50 Hz–1 MHz and the temperature range 293–343 K. The highest electrical conductivity value, 5.3 × 10⁻³ S/cm, was observed for the electrolyte with 50 wt% KI concentration at room temperature.
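As a rough illustration of how conductivity is typically obtained from impedance-analyzer data, a sketch using the standard relation sigma = t / (R_b × A); the thickness, bulk resistance, and area values are hypothetical, not the study's measurements:

```python
def ionic_conductivity(thickness_cm: float, bulk_resistance_ohm: float, area_cm2: float) -> float:
    """Standard relation sigma = t / (R_b * A) used with impedance-analyzer data."""
    return thickness_cm / (bulk_resistance_ohm * area_cm2)

# Hypothetical film geometry and bulk resistance, for illustration only
sigma = ionic_conductivity(thickness_cm=0.02, bulk_resistance_ohm=3.8, area_cm2=1.0)
print(f"{sigma:.2e} S/cm")
```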
A novel robust finite-time disturbance observer (RFTDO) based on an independent output finite-time composite control (FTCC) scheme is proposed for air-conditioning system temperature and humidity regulation. The variable air volume (VAV) system is represented by two first-order mathematical models for the temperature and humidity dynamics. In the temperature loop, a temperature RFTDO (RFTDO-T) and a temperature FTCC (FTCC-T) are designed to estimate and reject the lumped disturbances of the temperature subsystem. In the humidity loop, a robust output humidity FTCC (FTCC-H) and humidity RFTDO (RFTDO-H) are likewise designed to estimate and reject the lumped disturbances of the humidity subsystem. Based on Lyapunov theory
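A minimal sketch of the temperature-loop idea under simplifying assumptions: a first-order plant with a basic linear disturbance observer and proportional control standing in for the RFTDO-T/FTCC-T pair; all gains and model parameters are illustrative, not the paper's design:

```python
# First-order temperature model: dT/dt = -a*T + b*u + d, where d is the lumped disturbance
a, b = 0.1, 0.05           # illustrative plant parameters (not the paper's identified model)
L, kp = 2.0, 4.0           # observer gain and proportional control gain, chosen for the sketch
dt, T_ref = 0.01, 22.0     # integration step [s] and temperature set point [deg C]

T, z = 30.0, 0.0
for k in range(int(60 / dt)):
    d = 1.0 if k * dt > 20 else 0.0              # step disturbance entering at t = 20 s
    d_hat = z + L * T                            # observer output: estimated lumped disturbance
    u = (a * T + kp * (T_ref - T) - d_hat) / b   # control cancels the plant term and the estimate
    z += dt * (-L * (-a * T + b * u + d_hat))    # observer state update (Euler integration)
    T += dt * (-a * T + b * u + d)               # plant update (Euler integration)
print(f"T = {T:.2f} deg C, estimated disturbance = {d_hat:.2f}")
```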
The aim of this study is to develop and adjust the Fama-MacBeth model using a single-index model. The adjustment is estimated with a penalized smoothing spline regression technique (SIMPLS). Two generalized cross-validation techniques, Generalized Cross Validation Grid (GGCV) and Generalized Cross Validation Fast (FGCV), were used to estimate the smoothing parameter of this technique. Due to the two-step nature of the Fama-MacBeth model, this estimation generated four estimators: SIMPLS(FGCV)-SIMPLS(FGCV), SIMPLS(FGCV)-SIMPLS(GGCV), SIMPLS(GGCV)-SIMPLS(FGCV), and SIMPLS(GGCV)-SIMPLS(GGCV). The three-factor Fama-French model (market risk premium, size factor, and value factor) and its implications for excess stock returns and portfolio return
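A minimal sketch of the classical two-step Fama-MacBeth procedure that this adjustment builds on, with ordinary least squares standing in for the penalized SIMPLS estimators and synthetic data in place of real returns:

```python
import numpy as np

rng = np.random.default_rng(0)
T_periods, N_assets = 120, 25
factors = rng.normal(size=(T_periods, 3))                 # synthetic market, size, and value factors
true_betas = rng.normal(size=(N_assets, 3))
returns = factors @ true_betas.T + 0.1 * rng.normal(size=(T_periods, N_assets))

# Step 1: time-series regression per asset to estimate factor loadings (betas)
X = np.column_stack([np.ones(T_periods), factors])
betas = np.linalg.lstsq(X, returns, rcond=None)[0][1:].T  # shape (N_assets, 3)

# Step 2: cross-sectional regression per period of returns on the estimated betas
Z = np.column_stack([np.ones(N_assets), betas])
lambdas = np.array([np.linalg.lstsq(Z, returns[t], rcond=None)[0] for t in range(T_periods)])

risk_premia = lambdas.mean(axis=0)                        # time-averaged factor risk premia
se = lambdas.std(axis=0, ddof=1) / np.sqrt(T_periods)     # Fama-MacBeth standard errors
print("risk premia:", risk_premia[1:], "\nstd errors:", se[1:])
```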
The present study discusses problem-based learning in the Iraqi classroom. This method aims to involve all learners in collaborative activities and is learner-centered. To fulfill the aims, the following hypothesis is verified: "It is hypothesized that there are no statistically significant differences between the achievements of the experimental group and the control group." Thirty learners are selected as the sample of the present study. The Mann-Whitney test for two independent samples is used to analyze the results. The analysis shows that the members of the experimental group, who are taught according to problem-based learning, obtain higher scores than the members of the control group, who are taught according to the traditional method. This
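A minimal sketch of the Mann-Whitney test for two independent samples used in the analysis; the score values are hypothetical, not the study's data:

```python
from scipy.stats import mannwhitneyu

# Hypothetical achievement scores for two independent groups of 15 learners each
experimental = [78, 85, 90, 72, 88, 81, 95, 79, 84, 91, 76, 87, 83, 89, 80]
control      = [65, 70, 62, 75, 68, 60, 73, 66, 71, 58, 69, 64, 74, 61, 67]

u_stat, p_value = mannwhitneyu(experimental, control, alternative="two-sided")
print(f"U = {u_stat}, p = {p_value:.4f}")   # p < 0.05 would reject the null hypothesis
```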
The Digital Elevation Model (DEM) is one of the techniques developed for relief representation. DEM construction is defined as the technique of modeling the earth's surface from existing data. DEMs are among the fundamental information requirements generally utilized in GIS data structures. The main aim of this research is to present a methodology for assessing DEM generation methods. The DEM data will be extracted from open-source data, e.g. Google Earth. The tested data will be compared with data produced by formal institutions such as the General Directorate of Surveying. The study area has been chosen in the south of Iraq (Al-Gharraf, Dhi Qar governorate). The methods of DEM creation are kriging, IDW (inverse distance weighting
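A minimal sketch of IDW (inverse distance weighting), one of the DEM interpolation methods named above; the sample elevations, query points, and power parameter are illustrative:

```python
import numpy as np

def idw(known_xy: np.ndarray, known_z: np.ndarray, query_xy: np.ndarray, power: float = 2.0) -> np.ndarray:
    """Inverse distance weighting: each unknown elevation is a distance-weighted mean of known points."""
    d = np.linalg.norm(query_xy[:, None, :] - known_xy[None, :, :], axis=2)
    d = np.maximum(d, 1e-12)                 # avoid division by zero at sample locations
    w = 1.0 / d ** power
    return (w @ known_z) / w.sum(axis=1)

# Illustrative elevation samples (x, y) with heights z, and two query points
pts = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
elev = np.array([5.0, 7.0, 6.0, 9.0])
queries = np.array([[5.0, 5.0], [1.0, 1.0]])
print(idw(pts, elev, queries))
```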
In the modern era, which requires the use of networks in the transmission of data across distances, the transport or storage of such data is required to be safe. Protection methods are developed to ensure data security, and new schemes are proposed that merge cryptographic principles with other systems to enhance information security. Chaotic maps are one of the interesting systems merged with cryptography for better encryption performance. Biometrics is considered an effective element in many access-security systems. In this paper, two systems, fingerprint biometrics and the chaotic logistic map, are combined in the encryption of a text message to produce a strong cipher that can withstand many types of attacks. The histogram analysis of
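A minimal sketch of the logistic-map half of such a scheme, a chaotic keystream XORed with the message; the initial value, which in the combined system could be derived from fingerprint features, is hypothetical here:

```python
def logistic_keystream(x0: float, r: float, n: int) -> bytes:
    """Generate n pseudo-random bytes by iterating the chaotic logistic map x -> r*x*(1-x)."""
    x, stream = x0, bytearray()
    for _ in range(n):
        x = r * x * (1 - x)
        stream.append(int(x * 256) % 256)
    return bytes(stream)

def xor_cipher(message: bytes, x0: float, r: float = 3.99) -> bytes:
    """Symmetric XOR cipher keyed by the logistic map; applying it twice restores the plaintext."""
    ks = logistic_keystream(x0, r, len(message))
    return bytes(m ^ k for m, k in zip(message, ks))

# x0 would be derived from fingerprint features in the combined system (hypothetical value here)
cipher = xor_cipher(b"secret message", x0=0.6137)
plain = xor_cipher(cipher, x0=0.6137)
print(cipher.hex(), plain)
```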