Iris research focuses on developing techniques for identifying and locating relevant biometric features, accurate segmentation, and efficient computation, while lending themselves to compression methods. Most iris segmentation methods are based on complex modelling of traits and characteristics which, in turn, reduces the effectiveness of the system for real-time use. This paper introduces a novel parameterized technique for iris segmentation. The method comprises a number of steps, starting from converting the grayscale eye image to a bit-plane representation and selecting the most significant bit planes, followed by a parameterization of the iris location, resulting in an accurate segmentation of the iris from the original image. A lossless Hexadata encoding method is then applied to the data, based on reducing each set of six data items to a single encoded value. The tested results achieved acceptable byte-saving performance for the 21 square iris images of size 256x256 pixels, about 22.4 KB on average with 0.79 s average decompression time, and high byte-saving performance for 2 non-square iris images of sizes 640x480 and 2048x1536 that reached 76 KB / 2.2 s and 1630 KB / 4.71 s, respectively. Finally, the proposed techniques outperform the standard lossless JPEG2000 compression techniques, with about 1.2 KB or more of additional saving, implicitly demonstrating the power and efficiency of the suggested lossless biometric techniques.
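As a rough illustration of the bit-plane step only (the abstract does not specify the plane-selection rule or the Hexadata grouping of six values, so the function names, the placeholder image, and the choice of three retained planes below are assumptions), a minimal Python/NumPy sketch might look like this:

```python
# Illustrative sketch, not the paper's exact pipeline: split a grayscale eye image
# into bit planes and keep only the most significant ones as a starting point for
# the parameterized iris localisation described above.
import numpy as np

def bit_planes(gray: np.ndarray) -> np.ndarray:
    """Return an (8, H, W) stack of bit planes; plane 7 is the most significant."""
    planes = [(gray >> b) & 1 for b in range(8)]
    return np.stack(planes, axis=0)

def keep_most_significant(gray: np.ndarray, n_planes: int = 3) -> np.ndarray:
    """Reconstruct the image from its n most significant bit planes only."""
    planes = bit_planes(gray)
    kept = np.zeros_like(gray, dtype=np.uint16)
    for b in range(8 - n_planes, 8):
        kept += planes[b].astype(np.uint16) << b
    return kept.astype(np.uint8)

if __name__ == "__main__":
    eye = (np.random.rand(256, 256) * 255).astype(np.uint8)  # placeholder image
    simplified = keep_most_significant(eye, n_planes=3)
    print(simplified.shape, simplified.dtype)
```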
Realizing the full potential of wireless sensor networks (WSNs) highlights many design issues, particularly the trade-offs between multiple conflicting objectives such as maximizing route overlap for efficient data aggregation and minimizing the total link cost. While data aggregation routing protocols and link cost functions in WSNs have been comprehensively considered in the literature, a trade-off improvement between these two has not yet been addressed. In this paper, a comprehensive weight for the trade-off between different objectives is employed in the so-called weighted data aggregation routing strategy (WDARS), which aims to maximize route overlap for efficient data aggregation while minimizing the total link cost.
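The abstract does not give the actual WDARS weight formulation; as a hedged sketch of the general idea, a composite routing score might combine an overlap reward and a link-cost penalty through a single tunable weight (all names and values below are hypothetical):

```python
# Hypothetical composite routing weight: trade off route overlap (to be maximized)
# against link cost (to be minimized) with a parameter alpha in [0, 1].
def wdars_score(overlap: float, link_cost: float, alpha: float = 0.5) -> float:
    """Higher is better; overlap and link_cost are assumed normalized to [0, 1]."""
    return alpha * overlap - (1.0 - alpha) * link_cost

# Example: choose the best next hop among candidate (overlap, cost) pairs.
candidates = {"nodeA": (0.8, 0.6), "nodeB": (0.5, 0.2)}
best = max(candidates, key=lambda n: wdars_score(*candidates[n]))
print(best)
```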
Mixed-effects conditional logistic regression is evidently more effective in the study of qualitative differences in longitudinal pollution data as well as their implications for heterogeneous subgroups. This study shows that conditional logistic regression is a robust evaluation method for environmental studies, through the analysis of environmental pollution as a function of oil production and environmental factors. Consequently, it has been established theoretically that the primary objective of model selection in this research is to identify the candidate model that is optimal for the conditional design. The candidate model should achieve generalizability, goodness-of-fit, and parsimony, and establish equilibrium between bias and variability.
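For reference, the conditional design referred to above rests on the standard within-stratum conditional likelihood of logistic regression; the formula below is the textbook form, not the study's specific mixed-effects model (which would add stratum- or subgroup-level random effects):

```latex
% Conditional likelihood for stratum $s$ with one case ($y_{s1}=1$) among $n_s$
% matched observations; $x_{sj}$ are covariates and $\beta$ the fixed effects.
\[
  L_s(\beta) \;=\;
  \frac{\exp\!\left(x_{s1}^{\top}\beta\right)}
       {\sum_{j=1}^{n_s}\exp\!\left(x_{sj}^{\top}\beta\right)},
  \qquad
  L(\beta) \;=\; \prod_{s=1}^{S} L_s(\beta).
\]
```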
The data presented in this paper are related to the research article entitled "Novel dichloro(bis{2-[1-(4-methylphenyl)-1H-1,2,3-triazol-4-yl-κN3]pyridine-κN})metal(II) coordination compounds of seven transition metals (Mn, Fe, Co, Ni, Cu, Zn and Cd)" (Conradie et al., 2018) [1]. This paper presents characterization and structural data for the 2-(1-(4-methyl-phenyl)-1H-1,2,3-triazol-1-yl)pyridine ligand (L2) (Tawfiq et al., 2014) [2] as well as for seven dichloro(bis{2-[1-(4-methylphenyl)-1H-1,2,3-triazol-4-yl-κN3]pyridine-κN})metal(II) coordination compounds, [M(L2)2Cl2], all containing the same ligand but coordinated to different metal ions. The data illustrate the shift in IR, UV/VIS, and NMR (for diamagnetic complexes) peaks when the ligand is coordinated to the different metal ions.
Day-to-day management of water treatment plants carries significant operational risk, so water companies are looking for solutions that predict how treatment processes may be improved, given the increased pressure to remain competitive. This study focused on the mathematical modeling of water treatment processes, with the primary motivation of providing tools that can be used to predict treatment performance and enable better control of uncertainty and risk. The research included selecting the most important variables affecting quality standards using a correlation test. According to this test, the important raw-water parameters include Total Hardness.
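A hedged sketch of the correlation-based screening step is given below; the column names and values are hypothetical, not the study's actual dataset, and Pearson correlation is assumed as the correlation test:

```python
# Rank raw-water parameters by the strength of their correlation with a
# treated-water quality indicator (illustrative data only).
import pandas as pd

def rank_by_correlation(df: pd.DataFrame, target: str) -> pd.Series:
    """Return absolute Pearson correlations of every column with the target."""
    corr = df.corr(numeric_only=True)[target].drop(target)
    return corr.abs().sort_values(ascending=False)

data = pd.DataFrame({
    "total_hardness":  [210, 230, 190, 250, 220],
    "turbidity":       [4.1, 5.0, 3.2, 6.3, 4.8],
    "ph":              [7.4, 7.2, 7.6, 7.1, 7.3],
    "treated_quality": [0.82, 0.75, 0.90, 0.68, 0.78],
})
print(rank_by_correlation(data, "treated_quality"))
```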
Compressing an image and reconstructing it without degrading its original quality is one of the challenges that still exists nowadays. A coding system that considers both quality and compression rate is implemented in this work. The implemented system applies a high synthetic entropy coding scheme to store the compressed image at the smallest possible size without affecting its original quality. This coding scheme is applied with two transform-based techniques, one using the Discrete Cosine Transform and the other the Discrete Wavelet Transform. The implemented system was tested with different standard color images, and the results obtained with different evaluation metrics are shown. A comparison was made with some previous related work.
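A minimal sketch of the DCT-based transform path is shown below; the paper's entropy-coding stage is not reproduced, and the block size and quantization step are assumptions chosen only for illustration:

```python
# Blockwise 2-D DCT, uniform quantization, and inverse DCT (the lossy transform
# stage that would feed an entropy coder in a full system).
import numpy as np
from scipy.fft import dctn, idctn

def dct_roundtrip(block: np.ndarray, q_step: float = 8.0) -> np.ndarray:
    """2-D DCT of an 8x8 block, uniform quantization, then inverse DCT."""
    coeffs = dctn(block.astype(float), norm="ortho")
    quantized = np.round(coeffs / q_step) * q_step  # an entropy coder would store these
    return idctn(quantized, norm="ortho")

if __name__ == "__main__":
    block = (np.random.rand(8, 8) * 255).astype(np.uint8)
    recon = dct_roundtrip(block)
    print("max abs error:", np.abs(block.astype(float) - recon).max())
```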
Fractal image compression offers some desirable properties, such as fast image decoding and very good rate-distortion curves, but suffers from a long encoding time. Fractal image compression requires a partitioning of the image into ranges. In this work, we introduce a good partitioning process by means of a merge approach, since some ranges are connected to others. This paper presents a method to reduce the encoding time of this technique by reducing the number of range blocks based on computing statistical measures between them. Experimental results on standard images show that the proposed method decreases the encoding time while the quality of the results remains visually acceptable.
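A hedged sketch of the merge idea follows; the exact statistical measures and merge criterion used in the paper are not given here, so the mean/variance test, the tolerances, and the greedy grouping are illustrative assumptions:

```python
# Merge neighbouring range blocks whose simple statistics are close, reducing the
# number of blocks the fractal encoder must search for.
import numpy as np

def similar(block_a: np.ndarray, block_b: np.ndarray,
            mean_tol: float = 4.0, var_tol: float = 25.0) -> bool:
    """Decide whether two range blocks are statistically similar enough to merge."""
    return (abs(block_a.mean() - block_b.mean()) < mean_tol and
            abs(block_a.var() - block_b.var()) < var_tol)

def merge_ranges(blocks: list[np.ndarray]) -> list[list[int]]:
    """Greedy left-to-right grouping of adjacent similar blocks (illustrative only)."""
    groups, current = [], [0]
    for i in range(1, len(blocks)):
        if similar(blocks[current[-1]], blocks[i]):
            current.append(i)
        else:
            groups.append(current)
            current = [i]
    groups.append(current)
    return groups
```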
The presence of different noise sources and the continuous increase in crosstalk in deep-submicrometer technology have raised concerns about on-chip communication reliability, leading to the incorporation of crosstalk avoidance techniques into error control coding schemes. This brief proposes a joint crosstalk avoidance and adaptive error control scheme that reduces power consumption by providing appropriate communication resiliency based on the runtime noise level. By switching between shielding and duplication as the crosstalk avoidance technique, and between hybrid automatic repeat request and forward error correction as the error control policies, three modes of error resiliency are provided. The results show that, in reduced mode, the scheme achie
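A hypothetical sketch of the adaptive switching idea is given below; the mode names, the pairing of techniques within each mode, and the thresholds are assumptions, not the brief's actual controller:

```python
# Pick a crosstalk-avoidance technique and an error-control policy from the
# observed noise level, so stronger (more power-hungry) protection is used only
# when needed.
from enum import Enum

class Mode(Enum):
    REDUCED = ("shielding", "HARQ")   # low noise: cheapest protection
    NORMAL = ("duplication", "HARQ")  # moderate noise
    ROBUST = ("duplication", "FEC")   # high noise: strongest protection

def select_mode(noise_level: float) -> Mode:
    """Map a normalized runtime noise estimate in [0, 1] to a resiliency mode."""
    if noise_level < 0.3:
        return Mode.REDUCED
    if noise_level < 0.7:
        return Mode.NORMAL
    return Mode.ROBUST

print(select_mode(0.2), select_mode(0.85))
```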
One of the most significant elements influencing weather, climate, and the environment is vegetation cover. The Normalized Difference Vegetation Index (NDVI) and the Normalized Difference Built-up Index (NDBI) over the years 2019–2022 were estimated from four Landsat 8 TIRS images covering Duhok City. Using the radiative transfer model, the city's land surface temperature (LST) over these four years was calculated. The aim of this study is to compute the land surface temperature (LST) for the years 2019–2022, to understand the link between LST, NDVI, and NDBI, and to assess the capability of mapping them with Landsat 8 TIRS. The findings revealed that the NDBI and the NDVI had the strongest correlation with the LST.
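The two indices follow the standard Landsat 8 formulas (OLI band numbering: B4 = red, B5 = NIR, B6 = SWIR1); the radiative-transfer LST step of the study is not reproduced here, and the small epsilon guard is an implementation assumption:

```python
# NDVI and NDBI from Landsat 8 reflectance arrays of equal shape.
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """NDVI = (NIR - Red) / (NIR + Red)."""
    return (nir - red) / np.clip(nir + red, 1e-6, None)

def ndbi(swir1: np.ndarray, nir: np.ndarray) -> np.ndarray:
    """NDBI = (SWIR1 - NIR) / (SWIR1 + NIR)."""
    return (swir1 - nir) / np.clip(swir1 + nir, 1e-6, None)
```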
In many oil-recovery systems, relative permeabilities (kr) are essential flow factors that affect fluid displacement and output from petroleum resources. Traditionally, obtaining these crucial reservoir properties requires taking rock samples from the reservoir and performing suitable laboratory studies. Although kr is a function of fluid saturation, it is now well established that pore shape and distribution, absolute permeability, wettability, interfacial tension (IFT), and saturation history all influence kr values. These rock/fluid characteristics vary greatly from one reservoir region to the next, and it would be impossible to make kr measurements in all of them. The unsteady-state approach was used to calculate the relative permeabilities.
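For orientation, the sketch below uses a common Corey-type correlation to express kr as a function of water saturation; this is a textbook form, not the unsteady-state calculation used in the study, and the endpoint values and exponents are assumptions chosen only for the example:

```python
# Corey-type water and oil relative permeability curves (illustrative parameters).
def corey_krw(sw, swc=0.2, sor=0.25, krw_max=0.4, nw=3.0):
    """Water relative permeability as a function of water saturation sw."""
    swn = max(0.0, min(1.0, (sw - swc) / (1.0 - swc - sor)))  # normalized saturation
    return krw_max * swn ** nw

def corey_kro(sw, swc=0.2, sor=0.25, kro_max=0.9, no=2.0):
    """Oil relative permeability as a function of water saturation sw."""
    son = max(0.0, min(1.0, (1.0 - sw - sor) / (1.0 - swc - sor)))
    return kro_max * son ** no

for sw in (0.3, 0.5, 0.7):
    print(sw, round(corey_krw(sw), 4), round(corey_kro(sw), 4))
```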