Iris research focuses on developing techniques for identifying and locating relevant biometric features, accurate segmentation, and efficient computation, while lending themselves to compression methods. Most iris segmentation methods are based on complex modelling of traits and characteristics, which in turn reduces the effectiveness of the system as a real-time system. This paper introduces a novel parameterized technique for iris segmentation. The method proceeds in several steps: converting the grayscale eye image to a bit-plane representation, selecting the most significant bit planes, and then parameterizing the iris location, resulting in an accurate segmentation of the iris from the original image. A lossless Hexadata encoding method is then applied to the data, reducing each set of six data items to a single encoded value. In testing, the method achieved acceptable byte-saving performance on the 21 square iris images of 256x256 pixels, about 22.4 KB saved on average with an average decompression time of 0.79 s, and high byte-saving performance on two non-square iris images of 640x480 and 2048x1536 pixels, reaching 76 KB / 2.2 s and 1630 KB / 4.71 s respectively. Finally, the proposed techniques were compared with the standard lossless JPEG2000 compression technique, saving about 1.2 times or more as many KB, which implicitly demonstrates the power and efficiency of the suggested lossless biometric techniques.
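A minimal sketch of the bit-plane step and one hedged reading of the Hexadata packing (six data items to one value), assuming NumPy; the paper's actual parameterization and encoder details are not given in the abstract, so the function names and the choice of three planes below are illustrative.

```python
import numpy as np

def bit_planes(gray):
    """Decompose an 8-bit grayscale image into its 8 bit planes.

    gray: 2-D uint8 array. Returns shape (8, H, W), where plane k
    holds bit k of every pixel (k = 7 is the most significant).
    """
    return np.stack([(gray >> k) & 1 for k in range(8)], axis=0)

def most_significant_planes(planes, n=3):
    """Keep only the n most significant bit planes (n=3 is an illustrative choice)."""
    return planes[-n:]

def hexa_pack(bits):
    """Hedged reading of 'Hexadata': pack each run of six binary values
    into a single integer in [0, 63], padding the tail with zeros."""
    bits = np.asarray(bits).ravel()
    pad = (-len(bits)) % 6
    bits = np.concatenate([bits, np.zeros(pad, dtype=bits.dtype)])
    weights = 1 << np.arange(5, -1, -1)  # [32, 16, 8, 4, 2, 1]
    return bits.reshape(-1, 6) @ weights

# Example on a synthetic 256x256 eye image.
img = np.random.randint(0, 256, (256, 256), dtype=np.uint8)
msb = most_significant_planes(bit_planes(img), n=3)
encoded = hexa_pack(msb)
print(msb.shape, encoded.shape)  # (3, 256, 256) (32768,)
```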
The oil and gas industry relies heavily on IT innovations to manage business processes, but the exponential growth of data has led to concerns about processing big data, generating valuable insights, and making timely decisions. Many companies have adopted Big Data Analytics (BDA) solutions to address these challenges. However, deciding on the adoption of BDA solutions requires a thorough understanding of the contextual factors influencing these decisions. This research explores these factors using a new Technology-Organisation-Environment (TOE) framework, presenting technological, organisational, and environmental factors. The study used a Delphi research method and seven heterogeneous panelists from an Omani oil and gas company …
Information about soil consolidation is essential in geotechnical design. Because of the time and expense involved in performing consolidation tests, equations are needed to estimate the compression index from soil index properties. Although many empirical equations relating these soil properties have been proposed, such equations may not be appropriate for local conditions. The aim of this study is to investigate the consolidation and physical properties of cohesive soil. An Artificial Neural Network (ANN) has been adopted in this investigation to predict the compression index and compression ratio from basic index properties. One hundred and ninety-five consolidation results for soils tested at different construction sites …
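A minimal sketch of the kind of ANN regression the abstract describes, assuming scikit-learn; the input index properties named below and the synthetic data (a Skempton-like trend used purely as a stand-in) are assumptions, not the study's actual dataset or feature set.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

# X: assumed basic index properties (liquid limit, plasticity index,
# initial void ratio, natural water content); y: compression index Cc.
rng = np.random.default_rng(0)
X = rng.uniform([20, 5, 0.5, 10], [80, 40, 1.5, 50], size=(195, 4))
# Stand-in target following a Skempton-like trend, Cc ~ 0.009(LL - 10), plus noise.
y = 0.009 * (X[:, 0] - 10) + 0.05 * rng.standard_normal(195)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0),
)
model.fit(X_tr, y_tr)
print("R^2 on held-out samples:", round(model.score(X_te, y_te), 3))
```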
... Show MoreA fast moving infrared excess source (G2) which is widely interpreted as a core-less gas and dust cloud approaches Sagittarius A* (Sgr A*) on a presumably elliptical orbit. VLT
In the last two decades, the arid and semi-arid regions of China have undergone rapid Land Use/Cover Change (LUCC) due to the increasing demand for food from a growing population. In this study, we established a land use/cover classification together with its remote sensing characteristics by analysing the dynamics of LUCC in the Zhengzhou area over the period 1988-2006. A laminar extraction technique was applied to identify typical attributes of land use/cover types. A prominent result of the study indicates gradual urbanization and a corresponding gradual reduction in cropland area, driven by Zhengzhou's growing economy. The results also reflect degradation …
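A minimal sketch of supervised land-cover classification and per-pixel change detection between two dates, assuming scikit-learn; the band values, class names, and the RandomForest choice are illustrative stand-ins, not the study's laminar extraction procedure.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Toy multispectral samples: rows are pixels, columns are band reflectances
# (assumed Landsat-like bands); labels are land-use/cover classes.
rng = np.random.default_rng(1)
X_1988 = rng.random((600, 6))
classes = np.array(["urban", "cropland", "water", "bare"])
y = classes[rng.integers(0, 4, 600)]

clf = RandomForestClassifier(n_estimators=200, random_state=1).fit(X_1988, y)

# Change detection: classify each epoch, then compare the two class maps.
X_2006 = rng.random((600, 6))
changed = clf.predict(X_1988) != clf.predict(X_2006)
print("fraction of pixels whose class changed:", changed.mean())
```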
This paper aims, at the analytical level, to identify the security topics used in data journalism and the modes of expression used in the statements of the Security Media Cell, as well as the means of clarification used in data journalism about the Security Media Cell, and the methods the public prefers for the presentation of press releases, in particular determining the strength of respondents' attitudes towards the data issued by the Security Media Cell. The analytical study was applied to the Security Media Cell, while the field study included the distribution of a questionnaire to the public of Baghdad Governorate. The study reached several results, the most important of which is the Security Media Cell's interest in presenting its data in different …
The partial level density (PLD) of pre-equilibrium reactions described by Ericson's formula has been studied using different formulae for the single-particle level density g. The parameter g was taken from the equidistant spacing model (ESM) and the non-equidistant spacing model (non-ESM), and further formulae for g were derived from the relation between g and the level density parameter a. The formulae used to derive g are the Roher formula, the Egidy formula, the Yukawa formula, and the Thomas-Fermi formula. The partial level density results that depend on g from the Thomas-Fermi formula show good agreement with the experimental data.
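For reference, Ericson's partial level density formula and the standard relations involving the single-particle level density g, as commonly written in pre-equilibrium theory (the inline symbols are missing from the excerpt above; the notation here follows the standard literature):

```latex
% Ericson's formula for the density of p-particle, h-hole states
% at excitation energy E, with n = p + h excitons:
\rho(p,h,E) = \frac{g\,(gE)^{n-1}}{p!\,h!\,(n-1)!}, \qquad n = p + h.

% Equidistant spacing model (ESM) estimate of g, and the standard
% relation between g and the level density parameter a:
g \approx \frac{A}{13}\ \mathrm{MeV}^{-1}, \qquad g = \frac{6a}{\pi^{2}}.
```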
The estimation of the reliability function depends on the accuracy of the data used to estimate the parameters of the probability distribution. Because some data are skewed, estimating the parameters and calculating the reliability function in the presence of skewness requires a distribution that is flexible enough to handle such data. This is the case for the data of Diyala Company for Electrical Industries: positive skewness was observed in the data collected from the Power and Machinery Department, which calls for a distribution that accommodates those data and for estimation methods that handle this problem and lead to accurate estimates of the reliability function, …
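A minimal sketch of the workflow the abstract describes, assuming SciPy and a Weibull model as a stand-in for the unnamed skewed distribution; the reliability function is computed as R(t) = 1 - F(t) from the fitted CDF F.

```python
import numpy as np
from scipy import stats

# Stand-in positively skewed failure-time data (the paper's distribution
# is not named in the excerpt; Weibull is assumed here for illustration).
rng = np.random.default_rng(2)
times = rng.weibull(1.5, size=200) * 1000.0  # operating hours

# Maximum-likelihood fit with the location parameter fixed at zero.
shape, loc, scale = stats.weibull_min.fit(times, floc=0)

# Reliability (survival) function R(t) = 1 - F(t).
t = 500.0
R_t = stats.weibull_min.sf(t, shape, loc=loc, scale=scale)
print(f"estimated R({t} h) = {R_t:.3f}")
```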
The investigation of machine learning techniques for addressing missing well-log data has garnered considerable interest recently, especially as the oil and gas sector pursues novel approaches to improve data interpretation and reservoir characterization. Moreover, for wells that have been in operation for several years, conventional measurement techniques frequently encounter challenges related to availability, including missing well-log data, cost considerations, and precision issues. This study's objective is to enhance reservoir characterization by automating well-log creation using machine-learning techniques. Among the methods are multi-resolution graph-based clustering and the similarity threshold method. By using cutti …
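A minimal sketch of one common way to automate well-log creation, assuming scikit-learn: train a regressor that predicts a missing curve (e.g., sonic DT) from available curves (e.g., GR, RHOB, NPHI). The study's named methods (multi-resolution graph-based clustering and the similarity threshold method) are specialized; this generic regression stand-in only illustrates the imputation idea.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

# Toy well-log table: gamma ray (GR), bulk density (RHOB), neutron
# porosity (NPHI) as inputs; sonic (DT) as the curve to reconstruct.
rng = np.random.default_rng(3)
GR, RHOB, NPHI = rng.random((3, 2000))
DT = 200 - 60 * RHOB + 40 * NPHI + 5 * rng.standard_normal(2000)  # synthetic relation

X = np.column_stack([GR, RHOB, NPHI])
X_tr, X_te, y_tr, y_te = train_test_split(X, DT, random_state=0)

model = GradientBoostingRegressor(random_state=0).fit(X_tr, y_tr)
print("R^2 on held-out depths:", round(model.score(X_te, y_te), 3))

# Apply to an interval where DT was never recorded:
DT_predicted = model.predict(X_te[:5])
```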