Crude nattokinase produced by Bacillus subtilis was used to accelerate the ripening of cheddar cheese. The enzyme was added at three concentrations (80, 160, and 320 mg/kg) alongside an enzyme-free control, and the product was monitored for three months for moisture, protein, fat, non-protein nitrogen, soluble nitrogen, and pH; sensory evaluation was also conducted. The soluble nitrogen percentages during the second month of ripening for treatments T2, T3, and T4 were 11.2, 15.54, and 18.48%, respectively, compared with 7.6% for the control, while in the third month they were 17.37, 20.67, and 22.26%, respectively, compared with only 10% for the control. Non-protein nitrogen (NPN) likewise increased to 2.39, 3.35, and 5.37%, respectively, compared with 1.48% for the control. An acceptable cheddar cheese can therefore be obtained after 2-3 months, which means a shorter ripening period.
Digital image manipulation has become increasingly prevalent due to the widespread availability of sophisticated image-editing tools. In copy-move forgery, a portion of an image is copied and pasted into another area of the same image. The proposed methodology begins by extracting Local Binary Pattern (LBP) features from the image. Two main statistical functions, Standard Deviation (STD) and Angular Second Moment (ASM), are computed for each LBP feature, capturing additional statistical information about the local textures. Next, multi-level LBP feature selection is applied to select the most relevant features. This process involves performing the LBP computation at multiple scales or levels, capturing textures at different …
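The LBP-plus-statistics step described above can be sketched as follows. This is a minimal illustration, assuming a basic 8-neighbour, 3×3 LBP and histogram-based STD/ASM; the paper's exact multi-level variant, neighbourhood, and block scheme are not given in the abstract.

```python
import numpy as np

def lbp_3x3(img):
    """Basic 8-neighbour Local Binary Pattern over a grayscale image.

    A sketch only: the multi-level LBP used in the paper is an assumption
    built on this elementary single-scale operator.
    """
    h, w = img.shape
    out = np.zeros((h - 2, w - 2), dtype=np.uint8)
    # Offsets of the 8 neighbours, clockwise from the top-left pixel.
    offs = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    center = img[1:-1, 1:-1]
    for bit, (dy, dx) in enumerate(offs):
        nb = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        # Set the bit where the neighbour is >= the centre pixel.
        out |= (nb >= center).astype(np.uint8) << bit
    return out

def std_asm(lbp):
    """Standard Deviation of the LBP codes and Angular Second Moment
    (energy/uniformity) of their normalised histogram."""
    hist = np.bincount(lbp.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    asm = float(np.sum(p ** 2))   # 1.0 for a perfectly uniform texture
    return float(lbp.std()), asm
```

A uniform image yields a single LBP code everywhere, so STD is 0 and ASM is 1; textured regions spread the histogram and lower the ASM, which is what makes the pair useful for comparing candidate copy-move blocks.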
In computer-based applications there is a need for simple, low-cost devices for user authentication. Biometric authentication methods, namely keystroke dynamics, are increasingly used to strengthen the common knowledge-based method (e.g., a password) effectively and cheaply for many types of applications. Due to the semi-independent nature of typing behavior, it is difficult to masquerade, making it useful as a biometric. In this paper, the C4.5 approach is used to classify a user as an authenticated user or an impostor by combining unigraph features (namely Dwell Time (DT) and Flight Time (FT)) and digraph features (namely Up-Up Time (UUT) and Down-Down Time (DDT)). The results show that DT enhances the performance of the digraph features by …
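The four timing features named in the abstract can be computed directly from key-press/release events. The sketch below assumes events arrive as `(key, t_down, t_up)` tuples in typing order; the paper's actual event format is not specified.

```python
def keystroke_features(events):
    """Compute keystroke-dynamics features from (key, t_down, t_up) tuples.

    DT  (dwell time):  t_up - t_down for each key (unigraph).
    FT  (flight time): next t_down - current t_up (digraph).
    UUT (up-up time):  next t_up - current t_up (digraph).
    DDT (down-down):   next t_down - current t_down (digraph).
    """
    dwell = [up - down for _, down, up in events]
    flight, uut, ddt = [], [], []
    # Slide over consecutive keystroke pairs for the digraph features.
    for (_, d1, u1), (_, d2, u2) in zip(events, events[1:]):
        flight.append(d2 - u1)
        uut.append(u2 - u1)
        ddt.append(d2 - d1)
    return dwell, flight, uut, ddt
```

These feature vectors would then be concatenated per user and fed to the C4.5 decision-tree classifier.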
A strong sign-language recognition system can break down the barriers that separate hearing and speaking members of society from speechless members. A novel, fast recognition system with low computational cost for digital American Sign Language (ASL) is introduced in this research. Different image-processing techniques are used to optimize and extract the shape of the hand fingers in each sign. The feature-extraction stage includes determining the optimal threshold on a statistical basis, recognizing the gap area in the zero sign, and calculating the height of each finger in the other digits. The classification stage depends on the gap area in the zero sign and the number of open fingers in the other signs, as well as …
In order to take measures to control soil erosion, soil loss over the area of interest must be estimated. Soil loss due to erosion can be estimated using predictive models such as the Universal Soil Loss Equation (USLE). The accuracy of such models depends on the parameters used in their equations. One of the most important of these is the C factor, which represents the effects of vegetation and other land covers. Estimating land cover by interpreting remote-sensing imagery involves the Normalized Difference Vegetation Index (NDVI), an indicator of vegetation cover. The aim of this study is to estimate C-factor values for part of Baghdad city using NDVI derived from a satellite image of Landsat-7 …
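The USLE relation and an NDVI-to-C conversion can be sketched as below. The exponential NDVI model (van der Knijff et al.) is a common choice, but whether this study used that exact model, or these coefficient values, is an assumption.

```python
import math

def usle_soil_loss(R, K, LS, C, P):
    """Universal Soil Loss Equation: A = R * K * LS * C * P.

    R: rainfall erosivity, K: soil erodibility, LS: slope length/steepness,
    C: cover-management factor, P: support-practice factor.
    """
    return R * K * LS * C * P

def c_from_ndvi(ndvi, alpha=2.0, beta=1.0):
    """C factor from NDVI via the exponential model of van der Knijff et al.
    alpha=2, beta=1 are the commonly cited defaults (an assumption here)."""
    return math.exp(-alpha * ndvi / (beta - ndvi))
```

Note that `c_from_ndvi(0.0)` gives C = 1 (bare soil) and higher NDVI drives C toward 0, reflecting the protective effect of vegetation.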
In this study, experimental and numerical analyses of heat distribution due to pulsed Nd:YAG laser surface melting were carried out. The experimental side consisted of the laser parameters: pulse duration 1.3 …
Over the last decade, Integrated Project Delivery (IPD) has been considered one of the new contractual relations that further integrate the process of combining design and construction. Building Information Modeling (BIM), in turn, has made significant advances in coordinating the planning and construction processes and is increasingly used in conjunction with traditional delivery methods. In this paper, the researcher presents the achievement of the IPD methodology using BIM, applied to the design of the financial commission building of the Mayssan Oil Company in Iraq. The building has not yet been constructed and was designed using …
The lethality of inorganic arsenic (As) and the threat it poses have made the development of efficient As detection systems a vital necessity. This work demonstrates a sensing layer made of hydrous ferric oxide (Fe2H2O4) for detecting As(III) and As(V) ions in a surface plasmon resonance system. The sensor relies on the ability of Fe2H2O4 to adsorb As ions and on the response of the plasmon resonance to the changes occurring on the sensing layer. The detection sensitivities for As(III) and As(V) were 1.083 °·ppb−1 and 0.922 °·ppb−1, respectively.
This paper deals with non-polynomial spline functions ("generalized splines") for finding approximate solutions of linear Volterra integro-differential equations of the second kind, and extends this work to solve systems of linear Volterra integro-differential equations. The performance of the generalized spline functions is illustrated in test examples.
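For reference, a linear Volterra integro-differential equation of the second kind has the standard form below; the specific kernel, forcing term, and interval used in the paper's test examples are not given in the abstract.

```latex
u'(x) = f(x) + \lambda \int_{0}^{x} K(x,t)\, u(t)\, dt,
\qquad u(0) = u_0, \quad x \in [0, b],
```

where $f$ is a known function, $K(x,t)$ is the kernel, and $\lambda$ is a constant; the "Volterra" character lies in the variable upper limit $x$ of the integral, and the spline approximation replaces $u$ by a generalized spline whose coefficients are fitted to this equation at collocation points.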
The Dirichlet process is an important fundamental object in nonparametric Bayesian modelling, applied to a wide range of problems in machine learning, statistics, and bioinformatics, among other fields. This flexible stochastic process models rich data structures with an unknown or evolving number of clusters, and it is a valuable tool for encoding the true complexity of real-world data in computer models. Our results show that the Dirichlet process improves, in both distribution density and signal-to-noise ratio, with larger sample sizes; achieves a slow decay rate to its base distribution; has improved convergence and stability; and thrives with a Gaussian base distribution, which is much better than the Gamma distribution. The performance depends …
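A draw from a Dirichlet process can be illustrated with the stick-breaking construction, sketched below under a fixed truncation level; the abstract does not say which construction or truncation the study used, so treat this as a generic illustration (the Gaussian base is taken from the abstract's comparison).

```python
import numpy as np

def stick_breaking_dp(alpha, base_sampler, n_atoms, rng=None):
    """Truncated stick-breaking draw from a Dirichlet process DP(alpha, G0).

    Returns weights w_k and atoms theta_k of G = sum_k w_k * delta(theta_k).
    Truncating at n_atoms is an approximation: the untruncated weights sum
    to 1, the truncated ones to slightly less.
    """
    rng = rng or np.random.default_rng(0)
    # Break off Beta(1, alpha) fractions of the remaining stick.
    betas = rng.beta(1.0, alpha, size=n_atoms)
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - betas)[:-1]))
    weights = betas * remaining
    atoms = base_sampler(rng, n_atoms)   # i.i.d. draws from the base G0
    return weights, atoms

# Example with the Gaussian base distribution favoured in the study:
w, theta = stick_breaking_dp(2.0, lambda rng, n: rng.normal(0.0, 1.0, n), 50)
```

Smaller `alpha` concentrates mass on a few atoms (few clusters), while larger `alpha` spreads it out, which is the mechanism behind the unknown-number-of-clusters behaviour described above.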