The crude enzyme nattokinase, produced by Bacillus subtilis, was used in the ripening of cheddar cheese by adding three concentrations of the enzyme (80, 160, and 320 mg/kg) alongside an enzyme-free control treatment. The product was monitored for three months to determine moisture, protein, fat, non-protein nitrogen, soluble nitrogen, and pH, and a sensory evaluation was conducted. The soluble nitrogen percentages during the second month of ripening for treatments T2, T3, and T4 were 11.2%, 15.54%, and 18.48%, respectively, compared with 7.6% for the control, while in the third month they were 17.37%, 20.67%, and 22.26%, respectively, compared with only 10% for the control. On the other hand, non-protein
Thin films of the ternary silver indium selenide-sulfur compound AgInSe1.8S0.2, in pure form and with a 0.2 sulfur ratio, were fabricated via thermal evaporation under a vacuum of 3×10⁻⁶ torr on glass substrates at a thickness of 550 nm. These films were investigated to understand their structural, optical, and Hall characteristics. X-ray diffraction analysis was employed to examine the impact of the varying sulfur ratio on the structural properties. The results revealed that the AgInSe1.8S0.2 thin films, in pure form and with a 0.2 sulfur ratio, both at room temperature and after annealing at 500 K, exhibited a polycrystalline nature with a tetragonal structure and a predominant orientation along the (112) plane, indicating an enhanced de
The origin of this technique lies in the analysis of François Quesnay (1694-1774), the leader of the Physiocratic school, presented in his Tableau Économique. The method was developed further by Karl Marx in his analysis of the relationships between the departments of production and the nature of these relations in his reproduction schemes. The current form of this type of economic analysis is credited to the Russian-American economist Wassily Leontief. This analytical model is commonly used in developing economic plans in developing countries (p. 1, p. 86). There are several types of input-output models, such as the static model, the dynamic model, regional models, and so on. However, this research is confined to the open model, which has found wide practical application.
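The core computation of the open Leontief model can be sketched briefly: gross output x solves x = Ax + d, i.e. x = (I − A)⁻¹d. The two-sector technical-coefficient matrix A and final-demand vector d below are made-up numbers for illustration only:

```python
import numpy as np

# Open Leontief input-output sketch: solve (I - A) x = d for gross
# output x. A holds hypothetical technical coefficients; d is a
# hypothetical final-demand vector.
A = np.array([[0.2, 0.3],
              [0.4, 0.1]])
d = np.array([100.0, 50.0])
x = np.linalg.solve(np.eye(2) - A, d)
print(np.round(x, 2))  # gross output required to meet final demand d
```

Solving the linear system directly, rather than inverting (I − A), is the standard numerically stable choice.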
Shadow detection and removal is an important task when dealing with color outdoor images. Shadows are generated by a local and relative absence of light. Shadows are, first of all, a local decrease in the amount of light that reaches a surface; secondly, they are a local change in the amount of light reflected by a surface toward the observer. Most shadow detection and segmentation methods are based on image analysis. However, some factors will affect the detection result due to the complexity of the circumstances. In this paper, a segmentation-based method is presented to detect shadows in an image, and a function concept is used to remove the shadow from the image.
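The paper's specific "function concept" is not reproduced here; as a hedged illustration of the general idea, the sketch below flags pixels well below the mean intensity as shadow and rescales them by the ratio of lit to shadowed mean intensity:

```python
import numpy as np

# Illustration only: a simple intensity-threshold shadow detector and
# a gain-based removal step. The threshold factor 0.6 is hypothetical.
def detect_shadow(gray, thresh=None):
    # pixels well below the mean intensity are flagged as shadow
    if thresh is None:
        thresh = gray.mean() * 0.6
    return gray < thresh

def remove_shadow(gray, mask):
    out = gray.astype(float).copy()
    lit_mean = out[~mask].mean()
    shadow_mean = out[mask].mean()
    out[mask] *= lit_mean / shadow_mean   # brighten the shadow region
    return np.clip(out, 0, 255).astype(np.uint8)

img = np.full((8, 8), 200, dtype=np.uint8)
img[2:6, 2:6] = 60                        # synthetic shadow patch
mask = detect_shadow(img)
fixed = remove_shadow(img, mask)
print(mask.sum(), fixed[3, 3])            # → 16 200
```

Real methods typically work in a chromaticity-preserving color space rather than on raw gray levels, but the two-stage detect-then-correct structure is the same.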
Neural cryptography deals with the problem of key exchange between two neural networks by using the mutual learning concept. The two networks exchange their outputs (in bits), and the key shared by the two communicating parties is eventually represented in the final learned weights, at which point the two networks are said to be synchronized. The security of neural synchronization is put at risk if an attacker is capable of synchronizing with either of the two parties during the training process.
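Mutual learning of this kind is commonly modelled with tree parity machines (TPMs). The sketch below, with illustrative parameters K (hidden units), N (inputs per unit), and L (weight bound), shows two TPMs that exchange only their output bits and apply a Hebbian update whenever the outputs agree, until their weights synchronize into a shared key:

```python
import numpy as np

# Tree parity machine sketch of neural key exchange; K, N, L are
# illustrative, not values from the abstract.
K, N, L = 3, 4, 3
rng = np.random.default_rng(0)

def output(w, x):
    sigma = np.sign(np.sum(w * x, axis=1))  # hidden-unit signs
    sigma[sigma == 0] = 1                   # map 0 to +1
    return sigma, int(np.prod(sigma))       # tau = product of signs

def update(w, x, sigma, tau):
    # Hebbian rule: only hidden units that agree with tau learn
    for k in range(K):
        if sigma[k] == tau:
            w[k] = np.clip(w[k] + tau * x[k], -L, L)

wA = rng.integers(-L, L + 1, size=(K, N))
wB = rng.integers(-L, L + 1, size=(K, N))

steps = 0
while not np.array_equal(wA, wB):
    x = rng.choice([-1, 1], size=(K, N))    # public random input
    sA, tA = output(wA, x)
    sB, tB = output(wB, x)
    if tA == tB:        # only output bits are exchanged, never weights
        update(wA, x, sA, tA)
        update(wB, x, sB, tB)
    steps += 1

key = wA.flatten()      # synchronized weights act as the shared key
print(steps, len(key))
```

An attacker who observes the exchanged bits but cannot influence the updates synchronizes much more slowly on average, which is the basis of the scheme's (heuristic) security.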
Diabetes is one of the increasingly common chronic diseases, affecting millions of people around the world. Diabetes diagnosis, prediction, proper treatment, and management are essential. Machine learning-based prediction techniques for diabetes data analysis can help in the early detection of the disease and the prediction of its consequences, such as hypo- and hyperglycemia. In this paper, we explored a diabetes dataset collected from the medical records of one thousand Iraqi patients. We applied three classifiers: the multilayer perceptron, k-nearest neighbours (KNN), and the random forest. We conducted two experiments: the first used all 12 features of the dataset, in which the random forest outperformed the others with 98.8% accuracy. The second experiment used only five att
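The dataset itself is not reproduced here, so as a minimal sketch of one of the three classifiers, the following implements k-nearest neighbours from scratch on a small synthetic two-cluster stand-in for the patient records:

```python
import numpy as np

# Minimal KNN classifier sketch; the synthetic clusters below stand in
# for the (unavailable) Iraqi diabetes dataset.
def knn_predict(X_train, y_train, x, k=3):
    d = np.linalg.norm(X_train - x, axis=1)   # Euclidean distances
    nearest = y_train[np.argsort(d)[:k]]      # labels of the k closest
    return np.bincount(nearest).argmax()      # majority vote

rng = np.random.default_rng(1)
X0 = rng.normal(0.0, 0.5, size=(20, 2))       # class 0 cluster
X1 = rng.normal(3.0, 0.5, size=(20, 2))       # class 1 cluster
X = np.vstack([X0, X1])
y = np.array([0] * 20 + [1] * 20)

print(knn_predict(X, y, np.array([0.1, 0.0])))  # query near class 0
print(knn_predict(X, y, np.array([3.1, 2.9])))  # query near class 1
```

The multilayer perceptron and random forest follow the same fit-then-predict pattern but learn decision boundaries rather than deferring to stored neighbours.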
The penalized least squares method is a popular approach to high-dimensional data, where the number of explanatory variables is larger than the sample size. Penalized least squares offers high prediction accuracy and performs estimation and variable selection simultaneously. It yields a sparse model, that is, a model with few variables, which can therefore be interpreted easily. However, penalized least squares is not robust: it is very sensitive to the presence of outlying observations. To deal with this problem, a robust loss function can be used to obtain a robust penalized least squares method, and hence a robust penalized estimator and
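A robust penalized estimator of this kind can be sketched, for illustration only, by pairing the Huber loss (which bounds the influence of outliers) with an L1 penalty, minimized by proximal gradient descent; all parameter values below are hypothetical:

```python
import numpy as np

# Huber loss + L1 penalty via proximal gradient (ISTA) sketch.
def huber_grad(r, delta=1.0):
    # gradient of the Huber loss w.r.t. the residual r; bounded by delta
    return np.where(np.abs(r) <= delta, r, delta * np.sign(r))

def soft_threshold(z, t):
    # proximal operator of the L1 penalty
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def robust_lasso(X, y, lam=0.1, step=0.05, iters=1500):
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        r = X @ beta - y
        g = X.T @ huber_grad(r) / len(y)          # robust gradient step
        beta = soft_threshold(beta - step * g, step * lam)
    return beta

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
true_beta = np.zeros(10)
true_beta[:3] = [2.0, -1.5, 1.0]                  # sparse truth
y = X @ true_beta + rng.normal(scale=0.1, size=100)
y[:5] += 20.0                                     # gross outliers

beta_hat = robust_lasso(X, y)
print(np.round(beta_hat, 2))
```

Because the Huber gradient saturates at ±delta, the five contaminated observations can shift the estimate only by a bounded amount, while the soft-thresholding step keeps the irrelevant coefficients near zero.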
Water/oil emulsion is considered the most refractory mixture to separate because of the interference of the two immiscible liquids, water and oil. This research presents a study of the dewatering of a water/kerosene emulsion using a hydrocyclone. The effects of feed flow rate (3, 5, 7, 9, and 11 L/min), inlet water concentration of the emulsion (5%, 7.5%, 10%, 12.5%, and 15% by volume), and split ratio (0.1, 0.3, 0.5, 0.7, and 0.9) on the separation efficiency and pressure drop were studied. Dimensional analysis using the Buckingham Pi theorem was applied for the first time to model the hydrocyclone based on the experimental data. It was shown that the maximum separation efficiency, at a split ratio of 0.1, was 94.3% at 10% co
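The Buckingham Pi procedure can be illustrated with a made-up variable set (the paper's actual dimensionless groups are not reproduced here): the number of independent dimensionless groups equals the number of variables minus the rank of the dimension matrix.

```python
import numpy as np

# Buckingham Pi sketch for a hypothetical hydrocyclone variable set.
# Rows are base dimensions (M, L, T); columns are variables:
# pressure drop dP, flow rate Q, density rho, viscosity mu, diameter D.
#                 dP   Q   rho  mu   D
dims = np.array([[ 1,  0,   1,   1,  0],   # M
                 [-1,  3,  -3,  -1,  1],   # L
                 [-2, -1,   0,  -1,  0]])  # T
n_vars = dims.shape[1]
rank = np.linalg.matrix_rank(dims)
print(n_vars - rank)   # number of independent Pi groups → 2
```

With five variables and three independent base dimensions, two dimensionless groups (e.g. a Reynolds number and an Euler number) suffice to correlate the data.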