In this research, a bidentate Schiff base was prepared via the condensation reaction of salicylaldehyde with 1-phenyl-2,3-dimethyl-4-amino-5-oxo-pyrazole to form the ligand (L). This ligand was used to prepare complexes with the transition metal ions Mn(II), Co(II), Ni(II), Cu(II) and Zn(II). The resulting complexes were isolated and characterized by FTIR and UV-Vis spectroscopic techniques. Elemental analysis for carbon, hydrogen and nitrogen and the electronic spectra of the ligand and complexes were obtained, and magnetic susceptibility measurements were performed to determine the magnetic moments. The molar conductivities were also measured, the chlorine content in the complexes was determined, and …
This study was carried out to measure the level of heavy metal pollution in the water of the Diyala River and the degree of contamination by these elements in the leafy vegetables grown on both sides of the river and irrigated with its contaminated water (celery, radish, Lepidium, green onions, Beta vulgaris subsp., and Malva). Laboratory analysis was performed to measure the levels of heavy metal contamination (Pb, Fe, Ni, Cd, Zn and Cr) using a flame atomic absorption spectrophotometer during the summer months of July and August 2017. The study showed that zinc, chromium, nickel and cadmium were present at high concentrations that exceeded the permissible limits. The maximum concentration of these …
This paper considers the reliability of a multi-component stress-strength model R(s,k) in which the stress and strength are independent and non-identically distributed, following the Exponentiated Family Distribution (FED) with unknown shape parameter α, known scale parameter λ equal to two, and parameter θ equal to three. Different estimation methods for R(s,k) were introduced, corresponding to the maximum likelihood and shrinkage estimators. Comparisons among the suggested estimators were carried out by simulation based on the mean squared error (MSE) criterion.
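For reference, the multi-component stress-strength reliability estimated above is conventionally defined as follows, assuming k independent strength variables X_1, ..., X_k with common distribution function F_X and a stress Y with distribution function F_Y, the system operating when at least s of the strengths exceed the stress; the specific FED parameterization adopted in the paper may yield a closed form that is not reproduced here:

R_{s,k} = P(\text{at least } s \text{ of } X_1,\dots,X_k \text{ exceed } Y) = \sum_{i=s}^{k} \binom{k}{i} \int_{0}^{\infty} \left[1 - F_X(y)\right]^{i} \left[F_X(y)\right]^{k-i} \, dF_Y(y)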
Background: Tooth wear is one of the most common problems in the older dentate population; it results from the interaction of three processes (attrition, abrasion and erosion) and affects all societies, age groups and cultures. This study was conducted to evaluate the prevalence and distribution of tooth wear among institutionalized residents in Baghdad city, Iraq. Subjects and Methods: This survey was carried out in four private institutions and one governmental institution in Baghdad city. One hundred twenty-three residents (61 males, 62 females) aged 50-89 years participated in this study. Tooth wear was diagnosed and recorded according to the criteria of Smith and Knight. Results: The prevalence of tooth wear was 100%, with a mean …
The preparation of identical independent photons is at the core of many quantum applications, such as entanglement swapping and entangling processes. In this work, a Hong-Ou-Mandel experiment was performed to evaluate the degree of indistinguishability between independent photons generated from two independent weak coherent sources operating at 640 nm. The measured visibility was 46%, close to the theoretical limit of 50%. The implemented setup can be adopted in quantum key distribution experiments carried out with free space as the channel link, since all the devices and components used operate in the visible range of the electromagnetic spectrum.
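As a reading note, the visibility quoted above is conventionally computed from the coincidence rates measured far from and at zero relative delay; for two independent weak coherent sources this quantity is bounded at one half, which is why 46% is regarded as close to the theoretical limit. The exact definition used in the experiment may differ:

V = \frac{C_{\max} - C_{\min}}{C_{\max}}, \qquad V \le \frac{1}{2} \text{ for independent weak coherent states}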
Background: Sialosis has been described as a specific consequence of diabetes. In diabetic sialosis, the increased volume of the glands is due to adipose infiltration of the parenchyma. B-scan ultrasonography is a generally accepted tool for determining parotid gland enlargement. Oral health is, to a great extent, dependent on the quality and quantity of saliva, both of which may be altered in diabetics. This study was designed to detect parotid gland enlargement in diabetic patients and to study the changes in the physical properties of saliva and their relation to salivary gland enlargement. Subjects, Materials and Methods: A cross-sectional study with highly specified criteria included male and female subjects aged 20-65 years …
The problem of steady, laminar, natural convective flow in a square enclosure with and without partitions is considered for Rayleigh numbers of 10^3-10^6 and a Prandtl number of 0.7. The vertical walls were maintained isothermal at different temperatures, while the horizontal walls and the partitions were insulated. The partition length was kept constant. One to three partitions were placed on the horizontal surfaces in a staggered arrangement, with partition thickness ratios of H/L = 0.033, 0.083 and 0.124. The problem is formulated in terms of the vorticity-stream function procedure. A numerical solution based on a Fortran 90 program using the finite difference method is obtained. Representative results illustrate the effects of the thickness …
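For orientation only, a minimal sketch of the steady, non-dimensional vorticity-stream function formulation named above, assuming the Boussinesq approximation, gravity acting along the isothermal vertical walls, and the usual scaling with the cavity side and thermal diffusivity; the signs and scalings actually adopted in the paper may differ:

\nabla^{2}\psi = -\omega, \qquad u = \frac{\partial \psi}{\partial y}, \quad v = -\frac{\partial \psi}{\partial x}

u\,\frac{\partial \omega}{\partial x} + v\,\frac{\partial \omega}{\partial y} = Pr\,\nabla^{2}\omega + Ra\,Pr\,\frac{\partial T}{\partial x}

u\,\frac{\partial T}{\partial x} + v\,\frac{\partial T}{\partial y} = \nabla^{2}T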
Most studies on deep beams have dealt with reinforced concrete deep beams; only a few have investigated the response of prestressed deep beams, and, to the best of our knowledge, no study has investigated the response of full-scale (T-section) prestressed deep beams with large web openings. An experimental and numerical study was conducted to investigate the shear strength of ordinary reinforced and partially prestressed full-scale (T-section) deep beams containing large web openings, in order to examine the effect of the presence of prestressing on the deep beam response and to better understand the effects of the prestressing locations and the opening-depth-to-beam-depth ratio on deep beam performance and behavior …
Data scarcity is a major challenge when training deep learning (DL) models. DL demands a large amount of data to achieve exceptional performance, yet many applications have too little or inadequate data to train DL frameworks. Manual labeling is usually needed to provide labeled data, and it typically involves human annotators with extensive background knowledge. This annotation process is costly, time-consuming and error-prone. Every DL framework must usually be fed a significant amount of labeled data to learn representations automatically. Ultimately, a larger amount of data generally yields a better DL model, although performance is also application dependent. This issue is the main barrier for …