Carbon monoxide (CO) has a significant indirect effect on greenhouse gases because it is a precursor of ozone and carbon dioxide, and because its degradation by the hydroxyl radical (OH) controls the oxidizing capacity of the troposphere. To understand the effect of human activities on atmospheric composition, accurate estimates of the sources of atmospheric CO are necessary. MOPITT (Measurements of Pollution in the Troposphere) is an instrument on NASA's Terra satellite that uses both Thermal-Infrared (TIR) and Near-Infrared (NIR) observations to retrieve vertical CO profiles in the troposphere via correlation spectroscopy. The objective of the current study is to analyze and map the monthly, seasonal, and annual trends of CO concentration for the year 2016 in Nineveh Governorate using the retrieved CO Surface Mixing Ratio (Day mode) from the level 3, version 7 dataset. The dataset was downloaded from the GIOVANNI portal operated by the National Aeronautics and Space Administration (NASA). Analysis of the dataset in GIS software revealed many sources of carbon monoxide in Nineveh Governorate that change with the months and seasons of the year. Generally, the observed CO concentrations in the southern and western parts of the governorate were higher than in the northern and eastern parts. The annual average CO ranges from 115.374 ppbv to 132.452 ppbv. CO emissions and concentrations were also higher in the winter season (128.638-157.567 ppbv) than in the summer season (97.144-106.515 ppbv).
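As an illustration of the aggregation step behind such seasonal and annual maps, the sketch below averages monthly CO surface mixing ratios into seasonal and annual means. The monthly values are hypothetical placeholders, not the actual MOPITT retrievals for Nineveh.

```python
import numpy as np

# Hypothetical monthly CO surface mixing ratios (ppbv) for one grid cell;
# illustrative values only, not the MOPITT L3 data used in the study.
monthly_co = {
    "Jan": 150.0, "Feb": 145.0, "Mar": 130.0, "Apr": 120.0,
    "May": 110.0, "Jun": 100.0, "Jul": 98.0,  "Aug": 99.0,
    "Sep": 105.0, "Oct": 115.0, "Nov": 135.0, "Dec": 152.0,
}

# Meteorological seasons used for the seasonal means
seasons = {
    "winter": ["Dec", "Jan", "Feb"],
    "spring": ["Mar", "Apr", "May"],
    "summer": ["Jun", "Jul", "Aug"],
    "autumn": ["Sep", "Oct", "Nov"],
}

seasonal_mean = {s: float(np.mean([monthly_co[m] for m in months]))
                 for s, months in seasons.items()}
annual_mean = float(np.mean(list(monthly_co.values())))
```

Applied per grid cell of the level 3 product, these means are what a GIS layer for each season or for the whole year would map.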
In this paper, two iron oxide adsorbents, a nanomaterial (Fe3O4) and a nanocomposite (T-Fe3O4), were created from the bio-waste mass of tangerine peel. These two materials were used in adsorption tests to remove cefixime (CFX) from an aqueous solution. Before the adsorption application, both adsorbents were characterized by various techniques, including XRD, FTIR, VSM, TEM, and FESEM. The mesoporous nano-crystalline structure of the Fe3O4 and T-Fe3O4 nanocomposite, with diameters of less than 100 nm, was confirmed. The adsorption performance of the obtained adsorbents was evaluated for CFX removal by adjusting several operating parameters to optimize removal. The optimal conditions for CFX removal were found to be an initial concentration of 40 and 50 m
A design of a Fabry-Perot interferometer system was constructed to determine the precise value of the wavelength, which is required in spectral studies, by varying the pressure of the medium so that the refractive index was a function of pressure at a constant distance between the two mirrors, using a He-Ne laser (632.8 nm) as a coherent source. The finesse (t), the coefficient of finesse (F), and the visibility of the fringes (V) have been calculated. Image processing was used, and its results can be relied on for verification.
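The three quantities named above are related through standard textbook Fabry-Perot relations; the sketch below evaluates them for an assumed mirror reflectance, which is illustrative and not a value taken from the paper.

```python
import math

# Standard Fabry-Perot relations (textbook forms, assumed here):
#   coefficient of finesse  F = 4R / (1 - R)^2
#   reflective finesse      t = pi * sqrt(R) / (1 - R) = pi * sqrt(F) / 2
#   fringe visibility       V = F / (F + 2), from the Airy transmission
#                               I = 1 / (1 + F * sin^2(delta / 2))
def fabry_perot(R):
    F = 4.0 * R / (1.0 - R) ** 2
    finesse = math.pi * math.sqrt(F) / 2.0
    V = F / (F + 2.0)
    return F, finesse, V

# Example with a hypothetical mirror reflectance R = 0.9
# at the He-Ne wavelength of 632.8 nm
F, t, V = fabry_perot(0.9)
```

Higher reflectance sharpens the fringes: F grows as (1 - R)^-2, so both the finesse and the visibility approach their limits quickly as R nears 1.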
The service quality of any information-based system can be evaluated by its end users, and the system developers or those responsible for it can use these user experiences to improve, develop, and benchmark the system. In this paper, a questionnaire was implemented to rate the extent to which the academic admission web site achieves its intended performance. Data were collected from 21 users of the system, all of them highly educated and experienced in using the site. Quadrant and gap analysis were applied to identify the weaknesses and strengths in the data. The main analyses were performed on the collected data in terms of each attribute's importance to users and the users' satisfaction with it. A number of statistical tools have been utilized.
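Gap and quadrant (importance-performance) analysis of such questionnaire data can be sketched as below; the attribute ratings are hypothetical, not the survey results from the 21 users.

```python
import numpy as np

# Hypothetical mean ratings for five site attributes (1-5 scale);
# illustrative only, not the actual questionnaire data.
importance   = np.array([4.5, 4.2, 3.1, 2.8, 4.0])
satisfaction = np.array([3.2, 4.4, 3.0, 3.5, 2.9])

# Gap analysis: a negative gap marks a weakness (satisfaction lags importance)
gap = satisfaction - importance

# Quadrant analysis: split each axis at its mean
hi_imp = importance >= importance.mean()
hi_sat = satisfaction >= satisfaction.mean()
quadrant = np.where(hi_imp & hi_sat, "keep up",
           np.where(hi_imp & ~hi_sat, "concentrate here",
           np.where(~hi_imp & hi_sat, "possible overkill", "low priority")))
```

Attributes falling in the "concentrate here" quadrant (high importance, low satisfaction) are the ones the developers would prioritize.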
At the end of 2019, a new form of coronavirus (later dubbed COVID-19) emerged in China and quickly spread to other regions of the globe. Despite the virus's unique and unknown characteristics, it is a widely distributed infectious illness. Finding the geographical distribution of the virus transmission is therefore critical for epidemiologists and governments in order to respond to the epidemic rapidly and effectively. Understanding the dynamics of COVID-19's spatial distribution can help clarify the pandemic's scope and effects, and can inform decision-making, planning, and community action aimed at preventing transmission. The main focus of this study is to investigate the geographic patterns of COVID-19 dissemination.
The analysis of the hyperlink structure of the web has led to significant improvements in web information retrieval. This survey evaluates and analyzes relevant research publications on link analysis in web information retrieval across several dimensions: the year of publication, the aims of each article, the algorithms used, and the findings obtained after applying those algorithms. The findings revealed that PageRank, Weighted PageRank, and Weighted Page Content Rank are the algorithms most widely employed by researchers to analyze hyperlinks in web information retrieval. Finally, this paper analyzes the previous studies.
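The basic PageRank iteration that the surveyed algorithms build on can be sketched on a tiny link graph; the graph below is an illustrative example, not data from the survey.

```python
import numpy as np

# Toy link graph: page -> pages it links to (illustrative only)
links = {0: [1, 2], 1: [2], 2: [0], 3: [2]}
n, d = 4, 0.85  # number of pages, damping factor

# Power iteration: each page shares its rank equally among its outlinks,
# damped by d, with a (1 - d) / n teleportation term.
pr = np.full(n, 1.0 / n)
for _ in range(100):
    new = np.full(n, (1.0 - d) / n)
    for page, outs in links.items():
        for target in outs:
            new[target] += d * pr[page] / len(outs)
    pr = new
```

Page 2, which receives links from three other pages, ends up with the highest rank; weighted variants such as Weighted PageRank replace the equal 1/len(outs) split with link-popularity weights.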
Many fuzzy clustering methods are based on within-cluster scatter with a compactness measure, but this paper describes a new fuzzy clustering method that depends on both within-cluster scatter with a compactness measure and between-cluster scatter with a separation measure, called fuzzy compactness and separation (FCS). Fuzzy linear discriminant analysis (FLDA) is based on the within-cluster and between-cluster scatter matrices, and the two fuzzy scatter matrices in the objective function ensure compactness between data elements and cluster centers. A cluster-validation method for testing the optimal number of clusters is then discussed, and an illustrative example is presented.
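The within-cluster compactness idea that FCS extends is the one driving standard fuzzy c-means; a minimal sketch of that baseline is below (the FCS variant adds a between-cluster separation term, which is not reproduced here).

```python
import numpy as np

# Standard fuzzy c-means (FCM) on synthetic 2-D data; illustrative baseline,
# not the authors' FCS algorithm.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.3, (20, 2)),   # cluster near (0, 0)
               rng.normal(3.0, 0.3, (20, 2))])  # cluster near (3, 3)
c, m = 2, 2.0                        # number of clusters, fuzzifier

U = rng.random((len(X), c))
U /= U.sum(axis=1, keepdims=True)    # memberships sum to 1 per point

for _ in range(50):
    Um = U ** m
    # Centers minimize the within-cluster (compactness) objective
    centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
    inv = d ** (-2.0 / (m - 1))      # membership update from distances
    U = inv / inv.sum(axis=1, keepdims=True)
```

FCS would add a term rewarding distance between the cluster centers, so the objective balances compactness against separation rather than compactness alone.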
The aim of this study is to derive the best discriminant function over the variables that share common characteristics, in order to distinguish between the groups and thereby identify which governorates suffer from deprivation. This allows the parties concerned and the regulatory authorities to intervene and take corrective measures. The main indicators of the deprivation index (education, health, infrastructure, housing, and protection) were based on 2010 data available from the Central Bureau of Statistics.
In this paper, a new hybridization of supervised principal component analysis (SPCA) and stochastic gradient descent, called SGD-SPCA, is proposed for real large datasets that have a small number of samples in a high-dimensional space. SGD-SPCA is intended as a tool that can help diagnose and treat cancer accurately. When a large dataset requires many parameters, SGD-SPCA is an effective method, and it can easily update the parameters when a new observation arrives. Two cancer datasets are used: the first for leukemia and the second for small round blue cell tumors. Simulated datasets are also used to compare principal component analysis (PCA), SPCA, and SGD-SPCA. The results sh
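The idea of refreshing a PCA-style component one observation at a time via stochastic gradient descent can be illustrated with Oja's rule, a classical SGD update for the leading principal direction; this is a generic sketch, not the authors' SGD-SPCA algorithm, and the data are synthetic.

```python
import numpy as np

# Synthetic data whose dominant variance lies along (1, 1) / sqrt(2)
rng = np.random.default_rng(1)
X = (rng.normal(size=(500, 1)) * np.array([[2.0, 2.0]])
     + rng.normal(0.0, 0.1, (500, 2)))

w = rng.normal(size=2)
w /= np.linalg.norm(w)
eta = 0.01                           # learning rate

for x in X:                          # one SGD step per incoming observation
    y = w @ x                        # projection onto the current direction
    w += eta * y * (x - y * w)       # Oja's rule: gradient step toward the
    w /= np.linalg.norm(w)           # top eigenvector, kept at unit norm
```

Because each update touches only one observation, the component can be kept current as new samples arrive, which is the property the abstract highlights for SGD-SPCA.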