With its rapid spread, the coronavirus infection shocked the world and affected billions of people's lives. The challenge is to find a safe method of diagnosing the infection with fewer casualties. X-ray images have been shown to be an important tool for the identification, quantification, and monitoring of disease. Deep learning algorithms can help analyze potentially huge numbers of X-ray examinations. This research developed a retrospective multi-test analysis system to detect suspected COVID-19 cases and used chest X-ray features to assess the progression of the illness in each patient, producing a "corona score"; the results were satisfactory compared with the benchmarked techniques. The results of this research show that rapidly evolving Artificial Intelligence (AI)-based image analysis can achieve high accuracy in detecting coronavirus infection as well as in quantification and monitoring of disease burden.
In this study, some generic commercial Atorvastatin tablets were evaluated by a dissolution test in acid medium, comparing them with the parent drug Lipitor of Pfizer. Several solubilizing agents were studied in the formulation of the Atorvastatin tablet, including a surface-active agent and PEG 6000. The most effective factor was the use of PEG 6000 in the formulation, which improved dissolution; the dissolution profile of the tablet formulated in this work was bioequivalent to that of Lipitor. The quantitative analysis in this work was performed using reversed-phase liquid chromatography and a proper mixture of
Simulation experiments are a means of problem solving in many fields: the process of designing a model of a real system in order to follow it and identify its behavior through models and formulas written in a repeating software style with a number of iterations. The aim of this study is to build a model that deals with behavior suffering from heteroskedasticity by studying the APGARCH and NAGARCH models using Gaussian and non-Gaussian distributions for different sample sizes (500, 1000, 1500, 2000) through the stages of time series analysis (identification, estimation, diagnostic checking, and prediction). The data was generated using the estimations of the parameters resulting f
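As a hedged illustration of how a conditionally heteroskedastic series can be generated in a simulation experiment, the sketch below simulates a standard GARCH(1,1) process, a simpler relative of the APGARCH and NAGARCH models studied in the abstract. All parameter values here are illustrative assumptions, not the paper's estimates.

```python
# Sketch: simulate a GARCH(1,1) series with Gaussian innovations.
# omega, alpha, beta are illustrative values (alpha + beta < 1 for stationarity).
import random

def simulate_garch(n, omega=0.1, alpha=0.1, beta=0.8, seed=42):
    rng = random.Random(seed)
    h = omega / (1.0 - alpha - beta)       # start at the unconditional variance
    series = []
    for _ in range(n):
        e = rng.gauss(0.0, 1.0)            # Gaussian innovation
        y = e * h ** 0.5                   # observation with conditional variance h
        series.append(y)
        h = omega + alpha * y * y + beta * h   # GARCH(1,1) variance recursion
    return series
```

The same loop structure extends to APGARCH or NAGARCH by replacing the variance recursion with the corresponding asymmetric form.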
Pavement crack and pothole identification are important tasks in transportation maintenance and road safety. This study offers a novel technique for automatic asphalt pavement crack and pothole detection based on image processing. Different types of cracks (transverse, longitudinal, alligator-type, and potholes) can be identified with such techniques. The goal of this research is to evaluate road surface damage by extracting cracks and potholes, categorizing them from images and videos, and comparing the manual and automated methods. The proposed method was tested on 50 images. The results obtained from image processing showed that the proposed method can detect cracks and potholes and identify their severity levels wit
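The abstract does not detail its image-processing pipeline; as a hedged sketch, intensity thresholding is one common first step in crack detection, since cracks appear darker than the surrounding pavement. The threshold value and helper names below are illustrative assumptions, not the paper's method.

```python
# Sketch: dark-pixel thresholding as a first step in crack detection.
# gray is a 2D list of 0-255 grayscale intensities; cracks show up as dark pixels.
def crack_mask(gray, threshold=60):
    return [[1 if px < threshold else 0 for px in row] for row in gray]

def crack_ratio(mask):
    # fraction of pixels flagged as crack candidates (a crude severity proxy)
    flat = [v for row in mask for v in row]
    return sum(flat) / len(flat)
```

A real pipeline would follow this with noise filtering and connected-component analysis to classify crack type and severity.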
Different solvents (light naphtha, n-heptane, and n-hexane) were used to treat Iraqi atmospheric oil residue by the deasphalting process. Oil residue from the Al-Dura refinery with specific gravity 0.9705, API 14.9, and 0.5 wt.% sulfur content was used. Deasphalted oil (DAO) was examined on a laboratory scale using solvents under different operating conditions (temperature, solvent concentration, solvent-to-oil ratio, and duration). This study investigates the effects of these parameters on asphaltene yield. The results show that an increase in temperature for all solvents increases the extracted asphaltene yield. The higher reduction in asphaltene content is obtained with the hexane solvent at operating conditions of (90 °C, 4/1
The adsorption isotherms and kinetic uptakes of carbon dioxide (CO2) on fabricated electrospun nonwoven activated carbon nanofiber sheets were investigated at two temperatures, 308 K and 343 K, over a pressure range of 1 to 7 bar. The activated carbon nanofibers, based on a polyacrylonitrile (PAN) precursor, were fabricated via the electrospinning technique followed by thermal treatment to obtain the carbonaceous nanofibers. The CO2 adsorption isotherm data were fitted to various models, including Langmuir, Freundlich, and Temkin. Based on correlation coefficients, the Langmuir isotherm model gave the best fit to the experimental CO2 adsorption isotherm data. Raising the equ
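The Langmuir fit mentioned above can be sketched with the standard linearized form P/q = P/q_m + 1/(K·q_m), which turns the isotherm into a straight line fitted by ordinary least squares. The code below is a minimal illustration on synthetic data; q_m and K values are assumptions, not the paper's results.

```python
# Sketch: linearized Langmuir isotherm fit.
# Model: q = qm * K * P / (1 + K * P)  =>  P/q = P/qm + 1/(K*qm)
def fit_langmuir(pressures, uptakes):
    xs = pressures
    ys = [p / q for p, q in zip(pressures, uptakes)]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
            / sum((x - mx) ** 2 for x in xs)
    intercept = my - slope * mx
    qm = 1.0 / slope          # monolayer capacity
    K = slope / intercept     # Langmuir affinity constant
    return qm, K
```

In practice one would compute the correlation coefficient of this regression for each model (Langmuir, Freundlich, Temkin) and pick the best fit, as the abstract describes.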
In this paper, we deal with games with fuzzy payoffs where there is uncertainty in the data. We use the trapezoidal membership function to transform the data into fuzzy numbers and utilize three different ranking function algorithms. We then compare these three ranking algorithms on trapezoidal fuzzy numbers so that the decision maker can obtain the best gains.
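The abstract does not name its three ranking functions; as a hedged sketch, two common choices for ranking a trapezoidal fuzzy number (a, b, c, d) are the simple average of its four parameters and the centroid of the trapezoid. Both are illustrative stand-ins, not necessarily the paper's algorithms.

```python
# Sketch: two common ranking functions for trapezoidal fuzzy numbers (a, b, c, d),
# used to order fuzzy payoffs so a crisp best strategy can be chosen.
def rank_average(tfn):
    a, b, c, d = tfn
    return (a + b + c + d) / 4.0

def rank_centroid(tfn):
    # x-coordinate of the centroid of the trapezoid with vertices
    # (a, 0), (b, 1), (c, 1), (d, 0)
    a, b, c, d = tfn
    den = 3.0 * ((d + c) - (a + b))
    if den == 0:              # degenerate (symmetric collapse); fall back to midpoint
        return (a + d) / 2.0
    return ((d**2 + c**2 + c * d) - (a**2 + b**2 + a * b)) / den
```

A comparison of ranking algorithms then amounts to checking whether they induce the same ordering on the fuzzy payoff matrix.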
A digital elevation model (DEM) is a digital representation of ground surface topography or terrain. It can be represented as a raster (a grid of squares) and is commonly estimated using remote sensing techniques or land surveying. In this research, a 3D model of the University of Baghdad campus was produced using a DEM, where the easting, northing, and elevation of 400 locations were obtained by field survey using the Global Positioning System (GPS). The image of the investigated area was extracted from the QuickBird satellite sensor (with a spatial resolution of 0.6 m). This image was geo-referenced by selecting ground control points from the GPS survey, and the rectification was performed using a 1st-order polynomial transformation.
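A 1st-order polynomial rectification is an affine mapping E = a0 + a1·col + a2·row (and similarly for N), which three ground control points determine exactly. The sketch below illustrates this with made-up coordinates; the GCP values are not the surveyed GPS data.

```python
# Sketch: 1st-order polynomial (affine) georeferencing from three GCPs.
# E = a0 + a1*col + a2*row ;  N = b0 + b1*col + b2*row
def solve3(M, v):
    # Cramer's rule for a 3x3 linear system M x = v
    def det(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
              - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
              + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))
    D = det(M)
    out = []
    for i in range(3):
        Mi = [row[:] for row in M]
        for r in range(3):
            Mi[r][i] = v[r]
        out.append(det(Mi) / D)
    return out

def fit_affine(gcps):
    # gcps: three ((col, row), (easting, northing)) pairs
    M = [[1.0, c, r] for (c, r), _ in gcps]
    a = solve3(M, [e for _, (e, _n) in gcps])
    b = solve3(M, [n for _, (_e, n) in gcps])
    return a, b

def to_ground(a, b, col, row):
    return (a[0] + a[1] * col + a[2] * row,
            b[0] + b[1] * col + b[2] * row)
```

With more than three GCPs, the same model is fitted by least squares, which is what GIS packages do when rectifying against many control points.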
Groupwise non-rigid image alignment is a difficult non-linear optimization problem involving many parameters and often large datasets. Previous methods have explored various metrics and optimization strategies. Good results have previously been achieved with simple metrics requiring complex optimization, often with many unintuitive parameters that need careful tuning for each dataset. In this chapter, the problem is restructured to use a simpler, iterative optimization algorithm with very few free parameters. The warps are refined using an iterative Levenberg-Marquardt minimization to the mean, based on updating the locations of a small number of points and incorporating a stiffness constraint. This optimization approach is eff
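To make the Levenberg-Marquardt idea concrete, the toy sketch below applies the damped Gauss-Newton update to a one-parameter exponential fit. It illustrates only the update rule (J^T J + λ)δ = J^T r with adaptive damping, not the chapter's groupwise warp optimization; the model and data are assumptions for illustration.

```python
# Sketch: one-parameter Levenberg-Marquardt fit of y = exp(k*x).
import math

def lm_fit(xs, ys, k0, lam=1e-3, iters=50):
    k = lam_k = k0
    k = k0
    for _ in range(iters):
        r = [y - math.exp(k * x) for x, y in zip(xs, ys)]   # residuals
        J = [x * math.exp(k * x) for x in xs]               # d(model)/dk
        JtJ = sum(j * j for j in J)
        Jtr = sum(j * ri for j, ri in zip(J, r))
        step = Jtr / (JtJ + lam)                            # damped GN step
        new_k = k + step
        err = sum(ri * ri for ri in r)
        new_err = sum((y - math.exp(new_k * x)) ** 2 for x, y in zip(xs, ys))
        if new_err < err:
            k, lam = new_k, lam / 2    # accept step, relax damping
        else:
            lam *= 10                  # reject step, increase damping
    return k
```

The damping parameter λ interpolates between gradient descent (large λ) and Gauss-Newton (small λ), which is what makes the method robust with few tunable parameters.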
Permeability data is of major importance and should be handled carefully in all reservoir simulation studies. The importance of permeability data increases in mature oil and gas fields due to its sensitivity to the requirements of some specific improved-recovery methods. However, the industry holds a huge store of air permeability measurements against a small number of liquid permeability values, owing to the relatively high cost of special core analysis.
The current study suggests a correlation to convert air permeability data, conventionally measured during laboratory core analysis, into liquid permeability. This correlation offers a feasible estimate in cases of data loss and poorly consolidated formations, or in cas
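The paper's own correlation is not given in this excerpt. As a generic, hedged illustration of converting gas-measured to liquid permeability, the sketch below applies the classical Klinkenberg slippage correction, treating the slippage factor b as a known input; this is a stand-in for, not a reproduction of, the study's correlation.

```python
# Sketch: Klinkenberg gas-slippage correction (illustrative, not the paper's model).
# Klinkenberg: k_air = k_liquid * (1 + b / p_mean)  =>  invert for k_liquid.
def liquid_permeability(k_air, b, p_mean):
    # k_air   : air permeability measured at mean pore pressure p_mean
    # b       : gas-slippage factor (pressure units matching p_mean)
    # returns : equivalent liquid (Klinkenberg-corrected) permeability
    return k_air / (1.0 + b / p_mean)
```

In practice b is itself estimated by measuring k_air at several mean pressures and extrapolating to infinite pressure.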
Introduction: Although the soap industry has been known for hundreds of years, it has seen little development. That development involved the mechanical equipment and the additive materials necessary to produce soap with the best specifications of shape and physical and chemical properties. Objectives: This research studies the use of the vacuum reactive distillation (VRD) technique for soap production. Methods: Olein and palmitin in a 3:1 ratio were mixed in a flask with a stoichiometric amount of NaOH solution under different vacuum pressures from -0.35 to -0.5 bar. Total conversion was reached using the VRD technique. The soap produced by the VRD method was compared with soap prepared by the reaction-only method which
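The stoichiometric NaOH amount mentioned in the method follows from saponification consuming 3 mol NaOH per mol of triglyceride. The sketch below computes it for an olein/palmitin blend; the molecular weights are approximate textbook values (triolein ≈ 885.4 g/mol, tripalmitin ≈ 807.3 g/mol), assumed for illustration rather than taken from the paper.

```python
# Sketch: stoichiometric NaOH mass for saponifying a triglyceride blend.
# Saponification: 1 triglyceride + 3 NaOH -> 3 soap + glycerol
def naoh_required(m_olein, m_palmitin,
                  mw_olein=885.4, mw_palmitin=807.3, mw_naoh=40.0):
    moles_tg = m_olein / mw_olein + m_palmitin / mw_palmitin
    return 3.0 * moles_tg * mw_naoh   # grams of NaOH
```

For the 3:1 olein-to-palmitin ratio used in the study, the NaOH charge would be computed from the actual batch masses in the same way.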