In this study, the mean free path and positron elastic and inelastic scattering are modeled for the elements hydrogen (H), carbon (C), nitrogen (N), oxygen (O), phosphorus (P), sulfur (S), chlorine (Cl), potassium (K), and iodine (I). Despite the enormous amount of data required, the Monte Carlo (MC) method was applied, allowing a very accurate simulation of positron collisions in living cells. The MC simulation of positron interactions with breast, liver, and thyroid tissue at normal incidence is reported for energies ranging from 45 eV to 0.2 MeV. The model provides a straightforward analytic formula for the random sampling of positron scattering. The elemental composition data were compiled from ICRU44. …
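The random sampling of the distance a positron travels between collisions can be illustrated with the standard inverse-transform rule s = -λ ln(r). This is a generic MC transport step, not the paper's specific analytic formula, and the mean-free-path value below is purely illustrative:

```python
import math
import random

def sample_free_path(mean_free_path, rng):
    """Inverse-transform sampling of the distance to the next collision:
    s = -lambda * ln(r), with r uniform on (0, 1]."""
    r = 1.0 - rng.random()   # shift [0, 1) to (0, 1] so log() is finite
    return -mean_free_path * math.log(r)

# The sample mean over many draws should approach the mean free path.
rng = random.Random(42)
LAMBDA = 5.0                 # illustrative value, not taken from the paper
samples = [sample_free_path(LAMBDA, rng) for _ in range(100_000)]
mean_path = sum(samples) / len(samples)
```

Because path lengths are exponentially distributed with mean λ, averaging many samples recovers the mean free path, which is a convenient sanity check on the sampler.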
A natural dye-sensitized solar cell was prepared using strawberry and pomegranate dyes with anatase nanocrystalline titanium dioxide powder. The optical properties of the two dyes, including their absorption spectra, were studied in the visible region, and the I-V characteristics under illumination were measured. The results showed that the two prepared dye-sensitized solar cells have acceptable efficiency values of about 0.94 (with a fill factor of 45) and 0.74 (with a fill factor of 44) for the strawberry and pomegranate dyes, respectively.
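The reported efficiency and fill factor relate to the measured I-V curve through the usual definitions FF = P_max / (V_oc · I_sc) and η = P_max / P_in. A minimal sketch, using hypothetical I-V samples that are not the measured data:

```python
def fill_factor(v, i):
    """FF = P_max / (V_oc * I_sc), estimated from sampled I-V points."""
    p_max = max(vi * ii for vi, ii in zip(v, i))
    v_oc = max(v)   # voltage where the current reaches zero
    i_sc = max(i)   # current where the voltage is zero
    return p_max / (v_oc * i_sc)

def efficiency_percent(p_max, p_in):
    """Conversion efficiency eta = P_max / P_in, as a percentage."""
    return 100.0 * p_max / p_in

# Hypothetical I-V samples for illustration only (not the measured data).
v = [0.0, 0.1, 0.2, 0.3, 0.4, 0.45]
i = [2.0, 1.9, 1.7, 1.3, 0.6, 0.0]
ff = fill_factor(v, i)
```

With a finer voltage sweep, the same two formulas reproduce the efficiency and fill-factor figures quoted for each dye.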
Steganography is defined as hiding confidential information in some other chosen medium without leaving any clear evidence of changing the medium's features. Most traditional hiding methods embed the message directly in the cover medium (text, image, audio, or video). Some hiding techniques degrade the cover image, so the change in the carrier medium can sometimes be detected by humans or machines. The purpose of information hiding is to make this change undetectable. The current research focuses on using a more complex method, based on a spiral search, to prevent the detection of hidden information by humans and machines; the Structural Similarity Index Measure (SSIM) is used to assess the accuracy and quality …
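The abstract does not specify the exact embedding order, but the idea of a spiral search over an image can be illustrated by generating pixel coordinates in an outward spiral from the image centre. A hypothetical sketch, not the paper's algorithm:

```python
def spiral_coords(rows, cols):
    """Return all (row, col) coordinates of a rows x cols image in an
    outward spiral order starting from the centre; a spiral-search
    embedder could visit pixels in this order instead of raster order."""
    moves = [(0, 1), (1, 0), (0, -1), (-1, 0)]  # right, down, left, up
    r, c = rows // 2, cols // 2
    out, seen = [], set()
    step, d = 1, 0
    while len(out) < rows * cols:
        for _ in range(2):          # two legs share the same leg length
            dr, dc = moves[d % 4]
            for _ in range(step):
                if 0 <= r < rows and 0 <= c < cols and (r, c) not in seen:
                    seen.add((r, c))
                    out.append((r, c))
                r, c = r + dr, c + dc
            d += 1
        step += 1
    return out
```

Off-grid positions in a ring are simply skipped, so the traversal covers every pixel of a rectangular image exactly once.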
Logistic regression is considered one of the statistical methods used to describe and estimate the relationship between a random response variable (Y) and explanatory variables (X), under assumptions that include homogeneity of variance. The dependent variable is a binary response that takes two values (one when a specific event occurs and zero when it does not), as in injured/uninjured or married/unmarried. A large number of explanatory variables leads to the problem of multicollinearity, which makes the estimates inaccurate. The maximum likelihood method and ridge regression were used to estimate the binary-response logistic regression model by adopting the Jackknife …
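A ridge-penalized binary logistic regression with leave-one-out (jackknife) refits can be sketched as follows. This is a generic illustration on toy, hypothetical data, not the paper's estimator:

```python
import math

def fit_ridge_logistic(X, y, lam=1.0, lr=0.1, iters=2000):
    """Binary logistic regression with an L2 (ridge) penalty on the
    slopes, fitted by plain gradient descent."""
    n, p = len(X), len(X[0])
    w, b = [0.0] * p, 0.0
    for _ in range(iters):
        gw = [lam * wj for wj in w]          # ridge gradient term
        gb = 0.0
        for xi, yi in zip(X, y):
            z = b + sum(wj * xj for wj, xj in zip(w, xi))
            err = 1.0 / (1.0 + math.exp(-z)) - yi
            for j in range(p):
                gw[j] += err * xi[j]
            gb += err
        w = [wj - lr * gj / n for wj, gj in zip(w, gw)]
        b -= lr * gb / n
    return w, b

def jackknife(X, y, lam=1.0):
    """Leave-one-out refits: n coefficient vectors whose spread
    estimates the variability of the estimates."""
    return [fit_ridge_logistic(X[:k] + X[k+1:], y[:k] + y[k+1:], lam)
            for k in range(len(X))]

# Toy data: the event (y = 1) occurs for larger x.
X = [[0.0], [0.5], [1.0], [1.5], [2.0], [2.5], [3.0], [3.5]]
y = [0, 0, 0, 0, 1, 1, 1, 1]
w, b = fit_ridge_logistic(X, y, lam=0.5)
```

The ridge penalty shrinks the slope estimates, which is the standard remedy when multicollinearity inflates their variance, and the jackknife refits quantify that variance empirically.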
Radio observations of astronomical sources such as supernovae have become one of the most important sources of information about the physical properties of those objects. However, such observations are affected by various types of noise, including noise from the sky, the background, the receiver, and the system itself. It is therefore essential to eliminate or reduce this undesired noise in order to ensure accurate measurement and analysis of radio observations. One of the most commonly used methods for reducing noise is a noise calibrator. In this study, the 3-m Baghdad University Radio Telescope (BURT) was used to observe the Crab Nebula with and without a calibration unit in order to investigate its impact on the signal …
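One common way a calibration noise source is used is the Y-factor method, which estimates the system noise temperature from power readings taken with the calibrator switched on and off. A minimal sketch with hypothetical readings; the formula is the standard one, not taken from the paper:

```python
def system_temperature(p_cal_on, p_cal_off, t_cal):
    """Y-factor estimate of the receiver system noise temperature:
    Y = P_on / P_off  ->  Tsys = Tcal / (Y - 1)."""
    y = p_cal_on / p_cal_off
    return t_cal / (y - 1.0)

# Hypothetical power readings (arbitrary units) and calibrator temperature (K).
tsys = system_temperature(p_cal_on=1.25, p_cal_off=1.00, t_cal=25.0)
```

A 25% power rise from a 25 K calibrator implies Tsys = 100 K; a weaker rise for the same calibrator would indicate a noisier system.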
Web testing is an important practice for both users and developers because it makes it possible to detect errors in web applications and to check their quality with respect to performance, user interface, security, and the other aspects that arise in a web application. This paper focuses on a major branch of performance testing called load testing. Load testing depends on two important elements, request time and response time; from these it can be decided whether the performance of a web application is good or not. In the experimental results, load testing was applied to the website (http://ihcoedu.uobaghdad.edu.iq), covering the main home page and all the science department pages. …
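Measuring request/response time for a page can be sketched with a simple sequential probe. This is a generic illustration, not the load-testing tool used in the paper:

```python
import time
from urllib.request import urlopen

def response_time(url, timeout=10.0):
    """One request/response round trip, in seconds."""
    start = time.perf_counter()
    with urlopen(url, timeout=timeout) as resp:
        resp.read()
    return time.perf_counter() - start

def summarize(times):
    """Min / mean / max of a list of measured response times."""
    return {"min": min(times),
            "mean": sum(times) / len(times),
            "max": max(times)}

def load_test(url, n_requests=10):
    """Issue n sequential requests and summarize the timings."""
    return summarize([response_time(url) for _ in range(n_requests)])

# Example (needs network access, so left commented out):
# print(load_test("http://ihcoedu.uobaghdad.edu.iq", n_requests=5))
```

Real load tests issue requests concurrently to simulate many users; the sequential version above only measures baseline response time.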
With its rapid spread, the coronavirus infection shocked the world and had a huge effect on billions of people's lives. The problem is to find a safe method of diagnosing infections with fewer casualties. It has been shown that X-ray images are an important means of identifying, quantifying, and monitoring disease. Deep learning algorithms can be utilized to help analyze potentially huge numbers of X-ray examinations. This research developed a retrospective multi-test analysis system to detect suspicious COVID-19 findings and used chest X-ray features to assess the progress of the illness in each patient, producing a "corona score"; the results were satisfactory compared with the benchmark techniques. …
XML is being incorporated into the foundation of e-business data applications. This paper addresses the problem of the freeform information stored in any organization and shows how XML, with this new approach, makes searching very efficient and fast. The paper introduces a new solution and methodology developed to capture and manage such unstructured freeform information (multi-information), depending on XML Schema technologies, a neural network approach, and an object-oriented relational database, in order to provide a practical solution for efficiently managing a multi-freeform information system.