In the present work, a set of indoor radon concentration measurements was carried out in a number of rooms and buildings of the College of Science, University of Mustansiriyah, for the first time in Iraq, using a RAD-7 detector, an active method with short measuring times compared with the passive method of solid-state nuclear track detectors (SSNTDs). The results show that the radon concentration values vary from 9.85±1.7 Bq/m³ to 94.21±34.7 Bq/m³, with an average of 53.64±26 Bq/m³, which is lower than the recommended action level of 200-300 Bq/m³ [ICRP, 2009].
The values of the annual effective dose (AED) vary from 0.25 mSv/y to 2.38 mSv/y, with an average of 1.46±0.67 mSv/y, which is lower than the recommended range of 3-10 mSv/y [ICRP, 1993]. The values of lung cancer cases per year per million persons vary from 4.50 to 42.84, with an average of 24.35±12 per million persons, which is lower than the recommended range of 170-230 per million persons [ICRP, 1993].
The values of the potential alpha energy concentration were found to vary from 1.06 mWL to 10.18 mWL, with an average of 5.79±2.8 mWL, which is lower than the recommended value of 53.33 mWL given by [UNSCEAR, 1993].
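The dose quantities above can be sketched from a measured radon concentration using the standard UNSCEAR-style conversion factors. This is a minimal illustration, not the paper's exact calculation: the equilibrium factor (0.4), indoor occupancy (7000 h/y), and dose conversion coefficient (9 nSv per Bq·h/m³) are the commonly cited defaults, and the resulting AED (about 1.35 mSv/y for the reported average) differs slightly from the abstract's 1.46 mSv/y, presumably because the authors used slightly different factor values.

```python
# Hedged sketch of the standard indoor-radon dose quantities.
# Factor values are common UNSCEAR defaults, not taken from the paper.

EQUILIBRIUM_FACTOR = 0.4   # F, radon-progeny equilibrium indoors
OCCUPANCY_HOURS = 7000     # hours per year spent indoors (~0.8 * 8760)
DOSE_COEFF = 9e-6          # mSv per (Bq h m^-3), dose conversion factor

def annual_effective_dose(radon_bq_m3: float) -> float:
    """Annual effective dose in mSv/y from a mean radon concentration."""
    return radon_bq_m3 * EQUILIBRIUM_FACTOR * OCCUPANCY_HOURS * DOSE_COEFF

def paec_mwl(radon_bq_m3: float) -> float:
    """Potential alpha energy concentration in mWL.
    1 WL corresponds to 3700 Bq/m^3 of equilibrium-equivalent radon."""
    return radon_bq_m3 * EQUILIBRIUM_FACTOR / 3.7

avg = 53.64  # average concentration reported in the abstract, Bq/m^3
print(f"AED  = {annual_effective_dose(avg):.2f} mSv/y")
print(f"PAEC = {paec_mwl(avg):.2f} mWL")   # ~5.80 mWL, matching the abstract
```

Note that the PAEC formula reproduces the abstract's average of 5.79 mWL almost exactly, which supports the assumed equilibrium factor of 0.4.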
Amplitude variation with offset (AVO) analysis is an efficient tool for hydrocarbon detection and for identifying elastic rock properties and fluid types. It has been applied in the present study using reprocessed pre-stack 2D seismic data (1992, Caulerpa) from the north-west of the Bonaparte Basin, Australia. The AVO response along the 2D pre-stack seismic data in the Laminaria High, NW shelf of Australia, was also investigated. Three hypotheses were suggested to investigate the AVO behaviour of the amplitude anomalies, in which three different factors were tested: fluid substitution, porosity, and thickness (wedge model). The AVO models with the synthetic gathers were analysed using log information to find which of these is the
This study includes estimating the scale parameter, location parameter, and reliability function for the Extreme Value (EXV) distribution by two methods, namely:
- Maximum Likelihood Method (MLE).
- Probability Weighted Moments Method (PWM).
Simulation was used to generate the required samples for estimating the parameters and the reliability function at different sample sizes (n = 10, 25, 50, 100), with given true values for the parameters, and each simulation experiment was replicated (RP = 1000) times.
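The PWM estimators for the Gumbel (Extreme Value type I) distribution have a convenient closed form, which a small simulation like the one described above can exercise. The sketch below is illustrative, not the paper's code: it draws a Gumbel sample by inverse-CDF sampling and recovers the location and scale parameters from the first two probability weighted moments.

```python
import math
import random

# Hedged sketch: closed-form PWM estimators for the Gumbel distribution.
EULER_GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

def pwm_gumbel(sample):
    """Return (location, scale) PWM estimates for a Gumbel sample."""
    x = sorted(sample)
    n = len(x)
    b0 = sum(x) / n                                       # ordinary mean
    b1 = sum((i / (n - 1)) * x[i] for i in range(n)) / n  # first PWM
    scale = (2 * b1 - b0) / math.log(2)
    location = b0 - EULER_GAMMA * scale
    return location, scale

def rgumbel(n, loc, scale, rng):
    """Inverse-CDF sampling: x = loc - scale * ln(-ln U)."""
    return [loc - scale * math.log(-math.log(rng.random())) for _ in range(n)]

rng = random.Random(0)
sample = rgumbel(5000, loc=2.0, scale=1.5, rng=rng)
loc_hat, scale_hat = pwm_gumbel(sample)
print(round(loc_hat, 2), round(scale_hat, 2))  # close to (2.0, 1.5)
```

With small samples such as n = 10 the estimates scatter much more, which is exactly the behaviour a replicated simulation study (RP = 1000) is designed to quantify.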
In this work, GPS, which offers the best accuracy, was used to establish a set of ground control points (GCPs). Two satellite images were also used, the first a high-resolution QuickBird image and the second a low-resolution Landsat image, together with topographic maps at 1:100,000 and 1:250,000 scales. These inputs (GPS, the two satellite images, topographic maps at different scales, and the set of GCPs) were combined in the analysis. The work is divided into two parts: a geometric accuracy investigation and an informative accuracy investigation. The first part presents the geometric correction of the two satellite images and the maps.
The second part of the results demonstrates the features (how the features appear) of the topographic map or pictorial map (image map), where i
Steganography is defined as hiding confidential information in some other chosen medium without leaving any clear evidence of changing the medium's features. Most traditional hiding methods embed the message directly in the cover medium (text, image, audio, or video). Some hiding techniques leave a negative effect on the cover image, so the change in the carrier medium can sometimes be detected by humans and machines. The purpose of the proposed hiding method is to make this change undetectable. The current research focuses on using a complex method, based on a spiral search method, to prevent the detection of hidden information by humans and machines; the Structural Similarity Index (SSIM) measure is used to assess the accuracy and quality
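One plausible reading of a "spiral search" embedding order is that pixels are visited in an outward spiral from the image centre rather than row by row, so the secret bit-stream is spread over the cover image in a non-obvious order. The sketch below implements that traversal; the starting point, direction, and function names are illustrative assumptions, not taken from the paper.

```python
# Hedged sketch: visit all cells of a rows x cols grid in an outward
# clockwise spiral starting from the centre cell. An embedder could use
# this coordinate order to decide where successive secret bits go.

def spiral_order(rows, cols):
    r, c = rows // 2, cols // 2
    coords = [(r, c)]
    dirs = [(0, 1), (1, 0), (0, -1), (-1, 0)]  # right, down, left, up
    step, d = 1, 0
    while len(coords) < rows * cols:
        for _ in range(2):          # two legs of equal length per ring turn
            dr, dc = dirs[d % 4]
            for _ in range(step):
                r, c = r + dr, c + dc
                if 0 <= r < rows and 0 <= c < cols:
                    coords.append((r, c))
            d += 1
        step += 1
    return coords

print(spiral_order(3, 3))
```

For a 3x3 grid this yields the centre cell first, then its ring: (1,1), (1,2), (2,2), (2,1), (2,0), (1,0), (0,0), (0,1), (0,2).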
Logistic regression is considered one of the statistical methods used to describe and estimate the relationship between a random response variable (Y) and explanatory variables (X), in which the dependent variable is a binary response taking two values (one when a specific event occurred and zero when it did not), such as (injured and uninjured, married and unmarried). A large number of explanatory variables leads to the problem of multicollinearity, which makes the estimates inaccurate. The maximum likelihood method and the ridge regression method were used in estimating a binary-response logistic regression model by adopting the Jackknife
The support vector machine, also known as SVM, is a type of supervised learning model that can be used for classification or regression depending on the dataset. SVM classifies data points by determining the best hyperplane between two or more groups. Working with enormous datasets, however, can cause a variety of issues, including reduced accuracy and long run times. SVM was updated in this research by applying some non-linear kernel transformations: linear, polynomial, radial basis, and multi-layer kernels. The non-linear SVM classification model was illustrated and summarized in an algorithm using kernel tricks. The proposed method was examined using three simulation datasets with different sample
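The kernels named above can be written out explicitly. The "kernel trick" is that each k(x, z) equals an inner product in some feature space, so a linear SVM in that space acts as a non-linear classifier in the original space. The sketch below shows three of the named kernels with illustrative default parameters; it is a minimal demonstration, not the paper's implementation.

```python
import math

# Hedged sketch: explicit kernel functions used by kernelized SVMs.
# Parameter defaults (degree, coef0, gamma) are illustrative choices.

def dot(x, z):
    return sum(a * b for a, b in zip(x, z))

def linear_kernel(x, z):
    return dot(x, z)

def polynomial_kernel(x, z, degree=3, coef0=1.0):
    return (dot(x, z) + coef0) ** degree

def rbf_kernel(x, z, gamma=0.5):
    sq = sum((a - b) ** 2 for a, b in zip(x, z))  # squared Euclidean distance
    return math.exp(-gamma * sq)

x, z = [1.0, 2.0], [3.0, 0.5]
print(linear_kernel(x, z))       # 4.0
print(polynomial_kernel(x, z))   # (4 + 1)^3 = 125.0
print(round(rbf_kernel(x, z), 4))
```

Swapping the kernel changes the decision boundary's shape without changing the SVM optimization itself, which is why a single algorithm can cover all the kernel variants the abstract lists.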
The growing use of tele
This paper presents a new secret diffusion scheme called Round Key Permutation (RKP), based on a nonlinear, dynamic, and pseudorandom permutation, for encrypting images by block. Images are considered particular data because of their size and information content: they are two-dimensional in nature and characterized by high redundancy and strong correlation. Firstly, the permutation table is calculated according to the master key and sub-keys. Secondly, the pixels of each block to be encrypted are scrambled according to the permutation table. Thereafter the AES encryption algorithm is used in the proposed cryptosystem by replacing the linear permutation of the ShiftRows step with the nonlinear and secret permutation
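A key-dependent block permutation of the kind described can be sketched as follows. The way the table is derived here (seeding a PRNG with a hash of the key and shuffling the index list) is an illustrative stand-in, not the paper's exact key schedule; the scramble/unscramble pair shows the round-trip property any such permutation must satisfy.

```python
import hashlib
import random

# Hedged sketch: derive a pseudorandom permutation table from a key and
# use it to scramble/unscramble one image block. Names are illustrative.

def permutation_table(key: bytes, block_size: int):
    """Derive a key-dependent permutation of block indices."""
    seed = int.from_bytes(hashlib.sha256(key).digest(), "big")
    table = list(range(block_size))
    random.Random(seed).shuffle(table)
    return table

def scramble(block, table):
    """Reorder block elements: output[i] = block[table[i]]."""
    return [block[j] for j in table]

def unscramble(block, table):
    """Invert scramble() using the same table."""
    out = [0] * len(block)
    for i, j in enumerate(table):
        out[j] = block[i]
    return out

key = b"master-key"            # placeholder key
table = permutation_table(key, 16)
block = list(range(100, 116))  # one 16-pixel block
assert unscramble(scramble(block, table), table) == block
print("round-trip OK")
```

Because the table depends only on the key material, the receiver can rebuild it and invert the scrambling exactly, while an attacker without the key sees only a pseudorandom reordering.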
Web testing is a very important method for users and developers because it makes it possible to detect errors in applications and check the quality of the services they provide to users. Performance, user interface, security, and other types of web testing may be applied to a web application. This paper focuses on a major branch of performance testing called load testing. Load testing depends on two important elements: request time and response time. From these elements, it can be decided whether the performance of a web application is acceptable. In the experimental results, load testing was applied to the website (http://ihcoedu.uobaghdad.edu.iq), covering the main home page and all the science department pages. In t
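The core of a load test as described above is timing repeated requests and summarising the response times. The sketch below uses a local placeholder function standing in for a real HTTP GET (e.g. to the tested site), so it runs without network access; the function names and request count are illustrative.

```python
import statistics
import time

# Hedged sketch: repeat a request N times, record each response time,
# and summarise. fetch_placeholder() stands in for a real HTTP request.

def fetch_placeholder():
    """Placeholder for an HTTP GET; simulates a little server work."""
    time.sleep(0.001)

def load_test(fetch, n_requests):
    """Call fetch() n_requests times; return response times in seconds."""
    times = []
    for _ in range(n_requests):
        start = time.perf_counter()
        fetch()
        times.append(time.perf_counter() - start)
    return times

times = load_test(fetch_placeholder, 20)
print(f"avg={statistics.mean(times) * 1000:.1f} ms  "
      f"max={max(times) * 1000:.1f} ms")
```

A real harness would issue concurrent requests and compare the measured averages against a response-time threshold, which is how the request-time/response-time elements mentioned in the abstract turn into a pass/fail verdict.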