The principal source of radiation dose to humans from naturally occurring radioactive material is recognized to be radon and its progeny in the indoor environment. Radiation-related health risks arise from radon in the water supply, which can be inhaled or ingested. Materials and Methods: The solid-state CR-39 nuclear track detector method was used in this study to measure the accumulated radioactivity in the water supply at different locations in the southwest of Baghdad, Iraq. In the Baghdad district, 42 samples were collected from 14 regions (3 samples from each region) and placed in dosimeters for 50 days. Results: The mean radon concentration was 49.75 Bq/m3, which is lower than the internationally recognized limit of 1100 Bq/m3. The total annual effective dose in millisieverts per year (mSv/y) and the alpha-energy concentration were estimated. Within the study area, a linear relation between the annual effective dose (mSv/y) and the radon concentration was established. Conclusion: According to the findings, radon concentrations in the drinking water supplies are below the EPA's and WHO's recommended levels.
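The abstract does not give the dose equation the authors used; the sketch below only illustrates the commonly used linear ingestion-dose relation between radon concentration and annual effective dose, with assumed (illustrative) intake and dose-conversion values that are not taken from the paper.

```python
# Minimal sketch: annual effective dose from radon in drinking water.
# AED (mSv/y) = C_Rn (Bq/L) * annual intake (L/y) * dose factor (Sv/Bq) * 1e3
# The intake and dose-factor values below are illustrative assumptions,
# not parameters reported in the paper.

def annual_effective_dose(radon_bq_per_l,
                          intake_l_per_y=730.0,           # assumed adult water intake
                          dose_factor_sv_per_bq=3.5e-9):  # assumed ingestion dose factor
    """Return the annual effective dose in mSv/y (linear in concentration)."""
    return radon_bq_per_l * intake_l_per_y * dose_factor_sv_per_bq * 1e3

# Example: a sample with 0.05 Bq/L (i.e. 50 Bq/m3)
print(f"{annual_effective_dose(0.05):.6f} mSv/y")
```

Because the relation is linear in the concentration, plotting the estimated dose against the measured radon concentration yields the straight-line relationship reported in the study.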
In this paper, previous studies of fuzzy regression are reviewed. Fuzzy regression is a generalization of the traditional regression model that formulates the relationship between independent and dependent variables in a fuzzy environment. It can be introduced through a non-parametric model as well as a semi-parametric model. The results obtained in the previous studies and their conclusions are also put forward in this context. We then suggest a novel method of estimation using new weights instead of the old weights, and introduce another suggestion based on artificial neural networks.
Paper Type: Review article.
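The abstract does not specify the proposed weighting scheme; the sketch below only illustrates the general idea of weight-based estimation in a regression setting, using ordinary weighted least squares with arbitrary illustrative weights (all names, weights, and data are assumptions, not the reviewed methods).

```python
import numpy as np

# Illustrative sketch of weight-based regression estimation (not the paper's
# actual weights): weighted least squares  beta = (X'WX)^{-1} X'Wy.

rng = np.random.default_rng(0)
X = np.column_stack([np.ones(50), rng.uniform(0, 10, 50)])   # design matrix
y = 2.0 + 0.5 * X[:, 1] + rng.normal(0, 1, 50)               # synthetic response

w = 1.0 / (1.0 + np.abs(X[:, 1] - X[:, 1].mean()))           # assumed weights
W = np.diag(w)

beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
print("estimated coefficients:", beta)
```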
Empirical equations for estimating the thickening time and compressive strength of bentonitic class "G" cement slurries were derived as functions of the water-to-cement ratio and the apparent viscosity (for any ratio). The availability of such equations makes it easy to extract thickening-time and compressive-strength values in the oil field and saves time, without resorting to the standard laboratory tests, such as the pressurized consistometer for the thickening-time test and the hydraulic cement mortar procedure with a 24-hour water bath for the compressive-strength test, which may take more than one day.
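The abstract does not give the functional form or coefficients of the derived equations; the sketch below only shows how a two-variable empirical correlation of this kind could be fitted and evaluated, using a hypothetical linear form and synthetic data (the model form, variable names, and numbers are assumptions, not the paper's results).

```python
import numpy as np

# Hypothetical correlation form (not the paper's equation):
#   thickening_time = a0 + a1 * (water/cement ratio) + a2 * (apparent viscosity)
wcr = np.array([0.38, 0.44, 0.50, 0.56, 0.62])        # synthetic water/cement ratios
mu  = np.array([55.0, 48.0, 42.0, 37.0, 33.0])        # synthetic apparent viscosities, cP
tt  = np.array([95.0, 110.0, 128.0, 150.0, 171.0])    # synthetic thickening times, min

A = np.column_stack([np.ones_like(wcr), wcr, mu])     # design matrix for least squares
coeffs, *_ = np.linalg.lstsq(A, tt, rcond=None)
a0, a1, a2 = coeffs

def thickening_time(water_cement_ratio, apparent_viscosity):
    """Evaluate the fitted (illustrative) correlation."""
    return a0 + a1 * water_cement_ratio + a2 * apparent_viscosity

print(f"predicted thickening time at w/c = 0.5, mu = 45 cP: {thickening_time(0.5, 45.0):.1f} min")
```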
This paper presents a comparison of denoising techniques using a statistical approach, principal component analysis with local pixel grouping (PCA-LPG); the procedure is iterated a second time to further improve the denoising performance, and other enhancement filters are also used. These include an adaptive Wiener low-pass filter applied to a grayscale image degraded by constant-power additive noise, based on statistics estimated from a local neighborhood of each pixel; a median filter of the input noisy image, in which each output pixel contains the median value of the M-by-N neighborhood around the corresponding pixel in the input image; a Gaussian low-pass filter; and an order-statistic filter. Experimental results show that the LPG-PCA method
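As a point of reference for the classical filters named above, the sketch below applies Wiener, median, and Gaussian filtering to a noisy grayscale image and reports PSNR, assuming SciPy and scikit-image are available; the window sizes and noise level are illustrative, and the PCA-LPG step itself is not reproduced here.

```python
import numpy as np
from scipy.signal import wiener
from scipy.ndimage import median_filter, gaussian_filter
from skimage import data, img_as_float
from skimage.metrics import peak_signal_noise_ratio as psnr

# Illustrative comparison of the classical filters mentioned in the abstract.
clean = img_as_float(data.camera())
noisy = clean + np.random.default_rng(0).normal(0, 0.05, clean.shape)

results = {
    "adaptive Wiener (5x5)": wiener(noisy, mysize=(5, 5)),
    "median (3x3)":          median_filter(noisy, size=3),
    "Gaussian (sigma=1)":    gaussian_filter(noisy, sigma=1.0),
}

print(f"noisy input PSNR: {psnr(clean, np.clip(noisy, 0, 1), data_range=1.0):.2f} dB")
for name, out in results.items():
    print(f"{name}: {psnr(clean, np.clip(out, 0, 1), data_range=1.0):.2f} dB")
```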
Prediction of accurate values of the residual entropy (SR) is a necessary step in the calculation of the entropy. In this paper, different equations of state were tested against the available 2791 experimental data points of 20 pure superheated vapor compounds (14 pure nonpolar compounds + 6 pure polar compounds). The Average Absolute Deviation (AAD) in SR over the 2791 experimental data points of all 20 pure compounds (nonpolar and polar) when using the Lee-Kesler, Peng-Robinson, Virial truncated to the second and to the third term, and Soave-Redlich-Kwong equations was 4.0591, 4.5849, 4.9686, 5.0350, and 4.3084 J/mol.K, respectively. It was found from these results that the Lee-Kesler equation was the best (most accurate) one.
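The AAD figures above follow directly from the definition AAD = (1/N) Σ |SR,calc − SR,exp|; a minimal sketch of that calculation is shown below (the array contents are placeholders, not the paper's 2791-point data set).

```python
import numpy as np

def average_absolute_deviation(sr_calc, sr_exp):
    """AAD = (1/N) * sum(|SR_calc - SR_exp|), in the same units as SR (J/mol.K)."""
    sr_calc = np.asarray(sr_calc, dtype=float)
    sr_exp = np.asarray(sr_exp, dtype=float)
    return np.mean(np.abs(sr_calc - sr_exp))

# Placeholder values only: the experimental data set is not reproduced here.
sr_experimental = np.array([-35.2, -28.7, -41.9])
sr_from_eos     = np.array([-31.0, -30.1, -45.3])
print(f"AAD = {average_absolute_deviation(sr_from_eos, sr_experimental):.4f} J/mol.K")
```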
Logistic regression is considered one of the statistical methods used to describe and estimate the relationship between a random response variable (Y) and explanatory variables (X), in which the dependent variable is a binary response taking two values (one when a specific event occurs and zero when it does not occur), such as (injured and uninjured, married and unmarried). A large number of explanatory variables leads to the problem of multicollinearity, which makes the estimates inaccurate. The maximum likelihood method and the ridge regression method were used to estimate the binary-response logistic regression model by adopting the Jackknife
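The abstract does not include the estimation details; the sketch below only illustrates how a ridge-penalized (L2) logistic fit could be combined with delete-one Jackknife resampling, assuming scikit-learn and synthetic data (all names, penalties, and values are illustrative, not the paper's).

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic binary-response data (illustrative only).
rng = np.random.default_rng(1)
X = rng.normal(size=(80, 3))
y = (X @ np.array([1.5, -2.0, 0.5]) + rng.normal(0, 1, 80) > 0).astype(int)

def fit_coefs(Xs, ys):
    """Ridge-penalized (L2) logistic regression coefficients."""
    model = LogisticRegression(penalty="l2", C=1.0, max_iter=1000).fit(Xs, ys)
    return model.coef_.ravel()

full = fit_coefs(X, y)

# Delete-one Jackknife over observations.
n = len(y)
loo = np.array([fit_coefs(np.delete(X, i, axis=0), np.delete(y, i)) for i in range(n)])
jack_est = n * full - (n - 1) * loo.mean(axis=0)                       # bias-corrected estimate
jack_se = np.sqrt((n - 1) / n * ((loo - loo.mean(axis=0)) ** 2).sum(axis=0))

print("jackknife coefficient estimates:", np.round(jack_est, 3))
print("jackknife standard errors:      ", np.round(jack_se, 3))
```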
In this research we study the synodic month of the Moon and its relationship with the mean anomaly of the Moon's orbit and the A.D. date over a long period of time (100 years). We designed a computer program that calculates the lengths of the synodic months and the coordinates of the Moon at the moment of the new moon with high accuracy. During the 100 years there are 1236 synodic months. We found that when the new moon occurs near perigee (mean anomaly = 0°), the length of the synodic month is at a minimum. Similarly, when the new moon occurs near apogee (mean anomaly = 180°), the length of the synodic month reaches a maximum. The shortest synodic month occurred on 2053/1/16 and lasted (29.27436) days. The longest synodic month began on 2008/11/27
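The paper's own program is not shown in the abstract; the sketch below only illustrates how synodic-month lengths and their minimum and maximum could be extracted from a list of consecutive new-moon epochs (the Julian-date values are placeholders, not the paper's results).

```python
import numpy as np

# Placeholder Julian dates of consecutive new moons (illustrative values only).
new_moon_jd = np.array([2454431.574, 2454461.104, 2454490.605, 2454520.071])

# Each synodic month is the interval between successive new moons.
synodic_lengths = np.diff(new_moon_jd)

print("synodic month lengths (days):", np.round(synodic_lengths, 5))
print(f"shortest: {synodic_lengths.min():.5f} d, longest: {synodic_lengths.max():.5f} d")
print(f"mean over this sample: {synodic_lengths.mean():.5f} d")
```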
The accounting unit is looked upon as an entity established for the purpose of achieving its goals and programmes for an unlimited time, unless something else takes place, such as liquidation, whether voluntary or mandatory. Thus the going-concern assumption is considered the logical foundation on which the familiar accounting principles are based. The future of a company reflects its financial status and position and the extent of its ability to face future events. Hence the success and continuity of its activities depend on the extent of the company's ability to generate profits and to retain appropriate liquidity to service its debts.
Therefore the financial statements of the company are considered to be on
Internet image retrieval is an interesting task that needs efforts from image processing and relationship-structure analysis. In this paper, a compression-oriented method is proposed for when more than one photo needs to be sent via the Internet, based on image retrieval. First, face detection is implemented based on local binary patterns. The background is identified based on matching global self-similarities and is compared with the backgrounds of the rest of the images. The proposed algorithm bridges the gap between present image-indexing technology, developed in the pixel domain, and the fact that an increasing number of images stored on the computer are already compressed by JPEG at the source. The similar images are found and only a few images are sent instead
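The full pipeline (LBP-based face detection, background self-similarity matching, and JPEG-domain indexing) cannot be reconstructed from the abstract; the sketch below only shows the local-binary-pattern histogram comparison that such retrieval typically relies on, assuming scikit-image is available (parameter values and test images are illustrative).

```python
import numpy as np
from skimage import data, img_as_ubyte
from skimage.feature import local_binary_pattern

def lbp_histogram(gray, points=8, radius=1):
    """Uniform LBP histogram used as a compact texture descriptor."""
    lbp = local_binary_pattern(gray, points, radius, method="uniform")
    n_bins = points + 2                      # uniform patterns plus one "non-uniform" bin
    hist, _ = np.histogram(lbp, bins=n_bins, range=(0, n_bins), density=True)
    return hist

# Illustrative similarity check between two built-in grayscale images.
h1 = lbp_histogram(img_as_ubyte(data.camera()))
h2 = lbp_histogram(img_as_ubyte(data.coins()))
similarity = np.minimum(h1, h2).sum()        # histogram intersection in [0, 1]
print(f"LBP histogram intersection: {similarity:.3f}")
```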