Steganography is a means of hiding information within a more obvious form of communication. It exploits host data to hide a piece of information in such a way that it is imperceptible to a human observer. The major goals of effective steganography are high embedding capacity, imperceptibility, and robustness. This paper introduces a scheme for hiding secret images that can amount to as much as 25% of the host image data. The proposed algorithm applies the orthogonal discrete cosine transform (DCT) to the host image. A scaling factor (a) in the frequency domain controls the quality of the stego images. Experimental results on secret-image recovery after applying JPEG coding to the stego images are included.
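The abstract does not give the embedding rule itself; the following is a minimal sketch of additive embedding in the orthogonal DCT domain, where a smaller scaling factor a yields a higher-quality stego image. The coefficient placement (the high-frequency corner) and the non-blind extraction (requiring the original host) are illustrative assumptions, not the paper's method.

```python
import numpy as np
from scipy.fft import dct, idct

def dct2(x):
    """2-D orthogonal DCT (type II), applied along rows then columns."""
    return dct(dct(x, axis=0, norm='ortho'), axis=1, norm='ortho')

def idct2(x):
    """Inverse 2-D orthogonal DCT."""
    return idct(idct(x, axis=0, norm='ortho'), axis=1, norm='ortho')

def embed(host, secret, a=0.05):
    """Additively embed a secret image into the host's DCT coefficients,
    scaled by factor a; smaller a improves stego quality."""
    H = dct2(host.astype(float))
    h, w = secret.shape
    H[-h:, -w:] += a * secret        # high-frequency placement (assumption)
    return idct2(H)

def extract(stego, host, secret_shape, a=0.05):
    """Non-blind recovery: subtract the host's coefficients and rescale."""
    D = dct2(stego.astype(float)) - dct2(host.astype(float))
    h, w = secret_shape
    return D[-h:, -w:] / a
```

A secret of half the host's width and height occupies 25% of the host's pixel count, matching the capacity claimed above.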
Perchloroethylene (PERC) is commonly used as a dry-cleaning solvent and has been linked to many deleterious effects in biological systems. The study aimed to investigate the harmful effects associated with PERC exposure among dry-cleaning workers. It was carried out on 58 adults in two groups: a PERC-exposed group of thirty-two male dry-cleaning workers who use PERC as a dry-cleaning solvent, and twenty-six healthy non-exposed subjects. The history of PERC exposure, use of personal protective equipment (PPE), and safety measures of the exposed group were recorded. A blood sample was taken from each participant for measurement of hematological markers and liver and kidney function tests. The results showed that 28.1% of the workers were using …
This paper presents a data-centric technique, data aggregation via a modified fuzzy clustering algorithm with a Voronoi diagram, called the modified Voronoi Fuzzy Clustering Algorithm (VFCA). In the modified algorithm, the sensed area is divided into a number of Voronoi cells by applying a Voronoi diagram, and these cells are clustered by the fuzzy C-means (FCM) method to reduce the transmission distance. An appropriate cluster head (CH) is then elected for each cluster. Three parameters are used in this election: energy, the distance between the CH and its neighboring sensors, and packet-loss values. Furthermore, data aggregation is employed at each CH to reduce the amount of data transmitted, which …
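The abstract lists the three election criteria but not how they are combined; a minimal sketch, assuming a normalised weighted score with illustrative weights, is:

```python
import numpy as np

def elect_cluster_head(positions, energy, packet_loss, weights=(0.5, 0.3, 0.2)):
    """Pick a cluster head: favour high residual energy, small distance to
    the other sensors in the cluster, and low packet loss.  The weights and
    the score form are assumptions; the paper only names the criteria."""
    # mean distance from each node to every other node in the cluster
    d = np.linalg.norm(positions[:, None, :] - positions[None, :, :], axis=2)
    mean_dist = d.sum(axis=1) / max(len(positions) - 1, 1)
    # normalise every criterion to [0, 1], then combine
    e = energy / energy.max()
    s = 1 - mean_dist / mean_dist.max()
    p = 1 - packet_loss / packet_loss.max()
    score = weights[0] * e + weights[1] * s + weights[2] * p
    return int(np.argmax(score))
```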
Quality control is an effective statistical tool in the field of production control, used to monitor manufactured products and confirm that they meet the standard qualities and certified criteria for products and services; its main purpose is to keep pace with production and industrial development in a competitive business market. Quality control charts are used to monitor the qualitative properties of production processes and to detect abnormal deviations in them. The multivariate kernel density estimator control chart method was used, a nonparametric method that does not require any assumptions regarding the distribution …
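The abstract leaves the chart construction unstated; a common nonparametric construction, which this sketch assumes, estimates the in-control density with a multivariate Gaussian KDE and signals points whose density falls below a low quantile:

```python
import numpy as np
from scipy.stats import gaussian_kde

def kde_control_chart(phase1, phase2, alpha=0.0027):
    """Nonparametric multivariate control chart: fit a Gaussian kernel
    density estimator to in-control (Phase I) observations, take the
    alpha-quantile of their fitted densities as the lower control limit,
    and flag any Phase II point whose density falls below it.
    alpha = 0.0027 mirrors the usual 3-sigma false-alarm rate."""
    kde = gaussian_kde(phase1.T)          # gaussian_kde expects shape (d, n)
    lcl = np.quantile(kde(phase1.T), alpha)
    return kde(phase2.T) < lcl            # True = out-of-control signal
```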
Spraying pesticides is one of the most common procedures conducted to control pests. However, excessive use of these chemicals adversely affects the surrounding environment, including the soil, plants, animals, and the operators themselves. Therefore, researchers have been encouraged to …
In this research two algorithms are applied, fuzzy C-means (FCM) and hard K-means (HKM), to determine which performs better. The two algorithms are applied to a data set collected from the Ministry of Planning on the water turbidity of five areas in Baghdad, to identify which of these areas has the least turbid (clearest) water and which months of the year show the least turbidity in the specified area.
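For reference, minimal sketches of the two algorithms in their conventional textbook forms (not necessarily the exact variants used in the research), with observations as rows of X:

```python
import numpy as np

def fcm(X, c, m=2.0, iters=100, tol=1e-5, seed=0):
    """Minimal fuzzy C-means: alternate centre and membership updates
    until the membership matrix stops changing."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(iters):
        Um = U ** m
        centres = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2) + 1e-12
        w = d ** (-2.0 / (m - 1))
        U_new = w / w.sum(axis=1, keepdims=True)
        if np.abs(U_new - U).max() < tol:
            return centres, U_new
        U = U_new
    return centres, U

def hkm(X, c, iters=100, seed=0):
    """Hard K-means: assign each point to its nearest centre,
    then recompute centres as cluster means."""
    rng = np.random.default_rng(seed)
    centres = X[rng.choice(len(X), c, replace=False)]
    for _ in range(iters):
        labels = np.linalg.norm(X[:, None, :] - centres[None, :, :],
                                axis=2).argmin(axis=1)
        centres = np.array([X[labels == k].mean(axis=0) if (labels == k).any()
                            else centres[k] for k in range(c)])
    return centres, labels
```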
In the present study, twenty samples of human urine were taken from healthy males and females of different ages, occupations, and places of residence. These samples were collected from the hospital to measure the concentration of radon gas in human urine using the LR-115 solid-state nuclear track detector. The measured radon concentrations in healthy human urine vary from 2.12×10⁻³ Bq·l⁻¹ to 4.42×10⁻³ Bq·l⁻¹; these values are below the allowed limit of 12.3×10⁻³ Bq·l⁻¹.
The study of statistical distributions aims to obtain the best description of sets of variable phenomena, each of which follows the behavior of one of those distributions. Estimation for those distributions is an essential part of studying variable behavior that cannot be omitted. This research is therefore an attempt to reach the best method for estimating the distribution in question, the generalized linear failure rate distribution, by studying its theoretical aspects using estimation methods such as maximum likelihood, least squares, and a mixing method (the suggested method). The research …
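As an illustration of the maximum-likelihood route, a minimal sketch assuming the Sarhan-Kundu parameterisation of the generalized linear failure rate distribution (the abstract does not give the paper's exact parameterisation):

```python
import numpy as np
from scipy.optimize import minimize

def glfr_neg_loglik(params, x):
    """Negative log-likelihood of the generalized linear failure rate
    distribution in the Sarhan-Kundu form (an assumption):
        F(x) = (1 - exp(-(a*x + b*x**2 / 2)))**theta,  a, b, theta > 0."""
    a, b, theta = params
    if a <= 0 or b <= 0 or theta <= 0:
        return np.inf
    g = a * x + b * x**2 / 2
    log_f = (np.log(theta) + np.log(a + b * x) - g
             + (theta - 1) * np.log1p(-np.exp(-g)))
    return -log_f.sum()

# maximum likelihood fit; least squares and the proposed mixing method
# would swap in a different objective over the same parameters
x = np.random.default_rng(0).weibull(1.5, size=200)   # placeholder data
fit = minimize(glfr_neg_loglik, x0=[1.0, 1.0, 1.0], args=(x,),
               method='Nelder-Mead')
a_hat, b_hat, theta_hat = fit.x
```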
Logistic regression is considered one of the statistical methods used to describe and estimate the relationship between a response variable (Y) and explanatory variables (X), and, unlike linear regression, it does not require assumptions such as homogeneity of variance. The dependent variable is a binary response taking two values (one when a specific event occurs and zero when it does not), such as (injured and uninjured, married and unmarried). A large number of explanatory variables leads to the problem of multicollinearity, which makes the estimates inaccurate. The maximum likelihood method and the ridge regression method were used to estimate a binary-response logistic regression model by adopting the Jackknife …
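A minimal sketch of how ridge-penalised logistic regression and the jackknife can be combined, assuming a leave-one-out bias-corrected estimator; the penalty strength C and the library choice are illustrative:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def jackknife_ridge_logit(X, y, C=1.0):
    """Leave-one-out jackknife of ridge-penalised (L2) logistic regression
    coefficients; the ridge penalty counters multicollinearity among the
    explanatory variables, and the jackknife corrects the bias."""
    n = len(y)
    loo = np.empty((n, X.shape[1]))
    for i in range(n):
        keep = np.arange(n) != i                     # drop observation i
        loo[i] = LogisticRegression(C=C).fit(X[keep], y[keep]).coef_[0]
    full = LogisticRegression(C=C).fit(X, y).coef_[0]
    return n * full - (n - 1) * loo.mean(axis=0)     # jackknife estimator
```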