The aim of this study is to determine the flexural properties of a composite of steel fiber, as a metal fiber reinforcement, and polyester resin as the matrix. The steel fibers were added to the polyester resin at fiber volume fractions of 5, 10, and 15%, and with different fiber orientations: woven steel fiber type (0-45)° and woven steel fiber type (0-90)°. A hand lay-up process was used to produce the test specimens, with a curing time of 24 hr for the composite at room temperature. The results show that the flexural strength and flexural modulus for the 15 vol.% woven steel fiber composite of type (0-90)° are 210 MPa and 2.29 GPa, respectively. These results indicate that the woven steel fiber (0-90)° achieves better bonding between fiber and matrix than the woven steel fiber type (0-45)°.
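For reference, flexural strength and flexural modulus of this kind are conventionally obtained from a three-point bending test; assuming a rectangular bar of support span L, width b, and thickness d (none of which are stated in the abstract), the standard relations are

\sigma_f = \frac{3 P L}{2 b d^{2}}, \qquad E_f = \frac{L^{3} m}{4 b d^{3}},

where P is the maximum load and m is the slope of the initial straight-line portion of the load-deflection curve. The abstract does not confirm which test configuration the authors used.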
The study aims at showing the role of tax audit in improving the quality of tax statements. Tax audit is one of the most important means used by tax administrations to identify taxable revenues in a just and fair manner. The quality of statements relies on the extent to which the information provided by taxpayers is true and accurate. Tax audit is compatible with the strategy of increasing tax adherence, detecting cases of non-adherence, and penalizing those who commit such violations. The study reached a number of results and conclusions. One of the most important results is that tax audit helps improve the information content of taxpayers' tax statements, which leads to recalculating taxable incomes and re-fixing the tax accordingly.
In this paper, an algorithm that can embed more data than regular spatial-domain methods is introduced. The secret data are compressed using Huffman coding, and the compressed data are then embedded using a Laplacian sharpening method.
Laplace filters are used to determine the effective hiding places; based on a threshold value, the places with the highest filter responses are selected for embedding the watermark. The aim of this work is to increase the capacity of the embedded information by using Huffman coding and, at the same time, to increase the security of the algorithm by hiding the data in the places with the highest edge values, where changes are least noticeable.
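As a rough sketch of the pipeline described above (a minimal illustration, not the paper's exact procedure): the secret is Huffman-compressed, a Laplacian filter gives an edge-strength map of the cover image, and the compressed bits are written into pixels whose response exceeds the threshold. The LSB-substitution step, the SciPy laplace call, and all function names are assumptions for illustration.

import heapq
from collections import Counter

import numpy as np
from scipy.ndimage import laplace  # discrete Laplacian filter


def huffman_encode(data: bytes) -> str:
    """Return the Huffman bit string for `data` (codebook handling omitted)."""
    freq = Counter(data)
    heap = [[weight, idx, [sym, ""]] for idx, (sym, weight) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:                       # degenerate one-symbol secret
        return "0" * len(data)
    nxt = len(heap)
    while len(heap) > 1:
        lo, hi = heapq.heappop(heap), heapq.heappop(heap)
        for pair in lo[2:]:
            pair[1] = "0" + pair[1]
        for pair in hi[2:]:
            pair[1] = "1" + pair[1]
        heapq.heappush(heap, [lo[0] + hi[0], nxt] + lo[2:] + hi[2:])
        nxt += 1
    codes = dict(tuple(pair) for pair in heap[0][2:])
    return "".join(codes[b] for b in data)


def embed(cover: np.ndarray, secret: bytes, threshold: float) -> np.ndarray:
    """Embed the Huffman-compressed secret in the LSBs of high-Laplacian pixels.

    The LSB rule is an assumed stand-in for the paper's unspecified embedding step.
    """
    bits = huffman_encode(secret)
    edge_strength = np.abs(laplace(cover.astype(float)))   # Laplacian edge map
    rows, cols = np.where(edge_strength > threshold)       # candidate hiding places
    if len(rows) < len(bits):
        raise ValueError("not enough strong-edge pixels for this payload")
    stego = cover.copy()
    for bit, r, c in zip(bits, rows, cols):
        stego[r, c] = (int(stego[r, c]) & 0xFE) | int(bit)  # LSB substitution
    return stego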
The study was conducted on twenty dogs of various breeds to estimate the incidence of tumor masses and to determine the risk factors among cases surveyed over one year at a veterinary hospital in Baghdad. The most common clinical signs were ulceration and bleeding into the lesions, in addition to drowsiness, anorexia, and fever; other signs depended on the tumor's location in the dog's body, such as lameness, lacrimation, and bloody constipation.
The results showed that 70% of the affected dogs were working with military forces and 30% were pet dogs. The highest percentage of tumors occurred in dogs aged more than 10 years, and females accounted for 60% of the cases. The Terrier breed had the highest percentage of infection.
Data hiding (steganography) is a method used for data security and for protecting data during transmission. Steganography hides the communication between two parties by embedding a secret message inside another cover (audio, text, image, or video). In this paper, a new text steganography method is proposed that is based on a parser and the ASCII codes of non-printed characters to hide the secret information in an English cover text, after coding the secret message and compressing it with a modified Run-Length Encoding (RLE) method. The proposed method achieves a high steganographic capacity ratio (five times the cover text length) compared with other methods, and provides a transparency of 1.0.
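A minimal sketch of the general idea, under assumptions the abstract does not confirm: the secret is run-length encoded (a plain RLE stands in for the paper's modified RLE), converted to bits, and one invisible character per bit is appended after successive words of the cover text. Zero-width Unicode characters stand in for the non-printed characters; the parser is reduced to splitting on spaces, and all names are illustrative only.

ZERO = "\u200b"   # zero-width space: hides a '0' bit (assumed mapping)
ONE = "\u200c"    # zero-width non-joiner: hides a '1' bit (assumed mapping)


def rle_encode(text: str) -> str:
    """Plain run-length encoding, e.g. 'aaab' -> 'a3b1' (stand-in for the modified RLE)."""
    out, i = [], 0
    while i < len(text):
        j = i
        while j < len(text) and text[j] == text[i]:
            j += 1
        out.append(f"{text[i]}{j - i}")
        i = j
    return "".join(out)


def embed(cover: str, secret: str) -> str:
    """Hide one secret bit, as an invisible character, after each word of the cover."""
    bits = "".join(f"{ord(ch):08b}" for ch in rle_encode(secret))
    words = cover.split(" ")            # trivial "parser": split on word boundaries
    if len(bits) > len(words):
        raise ValueError("cover text too short for this payload")
    stego_words = []
    for k, word in enumerate(words):
        if k < len(bits):
            word += ZERO if bits[k] == "0" else ONE
        stego_words.append(word)
    return " ".join(stego_words)

This sketch does not reproduce the reported five-fold capacity ratio; it only illustrates the compress-then-hide structure.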
Because of the threats and attacks against databases during transmission from sender to receiver, which are among the most serious security concerns of network users, a lightweight cryptosystem using the Rivest Cipher 4 (RC4) algorithm is proposed. This cryptosystem maintains data privacy by encrypting the data into cipher form, transferring it over the network, and then decrypting it back to the original data. Hence, the ciphers represent an encapsulating system for database tables.
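For context, standard RC4 key scheduling and keystream generation look as follows; this shows how such a lightweight stream cipher can encrypt a table field, but the paper's exact variant and key management are not described in the abstract, and the example key and value are illustrative.

def rc4(key: bytes, data: bytes) -> bytes:
    # Key-scheduling algorithm (KSA)
    S = list(range(256))
    j = 0
    for i in range(256):
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    # Pseudo-random generation algorithm (PRGA): XOR the keystream with the data,
    # so the same call both encrypts and decrypts.
    out = bytearray()
    i = j = 0
    for byte in data:
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(byte ^ S[(S[i] + S[j]) % 256])
    return bytes(out)


# Example: encrypt one field of a database row, then recover it.
cipher = rc4(b"secret-key", "Baghdad".encode())
plain = rc4(b"secret-key", cipher).decode()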
The problem of noise at Baghdad airport is examined in this study; noise measurements and survey studies were carried out at four high-noise-level zones (operation, training and development, quality system, and information and technology) located in this area. Noise exposure is a common hazard for the workforce in general, although to varying degrees depending on the occupation, as many workers are exposed for long periods of time to potentially hazardous noise. A questionnaire was completed by 122 workers during this study in order to determine the physical, physiological, and psycho-social impacts of noise on the workers and to specify what measures have been taken by both employers and workers for protection.
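Occupational noise exposure of this kind is usually summarized by the A-weighted equivalent continuous sound level over the measurement period T; the abstract does not state which metric was used, but the standard definition is

L_{Aeq,T} = 10 \log_{10}\!\left( \frac{1}{T} \int_0^T \frac{p_A^{2}(t)}{p_0^{2}}\, dt \right) \ \text{dB}, \qquad p_0 = 20\ \mu\text{Pa},

where p_A(t) is the A-weighted instantaneous sound pressure and p_0 the reference pressure.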
In this paper, a preliminary-test shrinkage estimator is considered for estimating the shape parameter α of the Pareto distribution when the scale parameter equals the smallest loss and a prior estimate α0 of α is available as an initial value from past experience or from similar cases. The proposed estimator is shown to have a smaller mean squared error in a region around α0 when compared with the usual and existing estimators.
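The abstract does not give the estimator's exact form. A generic preliminary-test shrinkage estimator of this type, built on the usual maximum-likelihood estimate of the Pareto shape parameter with the scale taken as the smallest observation x_{(1)} (an assumption about the paper's setup), can be written as

\hat{\alpha} = \frac{n}{\sum_{i=1}^{n} \ln\!\left(x_i / x_{(1)}\right)}, \qquad
\hat{\alpha}_{PT} =
\begin{cases}
\alpha_0 + k\,(\hat{\alpha} - \alpha_0), & \text{if the preliminary test does not reject } H_0 : \alpha = \alpha_0,\\
\hat{\alpha}, & \text{otherwise},
\end{cases}
\qquad 0 \le k \le 1,

where k is the shrinkage weight toward the prior estimate \alpha_0; the specific test statistic and weight used in the paper are not stated in the abstract.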