This investigation aims to study some properties of lightweight aggregate concrete reinforced by mono or hybrid fibers of different sizes and types. The lightweight aggregate considered in this research was Light Expanded Clay Aggregate, while the adopted fibers included hooked, straight, polypropylene, and glass fibers. Eleven lightweight concrete mixes were considered: one plain concrete mix (without fibers), two mixes reinforced with a mono fiber (hooked or straight fibers), six mixes reinforced with double hybrid fibers, and two mixes reinforced with triple hybrid fibers. The hardened concrete properties were investigated in this study.
In this work, a magnetic switch was prepared using two types of ferrofluid materials: pure ferrofluid and ferrofluid doped with copper nanoparticles (10 nm). The critical magnetic field (Hc) and the magnetic saturation state (Hs) were studied using three types of laser sources. With the He-Ne laser source, the main parameters of the magnetic switch were Hc (0.5 mV, 0.4 G) and Hs (8.5 mV, 3 G) for the pure ferrofluid, and Hc (1 mV, 4 G) and Hs (15 mV, 9.6 G) for the ferrofluid doped with copper nanoparticles. With the green semiconductor laser, the pure ferrofluid gave Hc (0.5 mV, 0.3 G) and Hs (15 mV, 2.9 G), while the ferrofluid doped with copper nanoparticles gave Hc (0.5 mV, 1 G) and Hs (12 mV, 2.8 G), and with the violet semiconductor laser …
In this study, the quality assurance of the linear accelerator available at the Baghdad Center for Radiation Therapy and Nuclear Medicine was verified using Star Track and Perspex. The study was conducted from August to December 2018. It showed an acceptable variation in the dose output of the linear accelerator: the variation was ±2%, within the permissible range according to the recommendations of the accelerator's manufacturer (Elekta).
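As a rough illustration of the tolerance check described above, the following sketch computes the percent deviation of a measured dose output from a reference value and flags it against the ±2% limit. The reference dose and the readings are hypothetical placeholders, not the study's data.

```python
# Minimal sketch: percent deviation of measured linac dose output from the
# reference (calibration) value, flagged against the +/-2% tolerance cited
# above. All numbers below are hypothetical illustrations.

REFERENCE_DOSE_CGY = 100.0   # assumed expected output, e.g. 100 cGy per 100 MU
TOLERANCE_PERCENT = 2.0      # permissible variation per the manufacturer

def output_deviation(measured_cgy: float,
                     reference_cgy: float = REFERENCE_DOSE_CGY) -> float:
    """Return the percent deviation of a measured dose from the reference."""
    return 100.0 * (measured_cgy - reference_cgy) / reference_cgy

for reading in (99.1, 100.6, 101.8, 102.4):  # hypothetical readings, cGy
    dev = output_deviation(reading)
    status = "OK" if abs(dev) <= TOLERANCE_PERCENT else "OUT OF TOLERANCE"
    print(f"{reading:6.1f} cGy -> {dev:+.2f}%  {status}")
```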
Phenomena often suffer from disturbances in their data as well as difficulty of formulation, especially when the response is unclear or when many essential differences plague the experimental units from which the data were taken. Hence the need emerged to include an implicit rating of these experimental units, either through a discrimination method or by creating blocks of these experimental units, in the hope of controlling their responses and making them more homogeneous. With the development of computing, and following the principle of integrating sciences, it has been found that modern algorithms from computer science, such as the genetic algorithm or the ant colony algorithm, …
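As a hedged illustration of the blocking idea mentioned above, the sketch below uses a toy genetic algorithm to assign experimental units to blocks so that within-block variance of a covariate is minimized. The encoding, operators, and fitness function are all assumptions, since the abstract does not specify the study's actual algorithm.

```python
# Toy genetic algorithm: partition experimental units into blocks that
# minimize total within-block variance of a (simulated) covariate.
import random

UNITS = [random.gauss(50, 10) for _ in range(20)]  # hypothetical covariate values
N_BLOCKS, POP_SIZE, GENERATIONS = 4, 30, 200

def fitness(assignment):
    """Negative total within-block variance (higher is better)."""
    total = 0.0
    for b in range(N_BLOCKS):
        vals = [u for u, a in zip(UNITS, assignment) if a == b]
        if vals:
            mean = sum(vals) / len(vals)
            total += sum((v - mean) ** 2 for v in vals)
    return -total

def mutate(assignment, rate=0.1):
    return [random.randrange(N_BLOCKS) if random.random() < rate else a
            for a in assignment]

def crossover(p1, p2):
    cut = random.randrange(1, len(p1))
    return p1[:cut] + p2[cut:]

population = [[random.randrange(N_BLOCKS) for _ in UNITS] for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)   # best assignments first
    parents = population[: POP_SIZE // 2]
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP_SIZE - len(parents))]
    population = parents + children

best = max(population, key=fitness)
print("best blocking:", best, "fitness:", round(fitness(best), 2))
```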
Scheduling is considered one of the most fundamental and essential bases of project management. Several methods are used for project scheduling, such as CPM, PERT, and GERT. Since too many uncertainties are involved in estimating the duration and cost of activities, these methods lack the capability of modeling practical projects. Although schedules can be developed for construction projects at an early stage, there is always a possibility of unexpected material or technical shortages during the construction stage. The objective of this research is to build a fuzzy mathematical model including time-cost tradeoff and resource-constraint analysis, applied concurrently. The proposed model has been formulated using the fuzzy …
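One plausible ingredient of such a fuzzy scheduling model is triangular-fuzzy-number arithmetic for activity durations. The sketch below is illustrative only, under the assumption of triangular membership functions and centroid defuzzification; the paper's actual formulation is not given in the abstract, and the activities and durations are hypothetical.

```python
# Triangular fuzzy numbers for activity durations, a common ingredient of
# fuzzy CPM/time-cost models (illustrative assumption, not the paper's model).
from dataclasses import dataclass

@dataclass(frozen=True)
class TFN:
    """Triangular fuzzy number (low, mode, high)."""
    low: float
    mode: float
    high: float

    def __add__(self, other: "TFN") -> "TFN":
        # Fuzzy addition adds the three defining points component-wise.
        return TFN(self.low + other.low,
                   self.mode + other.mode,
                   self.high + other.high)

    def defuzzify(self) -> float:
        """Centroid defuzzification of a triangular number."""
        return (self.low + self.mode + self.high) / 3.0

# Hypothetical activities on one path: fuzzy durations in days.
excavation = TFN(4, 5, 7)
foundation = TFN(8, 10, 14)
framing = TFN(12, 15, 20)

path_duration = excavation + foundation + framing
print("fuzzy path duration:", path_duration)
print("crisp estimate (days):", round(path_duration.defuzzify(), 1))
```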
Logistic regression is considered one of the statistical methods used to describe and estimate the relationship between a random response variable (Y) and explanatory variables (X), where the dependent variable is a binary response taking two values (one when a specific event occurs and zero when it does not), such as (injured and uninjured, married and unmarried). A large number of explanatory variables leads to the problem of multicollinearity, which makes the estimates inaccurate. The maximum likelihood method and the ridge regression method were used in estimating a binary-response logistic regression model by adopting the jackknife …
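A minimal sketch of the combination described above, assuming a standard L2 (ridge) penalty via scikit-learn's LogisticRegression and a leave-one-out jackknife for bias and standard-error estimates. The data are simulated, and the study's exact estimators may differ in detail.

```python
# Ridge-penalized (L2) logistic regression with a leave-one-out jackknife
# for the coefficient estimates (simulated data, illustrative only).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n, p = 80, 5
X = rng.normal(size=(n, p))
beta_true = np.array([1.0, -0.5, 0.0, 0.8, 0.0])
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-X @ beta_true))).astype(int)

def fit_coefs(X, y):
    model = LogisticRegression(penalty="l2", C=1.0, max_iter=1000)
    return model.fit(X, y).coef_.ravel()

full = fit_coefs(X, y)
# Refit n times, each time leaving one observation out.
loo = np.array([fit_coefs(np.delete(X, i, axis=0), np.delete(y, i))
                for i in range(n)])

# Jackknife bias correction and standard errors.
jack_mean = loo.mean(axis=0)
bias = (n - 1) * (jack_mean - full)
se = np.sqrt((n - 1) / n * ((loo - jack_mean) ** 2).sum(axis=0))

print("ridge estimate :", np.round(full, 3))
print("jackknife bias :", np.round(bias, 3))
print("jackknife SE   :", np.round(se, 3))
```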
In this work, the shock-wave radius of the plasma plume (R) and the plasma speed (U) have been calculated theoretically using a MATLAB program.
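The abstract does not state the underlying model, so the sketch below assumes the Sedov-Taylor point-blast relation commonly applied to laser-produced plasma plumes, R(t) = λ(Et²/ρ₀)^{1/5} with U = dR/dt = 2R/(5t). All numerical values are illustrative assumptions, not the paper's inputs.

```python
# Assumed Sedov-Taylor blast-wave model for a laser-produced plasma plume:
#   R(t) = lam * (E * t**2 / rho0)**(1/5),  U = dR/dt = 2R / (5t).
import numpy as np

E = 0.1      # deposited laser energy, J (assumed)
rho0 = 1.2   # ambient gas density, kg/m^3 (air, assumed)
lam = 1.0    # dimensionless Sedov constant, order unity (assumed)

t = np.linspace(1e-8, 1e-6, 5)        # times after the pulse, s
R = lam * (E * t**2 / rho0) ** 0.2    # shock radius, m
U = 0.4 * R / t                       # shock speed, m/s (analytic dR/dt)

for ti, Ri, Ui in zip(t, R, U):
    print(f"t = {ti:.2e} s  R = {Ri*1e3:.3f} mm  U = {Ui/1e3:.2f} km/s")
```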
In this article, we developed a new loss function as a weighted form of the linear exponential (LINEX) loss function. We derive estimators of the scale parameter, the reliability function, and the hazard function based on upper record values of the Lomax distribution (LD). To study the small-sample performance of the proposed loss function using Monte Carlo simulation, we compare the maximum likelihood estimator, the Bayesian estimator under the LINEX loss function, and the Bayesian estimator under the squared error (SE) loss function. The results show that the modified method is the best for estimating the scale parameter, reliability, and hazard functions.
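For reference, here is a minimal Monte Carlo sketch using the standard (unweighted) LINEX loss L(d) = b(e^{ad} − ad − 1), with d the estimation error, to score a simple moment estimator of the Lomax scale. The paper's weighted LINEX variant and its record-value-based Bayes estimators are not reproduced; the estimator, sample sizes, and parameters below are illustrative assumptions.

```python
# Standard LINEX loss scored by Monte Carlo for a moment estimator of the
# Lomax scale parameter (known shape assumed; illustrative only).
import numpy as np

def linex_loss(estimate, truth, a=0.5, b=1.0):
    """LINEX loss: asymmetric, penalizes over/under-estimation differently."""
    d = estimate - truth
    return b * (np.exp(a * d) - a * d - 1.0)

rng = np.random.default_rng(1)
alpha_true, lam_true = 3.0, 2.0   # Lomax shape and scale (assumed)
n, reps = 50, 2000

losses = []
for _ in range(reps):
    u = rng.random(n)
    x = lam_true * (u ** (-1.0 / alpha_true) - 1.0)  # inverse-CDF Lomax draws
    # Moment estimator of the scale given known shape: E[X] = lam / (alpha - 1).
    lam_hat = x.mean() * (alpha_true - 1.0)
    losses.append(linex_loss(lam_hat, lam_true))

print(f"mean LINEX risk of the moment estimator: {np.mean(losses):.4f}")
```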