Average interstellar extinction curves for the Galaxy and the Large Magellanic Cloud (LMC) over the wavelength range 1100 Å – 3200 Å were obtained from observations with the IUE satellite. The two extinction curves, for our Galaxy and for the LMC, are normalized to A(V) = 0 and E(B-V) = 1 to meet the standard criteria. It is found that the differences between the two extinction curves appear most clearly in the middle and far ultraviolet regions, due to the presence of different populations of small grains, which contribute very little at longer wavelengths. Using the new IUE reduction techniques leads to more accurate results.
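Assuming the standard normalization convention (not stated explicitly in the abstract), the quantity being normalized is the colour excess relative to the V band scaled by E(B-V), which is zero at V and unity at B by construction:

```latex
% Assumed standard normalized extinction curve (not quoted from the paper):
% k(\lambda) equals 0 at V and 1 at B by construction.
k(\lambda) \;=\; \frac{E(\lambda - V)}{E(B-V)} \;=\; \frac{A(\lambda)-A(V)}{A(B)-A(V)},
\qquad k(V)=0, \quad k(B)=1 .
```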
The research aimed to test the relationship between the size of investment allocations in the agricultural sector in Iraq and their determinants, using the ordinary least squares (OLS) method compared with the error correction model (ECM) approach, on time series data for the period 1990–2021. The analysis showed that the estimates obtained with the ECM were more accurate and more significant than those obtained with OLS. Johansen's test indicated the presence of a long-term equilibrium relationship between the size of investment allocations and their determinants. The results of th…
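As a rough illustration of this kind of comparison (not the paper's actual data or specification), a static OLS fit, a Johansen cointegration test, and a two-step error-correction model can be set side by side with statsmodels; the series names (alloc, gdp_agr), the synthetic data, and all parameter values below are placeholders:

```python
# Illustrative sketch only: OLS vs. a two-step ECM on a synthetic cointegrated pair.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.tsa.vector_ar.vecm import coint_johansen

rng = np.random.default_rng(0)
n = 32                                                   # 1990-2021 gives 32 annual points
gdp_agr = np.cumsum(rng.normal(1.0, 0.5, n))             # I(1) driver series (placeholder)
alloc = 2.0 + 0.8 * gdp_agr + rng.normal(0, 0.3, n)      # cointegrated response (placeholder)
df = pd.DataFrame({"alloc": alloc, "gdp_agr": gdp_agr})

# (1) static OLS in levels
ols = sm.OLS(df["alloc"], sm.add_constant(df["gdp_agr"])).fit()

# (2) Johansen test for a long-run equilibrium relationship
joh = coint_johansen(df[["alloc", "gdp_agr"]], det_order=0, k_ar_diff=1)
print("trace statistics:", joh.lr1, "95% critical values:", joh.cvt[:, 1])

# (3) two-step ECM: lagged residual from (1) enters the regression in differences
ect = ols.resid.shift(1).rename("ect")                   # error-correction term
d = df.diff()
ecm_X = sm.add_constant(pd.concat([d["gdp_agr"], ect], axis=1).dropna())
ecm_y = d["alloc"].loc[ecm_X.index]
ecm = sm.OLS(ecm_y, ecm_X).fit()

print(ols.summary().tables[1])
print(ecm.summary().tables[1])                           # ect coefficient = adjustment speed
```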
In this paper, a Monte Carlo simulation technique is used to compare the performance of the standard Bayes estimators of the reliability function of the one-parameter exponential distribution. Three types of loss functions are adopted, namely the squared error loss function (SELF), the precautionary error loss function (PELF), and the linear exponential loss function (LINEX), with informative and non-informative priors. The integrated mean square error (IMSE) criterion is employed to assess the performance of these estimators.
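A minimal sketch of such a simulation, assuming a conjugate gamma prior on the exponential rate and posterior sampling to evaluate the three Bayes rules; the prior parameters, sample size, LINEX constant, and time grid are illustrative and are not taken from the paper:

```python
# Illustrative sketch: Bayes estimation of the exponential reliability R(t) = exp(-theta*t)
# under SELF, PELF and LINEX losses with a gamma prior, compared by IMSE.
import numpy as np

rng = np.random.default_rng(1)
theta_true, n, reps = 1.5, 30, 1000           # true rate, sample size, MC replications (illustrative)
a0, b0, c = 2.0, 1.0, 1.0                     # gamma prior (shape, rate) and LINEX constant (illustrative)
t_grid = np.linspace(0.1, 2.0, 20)            # grid over which the squared error is integrated
R_true = np.exp(-theta_true * t_grid)

sse = {"SELF": 0.0, "PELF": 0.0, "LINEX": 0.0}
for _ in range(reps):
    x = rng.exponential(scale=1.0 / theta_true, size=n)
    a_post, b_post = a0 + n, b0 + x.sum()                 # conjugate gamma posterior for theta
    theta_post = rng.gamma(a_post, 1.0 / b_post, size=5000)
    R_post = np.exp(-np.outer(theta_post, t_grid))        # posterior draws of R(t)

    est = {
        "SELF": R_post.mean(axis=0),                      # posterior mean
        "PELF": np.sqrt((R_post ** 2).mean(axis=0)),      # sqrt of posterior second moment
        "LINEX": -np.log(np.exp(-c * R_post).mean(axis=0)) / c,
    }
    for name, r_hat in est.items():
        sse[name] += np.mean((r_hat - R_true) ** 2)

imse = {name: total / reps for name, total in sse.items()}
print(imse)                                               # smaller IMSE -> better estimator
```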
Some experiments require assessing how useful they are in order to decide whether to continue providing them. This is done through the fuzzy regression discontinuity model, where the Epanechnikov kernel and the triangular kernel were used to estimate the model, generating data from a Monte Carlo experiment and comparing the results obtained. It was found that the Epanechnikov kernel has the smallest mean squared error.
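To make the kernel comparison concrete, the sketch below runs a kernel-weighted local-linear regression-discontinuity estimate under each kernel on simulated data and compares mean squared errors; for brevity it uses a sharp rather than fuzzy design, and the cutoff, bandwidth, effect size, and data-generating process are all illustrative assumptions:

```python
# Illustrative sketch: local-linear RDD effect under Epanechnikov vs. triangular kernels.
import numpy as np

rng = np.random.default_rng(2)
cutoff, h, tau_true, reps, n = 0.0, 0.5, 2.0, 500, 400    # illustrative settings

def kern(u, kind):
    u = np.clip(np.abs(u), 0.0, 1.0)
    return 0.75 * (1 - u ** 2) if kind == "epanechnikov" else (1 - u)

def rdd_effect(x, y, kind):
    """Weighted local-linear fit on each side of the cutoff; effect = jump at the cutoff."""
    fit = {}
    for side, mask in (("right", x >= cutoff), ("left", x < cutoff)):
        w = kern((x[mask] - cutoff) / h, kind)
        X = np.column_stack([np.ones(mask.sum()), x[mask] - cutoff])
        W = np.diag(w)
        beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y[mask])
        fit[side] = beta[0]                               # intercept = fitted value at the cutoff
    return fit["right"] - fit["left"]

mse = {"epanechnikov": 0.0, "triangular": 0.0}
for _ in range(reps):
    x = rng.uniform(-1, 1, n)
    y = 1.0 + 0.8 * x + tau_true * (x >= cutoff) + rng.normal(0, 1, n)
    for kind in mse:
        mse[kind] += (rdd_effect(x, y, kind) - tau_true) ** 2 / reps

print(mse)                                                # the kernel with the smaller MSE is preferred
```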
In recent years, the migration of computational workloads to computational clouds has attracted intruders to target and exploit cloud networks both internally and externally. Investigating such hazardous network attacks in the cloud requires comprehensive network forensics methods (NFMs) to identify the source of the attack. However, cloud computing lacks NFMs that can identify the network attacks which affect various cloud resources by disseminating through cloud networks. This paper is motivated by the need to determine the applicability of current network forensics methods (C-NFMs) to cloud networks in cloud computing. The applicability is evaluated on the basis of strengths, weaknesses, opportunities, and threats (SWOT) to assess the outlook for the cloud network.
In this paper, the experimentally obtained conditions for fusion splicing with photonic crystal fibers (PCFs) having large mode areas are reported. The physical mechanism of the splice loss and the microhole-collapse property of the photonic crystal fiber (PCF) were studied. By controlling the arc power and the arc time of a conventional electric-arc fusion splicer (FSM-60S), the minimum splicing loss for fusing two conventional single-mode fibers (SMF-28), which have similar mode-field diameters, was 0.00 dB. For splicing a PCF (LMA-10) to a conventional single-mode fiber (SMF-28), the loss increased due to the mode-field mismatch.
The hydraulic behavior of the flow in open channels can be changed by using large-scale geometric roughness elements. This change can help control erosion and sedimentation along the main stream of the channel. Roughness elements can be large stone or concrete blocks placed on the channel bed to impose more resistance at the bed. The geometry of the roughness elements, the number used, and their configuration are parameters that can affect the hydraulic characteristics of the flow. In this paper, the velocity distribution along the flume was investigated theoretically using a series of tests of T-shaped roughness elements of fixed height, arranged in three different configurations that differ in the number of lines of roughness elements…
The current study aims to compare the estimates of the Rasch model's parameters for missing and complete data under various ways of processing the missing data. To achieve this aim, the researcher followed these steps: preparing the Philip Carter test of spatial ability, which consists of (20) items, for a group of (250) sixth scientific stage students in the Baghdad Education Directorates of Al-Rusafa (1st, 2nd and 3rd) for the academic year (2018-2019). The researcher then relied on a single-parameter model to analyze the data and used the Bilog-MG3 program to check the hypotheses and data and to match them with the model. In addition…
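A minimal sketch of the kind of comparison involved, fitting a one-parameter (Rasch) model by joint maximum likelihood on a 0/1 response matrix that may contain missing entries, so item-difficulty estimates can be compared between complete and incomplete data; this is not the Bilog-MG3 procedure, and the sample sizes, missingness rate, and step size are illustrative assumptions:

```python
# Illustrative sketch: Rasch (1PL) item difficulties estimated by joint maximum likelihood,
# with missing responses (np.nan) simply excluded from the likelihood.
import numpy as np

rng = np.random.default_rng(3)
n_persons, n_items = 250, 20                              # sizes mirror the study's scale, data are synthetic
theta_true = rng.normal(0, 1, n_persons)
b_true = rng.normal(0, 1, n_items)
p = 1.0 / (1.0 + np.exp(-(theta_true[:, None] - b_true[None, :])))
X = (rng.uniform(size=p.shape) < p).astype(float)
X[rng.uniform(size=X.shape) < 0.10] = np.nan              # 10% missing at random (illustrative)

def fit_rasch(X, iters=500, lr=0.3):
    obs = ~np.isnan(X)                                    # missing cells are skipped in the gradient
    Xf = np.where(obs, X, 0.0)
    theta = np.zeros(X.shape[0])
    b = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))
        resid = (Xf - p) * obs                            # observed-cell residuals only
        theta += lr * resid.sum(axis=1) / obs.sum(axis=1) # ascent step for abilities
        b -= lr * resid.sum(axis=0) / obs.sum(axis=0)     # ascent step for difficulties
        b -= b.mean()                                     # identification: centre the difficulties
    return theta, b

theta_hat, b_hat = fit_rasch(X)
print(np.corrcoef(b_hat, b_true)[0, 1])                   # recovery of item difficulties despite missingness
```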