In this paper, experimentally obtained conditions for fusion splicing of photonic crystal fibers (PCFs) with large mode areas are reported. The physical mechanism of splice loss and the microhole-collapse behavior of the PCF were studied. By controlling the arc power and arc time of a conventional electric-arc fusion splicer (FSM-60S), a minimum splice loss of 0.00 dB was obtained for fusing two conventional single-mode fibers (SMF-28), which have similar mode field diameters. For splicing a PCF (LMA-10) to a conventional single-mode fiber (SMF-28), the loss increased due to mode-field mismatch.
This study was initiated to examine tomato-infecting viruses of the genera Tobamovirus and Potexvirus in Iraq. Field observations and surveys were carried out over three successive cropping seasons (2020/21 to 2022/23) in selected tomato production areas. The purpose was to identify the main viruses associated with tomato epidemics and to assess the impact of different tomato cultivars on disease occurrence. A total of 700 tomato leaf samples were collected from seven governorates (Baghdad, Diyala, Babylon, Najaf, Kerbala, Nasiriya, and Basrah) and tested using pathogen-specific immunostrip kits. The survey confirmed the presence of Tomato brown rugose fruit virus (ToBRFV), Tobacco mosaic virus (TMV), Pepper mild mottle virus (
The aim of this paper is to use a single-index model in developing and adjusting the Fama-MacBeth approach. This adjustment was estimated by a penalized smoothing-spline regression technique (SIMPLS). Two generalized cross-validation techniques, Generalized Cross Validation Grid (GGCV) and Generalized Cross Validation Fast (FGCV), were used to select the value of the smoothing parameter under this technique. Owing to the two-step nature of the Fama-MacBeth model, this estimation generated four estimators: SIMPLS(FGCV)-SIMPLS(FGCV), SIMPLS(FGCV)-SIMPLS(GGCV), SIMPLS(GGCV)-SIMPLS(FGCV), and SIMPLS(GGCV)-SIMPLS(GGCV). The three-factor Fama-French model (market risk premium, size factor, and value factor) and their implication for excess stock returns and portfolio return
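The classical two-step Fama-MacBeth procedure that the paper adjusts can be sketched in a few lines. This is a minimal, stdlib-only illustration with a single factor and ordinary least squares in both passes; the data, asset names, and true loadings are invented, and the paper's penalized single-index variant is not reproduced here:

```python
import random

def ols(x, y):
    """Simple OLS of y on x with an intercept; returns (intercept, slope)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return my - slope * mx, slope

def fama_macbeth(returns, factor):
    """returns: dict asset -> list of T excess returns; factor: list of T factor values."""
    # Pass 1: time-series regression per asset to estimate its factor loading (beta).
    betas = {a: ols(factor, r)[1] for a, r in returns.items()}
    assets = list(returns)
    # Pass 2: cross-sectional regression of returns on betas, period by period.
    lambdas = []
    for t in range(len(factor)):
        xs = [betas[a] for a in assets]
        ys = [returns[a][t] for a in assets]
        lambdas.append(ols(xs, ys)[1])
    # Risk-premium estimate: time-series average of the period-by-period slopes.
    return betas, sum(lambdas) / len(lambdas)

# Synthetic example: returns driven by one factor with known loadings.
random.seed(0)
T = 240
factor = [random.gauss(0.5, 2.0) for _ in range(T)]
true_betas = {"A": 0.8, "B": 1.2, "C": 1.5}
returns = {a: [b * f + random.gauss(0, 1.0) for f in factor] for a, b in true_betas.items()}
betas, premium = fama_macbeth(returns, factor)
print(betas, premium)
```

The recovered betas should sit near the true loadings, and the averaged second-pass slope should approximate the factor's mean premium.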
This research studies panel-data models with mixed random parameters, which contain two types of parameters, one random and the other fixed. The random parameter arises from differences in the marginal slopes of the cross sections, while the fixed parameter arises from differences in the fixed intercepts; the random errors of each section exhibit heteroscedasticity as well as first-order serial correlation. The main objective of this research is to use efficient methods suited to such data in the case of small samples, and to achieve this goal, the feasible generalized least squares
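The feasible-GLS idea mentioned above can be illustrated with its simplest special case: two-step weighted least squares for a heteroscedastic simple regression (fit OLS, model the squared residuals to estimate the error variance, then re-fit with inverse-variance weights). This is a stdlib-only sketch with synthetic data and an assumed log-linear variance function, not the panel-data estimator of the study:

```python
import math
import random

def wls(x, y, w):
    """Weighted least squares of y on x with an intercept; returns (intercept, slope)."""
    sw = sum(w)
    mx = sum(wi * xi for wi, xi in zip(w, x)) / sw
    my = sum(wi * yi for wi, yi in zip(w, y)) / sw
    sxx = sum(wi * (xi - mx) ** 2 for wi, xi in zip(w, x))
    sxy = sum(wi * (xi - mx) * (yi - my) for wi, xi, yi in zip(w, x, y))
    slope = sxy / sxx
    return my - slope * mx, slope

def feasible_gls(x, y):
    n = len(x)
    # Step 1: ordinary least squares (unit weights) to obtain residuals.
    a0, b0 = wls(x, y, [1.0] * n)
    resid = [yi - (a0 + b0 * xi) for xi, yi in zip(x, y)]
    # Step 2: model the error variance; here log(e^2) is regressed on x
    # (a simple variance function chosen purely for illustration).
    g0, g1 = wls(x, [math.log(e * e + 1e-12) for e in resid], [1.0] * n)
    weights = [1.0 / math.exp(g0 + g1 * xi) for xi in x]
    # Step 3: re-estimate with inverse-variance weights.
    return wls(x, y, weights)

random.seed(1)
x = [random.uniform(1, 10) for _ in range(500)]
# Heteroscedastic errors: the standard deviation grows with x.
y = [2.0 + 0.5 * xi + random.gauss(0, 0.2 * xi) for xi in x]
a, b = feasible_gls(x, y)
print(a, b)
```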
Encryption translates data into another form or symbol so that only people with access to the secret key or a password can read it. Encrypted data are generally referred to as ciphertext, while unencrypted data are known as plaintext. Entropy can be used as a measure of the number of bits needed to code the data of an image: as the pixel values within an image are dispersed across more gray levels, the entropy increases. The aim of this research is to compare the CAST-128 method with a proposed adaptive key against the RSA encryption method for video frames, to determine the more accurate method with the highest entropy. The first method is achieved by applying the "CAST-128" and
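The entropy measure described above is ordinarily the Shannon entropy of the pixel histogram. A minimal stdlib-only sketch (the 8-bit grayscale frames here are synthetic):

```python
import math
import random
from collections import Counter

def shannon_entropy(pixels):
    """Shannon entropy (bits per pixel) of a sequence of gray-level values."""
    counts = Counter(pixels)
    n = len(pixels)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

random.seed(0)
# A flat (constant) frame carries no information: entropy 0.
flat = [128] * 10_000
# A frame with pixels spread uniformly over all 256 gray levels approaches
# the 8-bit maximum of log2(256) = 8 bits per pixel.
uniform = [random.randrange(256) for _ in range(10_000)]
print(shannon_entropy(flat), shannon_entropy(uniform))
```

A well-encrypted frame should look like the uniform case, with entropy close to the 8-bit maximum.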
The log-logistic distribution is an important statistical distribution, as it can be applied in many fields, biological experiments, and other experiments; its importance comes from the importance of determining the survival function in those experiments. This research compares the maximum likelihood method, the least squares method, and the weighted least squares method for estimating the parameters and survival function of the log-logistic distribution, using the comparison criteria MSE, MAPE, and IMSE; the research was applied to real data for breast cancer patients. The results showed that the maximum likelihood method was best in the case of estimating the paramete
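For reference, the log-logistic survival function has the closed form S(t) = 1 / (1 + (t/α)^β), with scale α and shape β; the scale is the median survival time, since S(α) = 0.5. A minimal sketch (the parameter values are illustrative, not estimates from the study's data):

```python
def loglogistic_survival(t, alpha, beta):
    """S(t) = P(T > t) for a log-logistic distribution with scale alpha, shape beta."""
    return 1.0 / (1.0 + (t / alpha) ** beta)

# Illustrative parameters: median survival of 24 months, shape 2.
alpha, beta = 24.0, 2.0
for t in (6, 12, 24, 48):
    print(t, loglogistic_survival(t, alpha, beta))
```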
The biggest problem for structural materials in a fusion reactor is the damage caused to them by the fusion-product neutrons. If this problem is overcome, an important milestone in fusion energy will be reached. One important problem for the structural material is that the nuclei forming it, on interacting with fusion neutrons, are transmuted to stable or radioactive nuclei via (n, x) reactions (x: alpha, proton, gamma, etc.). In particular, the concentration of helium gas in the structural material increases through deuterium-tritium (D-T) and (n, α) reactions, and this increase significantly changes the microstructure and the properties of the structural materials. T
Objective(s): To determine the impact of psychological distress on women's coping with breast cancer.
Methodology: A descriptive design is used throughout the present study. A convenience sample of (60) women with breast cancer is recruited from the community. Two instruments, a psychological distress scale and a coping scale, are developed for the study. Internal consistency reliability and content validity are obtained for the study instruments. Data are collected through the application of the study instruments. Data are analyzed through the use of descriptive and inferential statistical data analysis approaches.
Results: The study findings depict that women with breast cancer have experien
In this study, we compare the LASSO and SCAD methods, two penalization methods for dealing with models in partial quantile regression. The Nadaraya-Watson kernel was used to estimate the non-parametric part; in addition, the rule-of-thumb method was used to choose the smoothing bandwidth (h). The penalty methods proved efficient in estimating the regression coefficients, but the SCAD method was best according to the mean squared error (MSE) criterion, after the missing data were estimated using the mean imputation method.
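The non-parametric part mentioned above can be sketched with a stdlib-only Nadaraya-Watson estimator, using a Gaussian kernel and one common form of the rule-of-thumb bandwidth (Silverman's 1.06·sd·n^(−1/5)); the data are synthetic and the penalized quantile-regression part of the study is not reproduced:

```python
import math
import random

def rule_of_thumb_bandwidth(x):
    """Silverman's rule of thumb for a Gaussian kernel: h = 1.06 * sd * n^(-1/5)."""
    n = len(x)
    mean = sum(x) / n
    sd = math.sqrt(sum((xi - mean) ** 2 for xi in x) / (n - 1))
    return 1.06 * sd * n ** (-0.2)

def nadaraya_watson(x, y, x0, h):
    """Kernel-weighted local average: m(x0) = sum K_h(x0-xi)*yi / sum K_h(x0-xi)."""
    weights = [math.exp(-0.5 * ((x0 - xi) / h) ** 2) for xi in x]
    return sum(w * yi for w, yi in zip(weights, y)) / sum(weights)

random.seed(2)
x = [random.uniform(0, math.pi) for _ in range(800)]
y = [math.sin(xi) + random.gauss(0, 0.1) for xi in x]
h = rule_of_thumb_bandwidth(x)
est = nadaraya_watson(x, y, math.pi / 2, h)
print(h, est)  # estimate should be near sin(pi/2) = 1
```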
The purpose of this paper is to apply robustness in linear programming (LP) to eliminate the uncertainty problem in the constraint parameters and find the robust optimal solution, maximizing the profits of the General Productive Company of Vegetable Oils for the year 2019. This is done by modifying a linear-programming mathematical model in which some parameters have uncertain values, and processing it using the robust counterpart of linear programming to obtain results robust to the random changes in the uncertain values of the problem, assuming these values belong to the uncertainty set, selecting the values that cause the worst results, and to depend buil
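For a "≤" constraint with nonnegative variables and interval (box) uncertainty, the robust counterpart simply replaces each uncertain coefficient with its worst-case (largest) value. A tiny stdlib-only sketch of that transformation, with invented two-product numbers rather than the company's actual data, and a brute-force grid search standing in for a real LP solver:

```python
from itertools import product

def solve_2var_lp(c, A, b, grid=400):
    """Brute-force a small 2-variable LP: maximize c.x s.t. A x <= b, x >= 0."""
    hi = max(b)  # crude bounding box on each variable; enough for this illustration
    best, best_x = float("-inf"), None
    for i, j in product(range(grid + 1), repeat=2):
        x = (hi * i / grid, hi * j / grid)
        if all(row[0] * x[0] + row[1] * x[1] <= bi + 1e-9 for row, bi in zip(A, b)):
            val = c[0] * x[0] + c[1] * x[1]
            if val > best:
                best, best_x = val, x
    return best, best_x

# Nominal resource usage per unit of each product, uncertain within +/- delta.
nominal = [1.0, 2.0]
delta = [0.2, 0.3]
b = [10.0]        # resource availability
c = [3.0, 5.0]    # unit profits
# Robust counterpart: the worst case of a.x <= b over the box (with x >= 0)
# is attained at the largest coefficients, (nominal + delta).x <= b.
robust_A = [[n + d for n, d in zip(nominal, delta)]]
nominal_best, _ = solve_2var_lp(c, [nominal], b)
robust_best, _ = solve_2var_lp(c, robust_A, b)
print(nominal_best, robust_best)
```

The robust optimum is lower than the nominal one; that gap is the price paid for a plan that stays feasible under every realization in the uncertainty set.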
In this paper, the magnetohydrodynamic (MHD) flow of a Williamson fluid with varying temperature and concentration in an inclined channel with variable viscosity has been examined. A perturbation technique in terms of the Weissenberg number has been used to obtain explicit forms for the velocity field. The dependence of the solutions on the physical parameters (the Darcy parameter, Reynolds number, Peclet number, and magnetic parameter) is discussed for different values, as shown in the plots.