In this research, a nonparametric technique is presented for estimating the time-varying coefficient functions of balanced longitudinal data, in which observations are obtained from (n) independent subjects, each measured repeatedly at a set of (m) specific time points. Although the measurements are independent across the different subjects, they are usually correlated within each subject; the applied technique is the local linear kernel (LLPK) technique. To avoid the problems of dimensionality and heavy computation, the two-step method is used to estimate the coefficient functions with this technique. Since the two-step method relies on ordinary least squares (OLS), which is sensitive to non-normality in the data and to contamination of the errors, robust methods such as LAD and M estimation are proposed to strengthen the two-step method against non-normality and error contamination. Simulation experiments are performed to verify the performance of the classical and robust methods for the local linear kernel (LLPK) technique using two criteria, for different sample sizes and variance levels.
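As a rough illustration of the local linear kernel idea referred to above, the sketch below fits a kernel-weighted least-squares line around each evaluation time point; the Gaussian kernel, bandwidth value, synthetic data, and all variable names are illustrative assumptions and are not taken from the paper.

```python
import numpy as np

def local_linear_fit(t, y, t0, h):
    """Local linear kernel estimate of a smooth coefficient function at time t0.

    t, y : pooled observation times and responses over all subjects
    t0   : evaluation time point
    h    : bandwidth (a Gaussian kernel is assumed here purely for illustration)
    """
    u = (t - t0) / h
    w = np.exp(-0.5 * u ** 2)                          # kernel weights
    X = np.column_stack([np.ones_like(t), t - t0])     # local linear design matrix
    WX = X * w[:, None]
    beta = np.linalg.solve(X.T @ WX, WX.T @ y)         # weighted least squares
    return beta[0]                                     # intercept = fitted value at t0

# Illustrative use on synthetic balanced longitudinal data
rng = np.random.default_rng(0)
m, n = 20, 30                                          # m time points per subject, n subjects
t = np.tile(np.linspace(0.0, 1.0, m), n)
y = np.sin(2 * np.pi * t) + rng.normal(0.0, 0.3, t.size)
grid = np.linspace(0.0, 1.0, 50)
estimate = [local_linear_fit(t, y, t0, h=0.1) for t0 in grid]
```

In the robust variants discussed in the abstract, the weighted least-squares solve would be replaced by a weighted LAD or M-estimation fit; the structure of the local fit itself is unchanged.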
The study aims to make an in-depth analysis of the financial account components in the Iraqi balance of payments, because it reflects the economic position of the country towards the outside world and helps in decision-making about monetary and fiscal policy, finance, and foreign trade. The importance of FDI for Iraq lies in its being an important source of funding that also provides advanced technology and job opportunities, and it spares the country the negative effects of borrowing from abroad. To analyze the impact of direct and indirect foreign investment on the balance of payments and the financial account over the period 2003 to 2015, the study community and research sample were selected from the CBI Balance of Payments Department…
A simple analytical method was developed in the present work for the simultaneous quantification of Ciprofloxacin and Isoniazid in pharmaceutical preparations. UV-Visible spectrophotometry was applied to quantify these compounds in pure and mixed solutions using the first-order derivative method. The method relies on first-derivative spectrophotometry with zero-crossing, peak-to-baseline, peak-to-peak, and peak-area measurements. Good linearity was shown over the concentration range of 2 to 24 µg·mL⁻¹ for Ciprofloxacin and 2 to 22 µg·mL⁻¹ for Isoniazid in the mixture, with correlation coefficients of 0.9990 and 0.9989, respectively, in the peak-area mode. The limits of detection (LOD) and limits of quantification (LOQ) were…
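A minimal sketch of the first-derivative workflow described in this abstract is given below; the simulated spectrum, the zero-crossing wavelength, and the calibration values are purely illustrative assumptions, and the LOD/LOQ expressions (3.3σ/slope and 10σ/slope) are the standard ICH-style definitions rather than values taken from the truncated abstract.

```python
import numpy as np

# Illustrative absorbance spectrum on a uniform wavelength grid (placeholder data)
wavelengths = np.linspace(200.0, 400.0, 401)              # nm
absorbance = np.exp(-((wavelengths - 277.0) / 15.0) ** 2)

# First-derivative spectrum dA/dlambda, the basis of the derivative method
dA = np.gradient(absorbance, wavelengths)

# Zero-crossing measurement: read the mixture's derivative amplitude at the
# wavelength where the interfering component's derivative passes through zero
# (292 nm is a hypothetical value, not one reported in the paper)
zero_cross_nm = 292.0
amplitude = np.interp(zero_cross_nm, wavelengths, dA)

# Standard ICH-style limits from a calibration line: sigma is the standard
# deviation of the blank (or intercept) response, slope is the calibration slope
sigma, slope = 0.002, 0.045                               # illustrative values
LOD = 3.3 * sigma / slope
LOQ = 10.0 * sigma / slope
print(f"amplitude at zero-crossing: {amplitude:.4f}, LOD: {LOD:.2f}, LOQ: {LOQ:.2f}")
```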
The electrical power system has become vast and more complex, and it is subjected to sudden changes in load levels. Stability is an important concept that determines the stable operation of the power system. Transient stability analysis has become one of the significant studies in power systems, ensuring that the system can withstand a considerable disturbance. A temporary event can lead to malfunction of electronic control equipment. The application of flexible AC transmission system (FACTS) devices in the transmission network has introduced several changes to the power system. These changes have a significant impact on power system protection, due to differences in line impedance, line current…
Nuclear emission rates for nucleon-induced reactions are calculated theoretically based on the one-component exciton model using state densities from the non-equidistant spacing model (non-ESM). A fair comparison is made among state density values obtained from approximation formulae of various orders, alongside the zeroth-order formula corresponding to the ESM. Calculations were made for the 96Mo nucleus subjected to an (N,N) reaction at Emax = 50 MeV. The results showed that the non-ESM treatment of the state density significantly improves the emission rates calculated for the various exciton configurations. Three terms might suffice for a proper calculation, but the results kept changing even with ten terms; however, five terms was found to give…
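For reference, the zeroth-order (ESM) baseline that this abstract compares against can be written in the standard one-component form below; the Pauli correction and the non-ESM expansion terms developed in the paper are not reproduced here.

```latex
% Zeroth-order (ESM) one-component particle-hole state density
% (Williams form, Pauli correction omitted), with n = p + h excitons
% and single-particle level density g:
\begin{equation}
  \omega(p,h,E) \;=\; \frac{g\,(gE)^{\,n-1}}{p!\,h!\,(n-1)!}, \qquad n = p + h .
\end{equation}
% Standard one-component exciton-model nucleon emission rate, in which the
% state-density ratio enters directly (s_b, \mu_b, \sigma_{\mathrm{inv}} are the
% ejectile spin, reduced mass, and inverse cross section; U is the residual energy):
\begin{equation}
  W_b(n,E,\varepsilon) \;=\;
  \frac{(2s_b+1)\,\mu_b\,\varepsilon}{\pi^2\hbar^3}\,
  \sigma_{\mathrm{inv}}(\varepsilon)\,
  \frac{\omega(p-1,\,h,\,U)}{\omega(p,\,h,\,E)} .
\end{equation}
```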
Nowadays, people's expression on the Internet is no longer limited to text, especially with the rise of the short-video boom, which has led to the emergence of a large amount of modal data such as text, pictures, audio, and video. Compared with single-modal data, multi-modal data always contain far more information, and mining this multi-modal information can help computers better understand human emotional characteristics. However, because multi-modal data exhibit obvious dynamic time-series features, the dynamic correlation problem within a single modality and between different modalities in the same application scene must be solved during the fusion process. To solve this problem, this paper proposes a feature extraction framework of…
In this research, a recently developed practical modeling technique is applied to the identification of the glucose regulation system. Using this technique, a set of mathematical models is obtained instead of a single one, to compensate for the loss of information caused by the optimization step in curve-fitting algorithms. The diversity of the members within the set is interpreted in terms of a restricted range of its parameters. A diagnostic criterion is also developed for detecting any disorder in the glucose regulation system by investigating the influence of parameter variation on the response of the system. The technique is applied in practice to 20 cases in association with the National Center for…
Data scarcity is a major challenge when training deep learning (DL) models. DL demands a large amount of data to achieve exceptional performance. Unfortunately, many applications have too little or inadequate data to train DL frameworks. Usually, manual labeling is needed to provide labeled data, which typically involves human annotators with a vast background of knowledge. This annotation process is costly, time-consuming, and error-prone. Every DL framework is usually fed with a significant amount of labeled data to automatically learn representations. Ultimately, a larger amount of data yields a better DL model, although performance is also application dependent. This issue is the main barrier for…
In this work, a test room of (2 × 1.5 × 1.5) m³ was built in Baghdad city, and the solar chimneys (SC) were designed with an aspect ratio (ar) greater than 12. The test room was supplied with several solar collectors: a vertical collector with a single air pass and ar equal to 25, and a 45° tilted collector with a double air pass and ar equal to 50 for each pass. Both collectors consist of a flat thermal energy storage box collector (TESB) covered by a transparent clear acrylic sheet. The third type of collector is an array of evacuated tubular collectors with a thermosyphon at 45°, installed at the bottom of the TESB of the vertical SC. The TESB was…