Nearly everyone is connected through social media platforms (Facebook, Twitter, LinkedIn, Instagram, etc.), which generate quantities of data that traditional applications are inadequate to process. Social media are regarded as an important platform for sharing the information, opinions, and knowledge of many subscribers. These basic media attributes of big data also give rise to many issues, such as data collection, storage, transfer, updating, reviewing, posting, scanning, visualization, and data protection. To deal with all these problems, there is a need for an adequate system that not only prepares the data but also provides meaningful analysis to take advantage of difficult situations relevant to business, proper decision-making, health, social media, sc
The current study aimed to determine the relation between blood lead levels in traffic men and the nature of their traffic work in Baghdad governorate. Blood samples were collected from 10 traffic men, aged 20-39 years, from the Directorate of Traffic Al Rusafa/ Baghdad, together with another 10 control samples from traffic men aged 30-49 years who lived in relatively clear cities or areas containing very few traffic zones. Blood lead levels were estimated using Atomic Absorption Spectrometry.
The results indicated no rise in the blood lead levels of the traffic men; lead concentrations, ranging up to about 14 ppm, pose no health concern for the traffic police, as they are within the permissible limits, Ap
To ascertain the stability or instability of the time series, three versions of the Dickey-Fuller test were used in this paper. The aim of this study is to explain the extent of the impact of some economic variables, such as the money supply, gross domestic product, and national income, after establishing the stationarity of these variables. The results show that the money supply, GDP, and exchange rate variables were all stationary at the first difference of the time series, meaning the series are integrated of order one. Hence, the gross fixed capital formation, national income, and interest rate variables
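As a sketch of the idea behind the Dickey-Fuller test, the regression Δy_t = α + ρ·y_{t-1} + ε_t can be fitted by ordinary least squares and the t-statistic of ρ compared with the test's critical values (about −2.86 at the 5% level with a constant term and a large sample). The series and seed below are illustrative, not data from the study:

```python
import numpy as np

def dickey_fuller_stat(y):
    """t-statistic of rho in the regression diff(y)_t = alpha + rho*y_{t-1} + e_t."""
    dy = np.diff(y)
    ylag = y[:-1]
    X = np.column_stack([np.ones_like(ylag), ylag])
    beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
    resid = dy - X @ beta
    sigma2 = resid @ resid / (len(dy) - 2)          # residual variance
    cov = sigma2 * np.linalg.inv(X.T @ X)           # OLS covariance matrix
    return beta[1] / np.sqrt(cov[1, 1])

rng = np.random.default_rng(0)
rw = np.cumsum(rng.normal(size=500))                # random walk: has a unit root
stat_level = dickey_fuller_stat(rw)                 # usually above -2.86: cannot reject a unit root
stat_diff = dickey_fuller_stat(np.diff(rw))         # far below -2.86: first difference is stationary
```

A series that is non-stationary in levels but stationary after first differencing, as here, is integrated of order one, matching the paper's finding for money supply, GDP, and the exchange rate.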
Canonical correlation analysis is one of the common methods for analyzing data and identifying the relationship between two sets of variables under study, as it depends on analyzing the variance matrix or the correlation matrix. Researchers resort to many methods to estimate the canonical correlation (CC); some are biased by outliers, and others are resistant to those values. In addition, there are criteria that check the efficiency of the estimation methods.
In our research, we dealt with robust estimation methods that depend on the correlation matrix in the analysis process to obtain a robust canonical correlation coefficient, namely the method of Biwe
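The classical (non-robust) computation the abstract builds on can be sketched as follows: whiten each block of variables with a Cholesky factor of its covariance and take the top singular value of the whitened cross-covariance. The data below are synthetic:

```python
import numpy as np

def first_canonical_corr(X, Y):
    """Largest canonical correlation between the column sets X and Y."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    Sxx, Syy, Sxy = Xc.T @ Xc, Yc.T @ Yc, Xc.T @ Yc
    # whiten each block with a Cholesky factor, then take the top singular value
    Lx = np.linalg.cholesky(Sxx)
    Ly = np.linalg.cholesky(Syy)
    K = np.linalg.solve(Lx, Sxy) @ np.linalg.inv(Ly).T
    return np.linalg.svd(K, compute_uv=False)[0]

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
Y = X @ rng.normal(size=(3, 2))     # Y is an exact linear map of X
r = first_canonical_corr(X, Y)      # close to 1: the two sets share a common direction
```

A robust variant of the kind studied in the paper would replace the cross-product matrices with a robust correlation matrix (e.g. one built from biweight estimates) before the same whitening step.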
The monthly series of Total Suspended Solids (TSS) concentrations in the Euphrates River at Nasria was analyzed as a time series. The data used for the analysis were the monthly series for the period 1977-2000.
The series was tested for nonhomogeneity and found to be nonhomogeneous: a significant positive jump was observed after 1988. This nonhomogeneity was removed using a method suggested by Yevjevich (7). The homogeneous series was then normalized using the Box and Cox (2) transformation. The periodic component of the series was fitted using harmonic analysis and removed from the series to obtain the dependent stochastic component. This component was then modeled using a first-order autoregressive model (Markovian chain). The above a
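The pipeline described above (transform, remove the periodic component, fit an AR(1) model) can be sketched on synthetic monthly data; the series, cycle amplitude, and coefficients below are illustrative, not the Euphrates record:

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(24 * 12)                  # 24 synthetic years of monthly values

# positive monthly series with an annual cycle, standing in for the TSS record
x = 100 * np.exp(0.5 * np.sin(2 * np.pi * t / 12) + 0.2 * rng.normal(size=t.size))

y = np.log(x)                           # Box-Cox transform with lambda = 0 is the log

# harmonic fit of the annual period (first harmonic only, as a sketch)
A = np.column_stack([np.ones(t.size),
                     np.cos(2 * np.pi * t / 12),
                     np.sin(2 * np.pi * t / 12)])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
z = y - A @ coef                        # dependent stochastic component

# first-order autoregressive (Markovian) fit: z_t = phi * z_{t-1} + e_t
phi = (z[1:] @ z[:-1]) / (z[:-1] @ z[:-1])
```

The fitted harmonic coefficients recover the annual cycle, and phi measures the month-to-month persistence left in the stochastic component.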
In recent years many researchers have developed methods to estimate the self-similarity and long-memory parameter best known as the Hurst parameter. In this paper, we compare nine different methods. Most of them use the slope of deviations to estimate the Hurst parameter, such as Rescaled Range (R/S), Aggregate Variance (AV), and Absolute Moments (AM), while some depend on filtration techniques, such as Discrete Variations (DV), Variance Versus Level using wavelets (VVL), and Second-Order Discrete Derivative using Wavelets (SODDW). The comparison was set up as a simulation study to find the most efficient method through MASE. The results of the simulation experiments showed that the performance of the meth
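The rescaled-range (R/S) method mentioned above can be sketched as follows: compute R/S over blocks of growing size and fit the slope of log(R/S) against log(block size). The block sizes and test series are illustrative choices:

```python
import numpy as np

def hurst_rs(x, min_block=8):
    """Rescaled-range (R/S) estimate of the Hurst parameter H."""
    x = np.asarray(x, dtype=float)
    sizes, rs_means = [], []
    size = min_block
    while size <= x.size // 2:
        rs = []
        for start in range(0, x.size - size + 1, size):
            seg = x[start:start + size]
            dev = np.cumsum(seg - seg.mean())        # cumulative deviations from the mean
            s = seg.std()
            if s > 0:
                rs.append((dev.max() - dev.min()) / s)
        sizes.append(size)
        rs_means.append(np.mean(rs))
        size *= 2
    # log(R/S) grows roughly like H * log(n); H is the fitted slope
    slope, _ = np.polyfit(np.log(sizes), np.log(rs_means), 1)
    return slope

rng = np.random.default_rng(2)
h = hurst_rs(rng.normal(size=4096))    # white noise: estimate near 0.5 (biased slightly high)
```

For independent noise H should be near 0.5; long-memory series push the estimate toward 1, which is what the competing estimators in the paper all try to measure.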
Introduction:
Many business owners suffer major financial problems during periods of financial stagnation, declining markets and businesses, or under the impact of financial shocks, which for various reasons result in large debts and the consequent financial and legal obligations. This is the beginning of a long and seemingly endless path of suffering and searching for a safe exit. It is even worse when financial institutions facilitate financial solutions that rely on lending as a solution to their financial problem: the debt and its consequences increase, and the problem deepens and becomes complicated until things become entangled and the escape or declaration of bank
The 3-parameter Weibull distribution is used as a failure model, since this distribution is appropriate when the failure rate is somewhat high at the start of operation and decreases with increasing time.
On the practical side, a comparison was made between shrinkage and maximum likelihood estimators for the parameters and the reliability function using simulation. We conclude that the shrinkage estimators for the parameters are better than the maximum likelihood estimators, but the maximum likelihood estimator for the reliability function is the better one, based on the statistical measures MAPE and MSE and on different sample sizes.
Note: ns = small sample; nm = median sample.
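As an illustration of the maximum likelihood side of such a comparison, the 3-parameter Weibull can be fitted with SciPy; the shape, location, and scale values below are assumed for the demonstration, not taken from the paper:

```python
import numpy as np
from scipy.stats import weibull_min

# assumed true values: shape (beta), location (gamma), scale (eta)
beta_true, gamma_true, eta_true = 1.8, 2.0, 5.0

rng = np.random.default_rng(3)
data = weibull_min.rvs(beta_true, loc=gamma_true, scale=eta_true,
                       size=500, random_state=rng)

# maximum likelihood fit of all three parameters at once
beta_hat, gamma_hat, eta_hat = weibull_min.fit(data)

def reliability(t, beta, gamma, eta):
    """R(t) = exp(-(((t - gamma) / eta) ** beta)) for t >= gamma."""
    return np.exp(-(np.maximum(t - gamma, 0.0) / eta) ** beta)
```

A shrinkage estimator of the kind compared in the paper would pull these MLEs toward a prior guess of the parameters; MSE and MAPE over repeated samples then decide which estimator wins.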
In this paper, estimates have been made for the parameters and the reliability function of the Transmuted Power Function (TPF) distribution using several estimation methods: a proposed new technique for the White, percentile, least squares, weighted least squares, and modified moment methods. A simulation was used to generate random data that follow the TPF distribution in three experiments (E1, E2, E3) of real parameter values, with sample sizes (n = 10, 25, 50 and 100), iteration samples (N = 1000), and reliability times (0 < t < 0). Comparisons were made between the results obtained from the estimators using mean square error (MSE). The results showed the
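Assuming the usual transmuted form F(x) = (1 + λ)G(x) − λG(x)² with the power-function baseline G(x) = (x/θ)^α on 0 < x < θ, random TPF data of the kind used in such simulations can be drawn by inverse transform; the parameter values below are illustrative:

```python
import numpy as np

alpha, theta, lam = 2.0, 1.0, 0.5      # assumed parameter values for the sketch

def tpf_cdf(x, alpha, theta, lam):
    """Transmuted power function cdf on 0 < x < theta."""
    g = (x / theta) ** alpha
    return (1 + lam) * g - lam * g ** 2

def tpf_rvs(n, alpha, theta, lam, rng):
    """Inverse-transform sampling: solve lam*G^2 - (1+lam)*G + u = 0 for G."""
    u = rng.uniform(size=n)
    if lam == 0:
        g = u
    else:
        g = ((1 + lam) - np.sqrt((1 + lam) ** 2 - 4 * lam * u)) / (2 * lam)
    return theta * g ** (1 / alpha)

rng = np.random.default_rng(4)
sample = tpf_rvs(5000, alpha, theta, lam, rng)
```

With such samples, each estimation method produces parameter estimates whose MSE against the true values can be averaged over the N iterations, which is the comparison the paper reports.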
As the smallest functional parts of the muscle, motor units (MUs) are considered the basic building blocks of the neuromuscular system. Monitoring MU recruitment, de-recruitment, and firing rate (by either invasive or surface techniques) leads to an understanding of motor control strategies and of their pathological alterations. EMG signal decomposition is the process of identifying and classifying individual motor unit action potentials (MUAPs) in the interference pattern detected with either intramuscular or surface electrodes. Signal processing techniques are used in EMG signal decomposition to address fundamental and physiological questions. Many techniques have been developed to decompose intramuscularly detec
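A minimal sketch of the detection stage of decomposition, using amplitude thresholding on a synthetic interference pattern; the sampling rate, firing rate, and single-sample spike shape are idealized assumptions, not a real MUAP model:

```python
import numpy as np
from scipy.signal import find_peaks

rng = np.random.default_rng(5)
fs = 10_000                               # assumed sampling rate, Hz
signal = 0.05 * rng.normal(size=fs)       # one second of baseline noise

# hypothetical motor unit firing at 10 Hz; each MUAP idealized as a single spike
firing_times = np.arange(0.05, 1.0, 0.1)
for ft in firing_times:
    signal[int(ft * fs)] += 1.0

# amplitude-threshold detection, a common first stage before MUAP classification
peaks, _ = find_peaks(np.abs(signal), height=0.5, distance=int(0.005 * fs))
```

Real decomposition systems follow this detection stage with template matching or clustering to assign each detected MUAP to its motor unit, and with resolution of superimposed waveforms.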