A database is an organized collection of data, stored and distributed in a way that allows users to access it easily and conveniently. In the era of big data, however, traditional data-analytics methods may not be able to manage and process such large amounts of data. To develop an efficient way of handling big data, this work studies the use of the MapReduce technique to handle big data distributed on the cloud. The approach was evaluated on a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear improvement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r...
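As a rough illustration of the MapReduce pattern the abstract refers to, and not the authors' actual Hadoop job, the following Python sketch averages signal amplitude per EEG channel; the record format and channel names are assumptions made for the example.

# Minimal MapReduce-style sketch; the "channel,amplitude" record format is hypothetical.
from collections import defaultdict

def map_phase(records):
    # Map step: emit (channel, amplitude) pairs from raw EEG text records.
    for record in records:
        channel, amplitude = record.split(",")
        yield channel, float(amplitude)

def reduce_phase(pairs):
    # Reduce step: average the amplitudes collected for each channel.
    sums, counts = defaultdict(float), defaultdict(int)
    for channel, amplitude in pairs:
        sums[channel] += amplitude
        counts[channel] += 1
    return {ch: sums[ch] / counts[ch] for ch in sums}

if __name__ == "__main__":
    sample = ["Fp1,12.5", "Fp1,11.0", "Cz,8.2"]
    print(reduce_phase(map_phase(sample)))  # {'Fp1': 11.75, 'Cz': 8.2}

On a real Hadoop deployment the map and reduce phases would run as distributed tasks over blocks of the EEG files, which is where the reported reduction in response time would come from.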
Multicollinearity, a strong internal correlation between the explanatory variables, is one of the most common problems in regression analysis and appears especially in economics and applied research. Multicollinearity has a negative effect on the regression model, such as inflated variances and unstable parameter estimates when the ordinary least squares (OLS) method is used. Therefore, other methods were used to estimate the parameters of the negative binomial model, including the ridge regression estimator and the Liu-type estimator. The negative binomial regression model is a nonlinear...
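For context, a sketch of the two shrinkage estimators named above, written for a generic design matrix $X$ and response $y$; the authors' negative binomial versions apply the same idea to the model's likelihood-based estimates rather than to OLS directly:

$$\hat{\beta}_{\text{Ridge}} = (X^{T}X + kI)^{-1}X^{T}y, \qquad k > 0,$$
$$\hat{\beta}_{\text{Liu}} = (X^{T}X + I)^{-1}(X^{T}X + dI)\,\hat{\beta}_{\text{OLS}}, \qquad 0 < d < 1.$$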
This study aimed to select top stocks using technical analysis tools, specifically the indicator called the William index ratio, and to test the ability of technical analysis tools to build an efficient stock portfolio in comparison with the market portfolio. This technical tool was used to build a portfolio from 21 companies under specific screening conditions, from which 10 companies were chosen for the period from March 2015 to June 2017. The applied results of the research showed that the portfolio yield for the companies selected according to the William index ratio indicator was (0.0406) that...
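Assuming the "William index ratio" refers to the Williams %R oscillator (an assumption, since the abstract does not define it), its standard form over an n-period window is:

$$\%R = \frac{\text{Highest High}_{n} - \text{Close}}{\text{Highest High}_{n} - \text{Lowest Low}_{n}} \times (-100)$$

Values near $-100$ are conventionally read as oversold and values near $0$ as overbought, which is how such an indicator would be used to screen candidate companies.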
This paper presents a hybrid approach for solving the null-values problem; it hybridizes rough set theory with an intelligent swarm algorithm. The proposed approach is a supervised learning model: a large set of complete data, called the learning data, is used to find the decision rule sets that are then used in solving the incomplete-data problem. The intelligent swarm algorithm is used for feature selection, with the bees algorithm acting as the heuristic search algorithm combined with rough set theory as the evaluation function. Another feature selection algorithm, ID3, is also presented; it works as a statistical algorithm instead of an intelligent one. A comparison is made between the two approaches in terms of their performance in null-value estimation...
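A minimal sketch of the rule-based imputation idea, under the assumption that decision rules learned from the complete learning data map matching condition-attribute values to the missing value; the bees-algorithm feature selection step is omitted and all names and data below are hypothetical:

# Hypothetical rule-based null-value estimation (not the authors' implementation).
from collections import Counter

def learn_rules(complete_rows, target):
    # Map each combination of condition-attribute values to its most frequent target value.
    table = {}
    for row in complete_rows:
        key = tuple(sorted((a, v) for a, v in row.items() if a != target))
        table.setdefault(key, Counter())[row[target]] += 1
    return {key: counts.most_common(1)[0][0] for key, counts in table.items()}

def fill_null(row, target, rules):
    # Fill a missing target value with the value of the matching learned rule, if any.
    key = tuple(sorted((a, v) for a, v in row.items() if a != target))
    if row.get(target) is None:
        row[target] = rules.get(key)
    return row

learning = [{"age": "young", "bp": "high", "class": "yes"},
            {"age": "young", "bp": "high", "class": "yes"},
            {"age": "old", "bp": "low", "class": "no"}]
rules = learn_rules(learning, "class")
print(fill_null({"age": "old", "bp": "low", "class": None}, "class", rules))  # class -> "no"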
In this paper we used frequentist and Bayesian approaches to the linear regression model to predict future observations of unemployment rates in Iraq. Parameters are estimated using the ordinary least squares method and, for the Bayesian approach, using the Markov Chain Monte Carlo (MCMC) method. Calculations are done using the R program. The analysis showed that the linear regression model using the Bayesian approach is better and can be used as an alternative to the frequentist approach. Two criteria, the root mean square error (RMSE) and the median absolute deviation (MAD), were used to compare the performance of the estimates. The results obtained showed that the unemployment rates will continue to increase in the next two decades...
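The two comparison criteria are standard; written for observed values $y_i$ and predictions $\hat{y}_i$ (the exact MAD variant used by the authors is not stated, so the prediction-error form is assumed):

$$\text{RMSE} = \sqrt{\frac{1}{n}\sum_{i=1}^{n}(y_i - \hat{y}_i)^2}, \qquad \text{MAD} = \operatorname{median}_i \left| y_i - \hat{y}_i \right|$$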
In this research we discussed parameter estimation and variable selection in the Tobit quantile regression model in the presence of the multicollinearity problem. We used the elastic net technique as an important tool for dealing with both multicollinearity and variable selection. Depending on the data, we proposed a Bayesian hierarchical Tobit model with four levels of prior distributions. We assumed that both tuning parameters are random variables and estimated them together with the other unknown parameters in the model. A simulation study was used to illustrate the efficiency of the proposed method, and we then compared our approach with that of Alhamzwi (2014) and with standard quantile regression (QR). The results illustrated that our approach...
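For reference, the generic elastic net penalty on the regression coefficients $\beta$ combines the lasso and ridge terms; the two tuning parameters treated as random variables in the hierarchical model presumably correspond to $\lambda_1$ and $\lambda_2$ below (an assumption):

$$P_{\lambda_1,\lambda_2}(\beta) = \lambda_1 \sum_{j=1}^{p}\lvert\beta_j\rvert + \lambda_2 \sum_{j=1}^{p}\beta_j^{2}$$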
Sewer systems convey sewage and/or storm water to sewage treatment plants for disposal through a network of buried sewer pipes, gutters, manholes and pits. Unfortunately, sewer pipes deteriorate with time, leading either to pipe collapse with traffic disruption, or to pipe clogging that causes flooding and environmental pollution. Thus, the management and maintenance of the buried pipes are important tasks that require information about changes in current and future sewer pipe conditions. In this research, the study was carried out in Baghdad, Iraq, and two deterioration models, multinomial logistic regression and a neural network deterioration model (NNDM), are used to predict future sewer conditions. The results of the...
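A hedged sketch of the multinomial-logistic-regression half of the comparison; the pipe attributes, condition grades and data below are invented for illustration and are not the Baghdad dataset:

# Hypothetical multinomial logistic regression deterioration model.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = np.column_stack([
    rng.uniform(0, 60, 200),     # pipe age in years (assumed predictor)
    rng.uniform(150, 600, 200),  # diameter in mm (assumed predictor)
    rng.uniform(1, 6, 200),      # burial depth in m (assumed predictor)
])
# Toy rule: older, deeper pipes tend toward worse condition grades 1-3.
y = np.digitize(X[:, 0] / 60 + X[:, 2] / 6 + rng.normal(0, 0.2, 200), [0.7, 1.3]) + 1

# scikit-learn fits a multinomial (softmax) model for multi-class targets with the default lbfgs solver.
model = LogisticRegression(max_iter=1000).fit(X, y)
print(model.predict_proba([[40, 300, 4.0]]))  # condition-grade probabilities for one pipe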
In this research, a comparison has been made between the robust (M) estimators for the cubic smoothing splines technique, used to avoid the problem of non-normality or contamination in the errors, and the traditional estimation method for the cubic smoothing splines technique, using two comparison criteria (MADE, WASE) for different sample sizes and disparity levels to estimate the time-varying coefficient functions for balanced longitudinal data. Such data are characterized by observations obtained from (n) independent subjects, each of which is measured repeatedly at a set of (m) specific time points; since the repeated measurements within a subject are typically correlated and...
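As a generic form of the comparison (the authors' exact formulation for time-varying coefficient functions may differ), the traditional cubic smoothing spline minimizes a squared-error loss plus a roughness penalty, while the robust M-type version replaces the squared loss with a bounded-influence function $\rho$ such as Huber's:

$$\hat{f} = \arg\min_{f}\ \sum_{i=1}^{n}\rho\bigl(y_i - f(t_i)\bigr) + \lambda\int \bigl(f''(t)\bigr)^{2}\,dt,$$

with $\rho(u)=u^{2}$ recovering the classical estimator and $\lambda$ controlling the smoothness of the fit.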
The research aims to show the importance of using the time-driven activity-based costing (TDABC) approach and its role in determining product costs more equitably, and thus its impact on resource-allocation policy, by reflecting the changes that occur continuously in product specifications and hence in the nature and type of operations. The research was conducted at the General Company for Textile Industries in Wasit, at the knitted-socks factory, and was based on the main hypothesis that it is possible to calculate the cost of the activities that drive production through the time it takes to run these activities, which can then be redistributed to the product cost...
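That hypothesis rests on the two standard TDABC relations, stated here generically rather than as the factory's actual figures:

$$\text{capacity cost rate} = \frac{\text{cost of capacity supplied}}{\text{practical capacity (time available)}}, \qquad \text{activity cost} = \text{capacity cost rate} \times \text{unit time of the activity}$$

Summing the activity costs driven by each product's time consumption then yields the product cost that the abstract describes as being redistributed.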