Big data analysis is essential for modern applications in areas such as healthcare, assistive technology, intelligent transportation, and environmental and climate monitoring. Traditional data mining and machine learning algorithms do not scale well with data size. Mining and learning from big data require time- and memory-efficient techniques, albeit at the cost of a possible loss in accuracy. We have developed a data aggregation structure to summarize data with a large number of instances and data generated from multiple data sources. Data are aggregated at multiple resolutions, and the resolution provides a trade-off between efficiency and accuracy. The structure is built once, updated incrementally, and serves as a common data input for multiple mining and learning algorithms.
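The abstract does not detail the structure's internals; the following is a minimal sketch of the multi-resolution aggregation idea only, where the class name, bucket widths, and the choice of count/sum/min/max summaries are illustrative assumptions rather than the authors' design:

# Hypothetical sketch of a multi-resolution aggregation structure: each
# level buckets a numeric key at a coarser granularity and keeps
# incrementally updated summaries, so queries can trade accuracy for speed
# by choosing a level. All names and parameters are illustrative.
from collections import defaultdict

class MultiResolutionAggregate:
    def __init__(self, bucket_widths=(1.0, 10.0, 100.0)):
        # One summary table per resolution level (fine -> coarse);
        # each bucket holds [count, sum, min, max].
        self.bucket_widths = bucket_widths
        self.levels = [defaultdict(lambda: [0, 0.0, float("inf"), float("-inf")])
                       for _ in bucket_widths]

    def insert(self, key, value):
        # Incremental update: every level is refreshed in O(1) per record.
        for width, level in zip(self.bucket_widths, self.levels):
            s = level[int(key // width)]
            s[0] += 1                  # count
            s[1] += value              # sum
            s[2] = min(s[2], value)    # min
            s[3] = max(s[3], value)    # max

    def mean(self, key, resolution=0):
        # Coarser resolutions answer from fewer, larger buckets.
        s = self.levels[resolution].get(int(key // self.bucket_widths[resolution]))
        return None if s is None or s[0] == 0 else s[1] / s[0]

Coarser levels summarize many records per bucket, which is where the efficiency-versus-accuracy trade-off described above comes from.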
In this research, the semiparametric Bayesian method is compared with the classical method for estimating the reliability functions of three systems: the k-out-of-n system, the series system, and the parallel system. Each system consists of three components: the first is a parametric component whose failure times are exponentially distributed, whereas the second and third are nonparametric components whose reliability estimates depend on the kernel method, using two methods to estimate the bandwidth parameter h, and on the Kaplan-Meier method. To indicate the better method for estimating the system reliability function, it has be
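The three system configurations named here have standard closed-form reliabilities in terms of component reliabilities. The sketch below evaluates them for three components; only the exponential form is taken from the abstract, and the remaining component values are illustrative placeholders, not the paper's kernel or Kaplan-Meier estimates:

from itertools import product
from math import exp

# Exponential component reliability at time t (as in the abstract);
# lam is an illustrative failure rate.
def r_exp(t, lam=0.5):
    return exp(-lam * t)

def series(rs):
    # A series system works only if every component works.
    out = 1.0
    for r in rs:
        out *= r
    return out

def parallel(rs):
    # A parallel system fails only if every component fails.
    out = 1.0
    for r in rs:
        out *= (1.0 - r)
    return 1.0 - out

def k_out_of_n(rs, k):
    # Enumerate all up/down states; fine for small n such as n = 3.
    total = 0.0
    for states in product([0, 1], repeat=len(rs)):
        if sum(states) >= k:
            p = 1.0
            for up, r in zip(states, rs):
                p *= r if up else (1.0 - r)
            total += p
    return total

rs = [r_exp(1.0), 0.8, 0.7]   # three components; last two are placeholders
print(series(rs), parallel(rs), k_out_of_n(rs, 2))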
Wireless sensor applications are susceptible to energy constraints. Most of the energy is consumed in communication between wireless nodes. Clustering and data aggregation are two widely used strategies for reducing energy usage and increasing the lifetime of wireless sensor networks. In target tracking applications, a large amount of redundant data is produced regularly. Hence, the deployment of effective data aggregation schemes is vital to eliminate data redundancy. This work aims to conduct a comparative study of various research approaches that employ clustering techniques for efficiently aggregating data in target tracking applications, as the selection of an appropriate clustering algorithm may yield positive results in the data aggregation process.
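As a toy illustration of the aggregation step such schemes share (the clustering itself is abstracted away here, and the averaging fusion rule is an assumption, not one drawn from the surveyed papers):

# Minimal sketch of in-cluster data aggregation: each cluster head fuses
# its members' often-redundant readings into one value before forwarding
# it to the sink, reducing radio traffic.
from statistics import mean

def aggregate_by_cluster(readings):
    """readings: list of (cluster_id, value) tuples from sensor nodes."""
    clusters = {}
    for cid, value in readings:
        clusters.setdefault(cid, []).append(value)
    # One fused packet per cluster instead of one packet per node.
    return {cid: mean(values) for cid, values in clusters.items()}

readings = [(1, 21.9), (1, 22.1), (1, 22.0), (2, 30.4), (2, 30.6)]
print(aggregate_by_cluster(readings))   # {1: 22.0, 2: 30.5}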
A two-time-step stochastic multi-variable, multi-site hydrological data forecasting model was developed and verified using a case study. The philosophy of this model is to use the cross-variable correlations, the cross-site correlations, and the two-step time-lag correlations simultaneously for estimating the parameters of the model, which are then modified using the mutation process of a genetic algorithm optimization model. The objective function to be minimized is the Akaike test value. The case study covers four variables and three sites. The variables are the monthly air temperature, humidity, precipitation, and evaporation; the sites are Sulaimania, Chwarta, and Penjwin, which are located in northern Iraq. The model performance was
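The mutation-driven refinement can be sketched as follows; the Gaussian mutation scale, the least-squares AIC form n*ln(RSS/n) + 2k, and the rss_of interface are all illustrative assumptions rather than the paper's exact formulation:

import random
from math import log

def aic(rss, n, k):
    # A common AIC form for least-squares fits: n*ln(RSS/n) + 2k,
    # where n is the number of observations and k the parameter count.
    return n * log(rss / n) + 2 * k

def mutate(params, scale=0.1):
    # Gaussian mutation of each model parameter (illustrative choice).
    return [p + random.gauss(0.0, scale) for p in params]

def refine(params, rss_of, n, iters=1000):
    """Mutation-only refinement: keep a mutant whenever it lowers AIC.
    rss_of maps a parameter vector to the residual sum of squares of
    the fitted model on the calibration data."""
    best, best_aic = params, aic(rss_of(params), n, len(params))
    for _ in range(iters):
        cand = mutate(best)
        cand_aic = aic(rss_of(cand), n, len(cand))
        if cand_aic < best_aic:
            best, best_aic = cand, cand_aic
    return best, best_aic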
... Show MoreThe study was carried out by reinforcing the resin matrix material
which was (Epoxy- Ep828) by useing Kevlar fibers and glass fibers type (E-gl ass) both or them in the form of woven roving and poly propylene tlbcrs in the form chopped strand mats. wi th (30%) volume fraction. Some mechan i cal properties of the were prepared composite specimens U ltraviolet radiation were stuied after being subjected to different weathering conditi ons i ncluded. Compression and hardness testing were carried out using Briel! method so as to compare between composite behavior i n the environments previously mentioned .
<
Increasing global competition and the continuous improvement of information technology have paved the way for the development of modern systems and the use of modern techniques. Among these techniques are the benchmarking style and Total Quality Management, both of which are used to improve the production process and to eliminate losses.
The benchmarking style has become very important for industrial systems and service systems alike, and an instrument for improving their performance, especially for those suffering from high costs or wasted time.
This study aims to rely on the virtual benchmarking style in the eval
Poverty is defined as a low standard of living, in the sense that a poor person cannot afford a minimum standard of living. The phenomenon of poverty is one of the most serious problems that must be dealt with seriously. It has persisted in Iraq for decades because of the harsh economic conditions and the unstable security situation caused by the crises the country has faced since 2013, and it requires much study and analysis, with rural areas as a special case. In this study, the researcher examined the poverty line as a criterion for estimating the poverty indicators, which include the poverty percentage H, the poverty gap PG, and the poverty intensity PS, based on the continuous social and economic survey data for households in 2014. The ma
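The three indicators named here are conventionally the first three members of the Foster-Greer-Thorbecke (FGT) family, computed against the poverty line z. A short sketch with invented illustrative figures (not the 2014 survey data):

def fgt(incomes, z, alpha):
    # Foster-Greer-Thorbecke index P_alpha for poverty line z:
    # P_alpha = (1/n) * sum(((z - y)/z)**alpha) over incomes y below z.
    n = len(incomes)
    return sum(((z - y) / z) ** alpha for y in incomes if y < z) / n

incomes = [40, 55, 70, 90, 120, 150]   # illustrative household incomes
z = 100                                 # illustrative poverty line
H  = fgt(incomes, z, 0)   # poverty percentage (headcount ratio)
PG = fgt(incomes, z, 1)   # poverty gap
PS = fgt(incomes, z, 2)   # poverty intensity (severity)
print(H, PG, PS)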
In this research, the influence of the distance factor on the optimal working frequency (FOT) parameter has been studied theoretically for the ionospheric layer over the Middle East zone. The datasets of the FOT parameter have been generated using the VOACAP model, which is considered one of the recommended modern international communication models used to calculate ionospheric parameters. The calculations have been made for the connection links between the capital Baghdad and many other locations distributed at different distances and in different directions over the Middle East region. The years (2011-2013) of solar cycle 24 have been adopted to study the influence of the distance factor on the FOT parameter.
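VOACAP itself is not reproduced here; as a small supporting computation, the distance of each link, i.e. the distance factor studied above, can be obtained from the great-circle (haversine) distance between the endpoints. The coordinates and the two sample endpoints below are approximate, illustrative choices, not the paper's site list:

from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    # Great-circle distance between two points on a sphere of radius r (km).
    p1, p2 = radians(lat1), radians(lat2)
    dp, dl = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dp / 2) ** 2 + cos(p1) * cos(p2) * sin(dl / 2) ** 2
    return 2 * r * asin(sqrt(a))

baghdad = (33.31, 44.37)                                      # approximate
sites = {"Tehran": (35.69, 51.39), "Cairo": (30.04, 31.24)}   # illustrative
for name, (lat, lon) in sites.items():
    print(name, round(haversine_km(*baghdad, lat, lon)), "km")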