Association rule mining (ARM) is a fundamental and widely used data mining technique for extracting useful information from data. Traditional ARM algorithms suffer from degraded computational efficiency because they mine large numbers of association rules that are not appropriate for a given user. Recent ARM research investigates metaheuristic algorithms that search for only a subset of high-quality rules. In this paper, a modified discrete cuckoo search algorithm for association rule mining (DCS-ARM) is proposed for this purpose. The effectiveness of our algorithm is tested against a set of well-known transactional databases. Results indicate that the proposed algorithm outperforms existing metaheuristic methods.
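The rule-quality measures that any ARM method, metaheuristic or traditional, must evaluate can be sketched as follows. This is a minimal illustration of support and confidence over a made-up toy database, not the DCS-ARM algorithm itself.

```python
# Toy transactional database (hypothetical items, for illustration only).
transactions = [
    {"bread", "milk"},
    {"bread", "butter", "milk"},
    {"butter", "milk"},
    {"bread", "butter"},
]

def support(itemset, db):
    """Fraction of transactions containing every item in itemset."""
    return sum(itemset <= t for t in db) / len(db)

def confidence(antecedent, consequent, db):
    """Estimated P(consequent | antecedent) over the database."""
    return support(antecedent | consequent, db) / support(antecedent, db)

print(support({"bread", "milk"}, transactions))        # 0.5
print(confidence({"bread"}, {"milk"}, transactions))   # ~0.667
```

A metaheuristic such as cuckoo search explores the space of candidate rules and scores each one with measures like these, rather than enumerating all frequent itemsets exhaustively.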
In this paper, an algorithm for the reconstruction of completely lost blocks using a Modified Hybrid Transform is presented. The algorithms examined in this paper require neither a DC estimation method nor interpolation. The reconstruction is achieved using matrix manipulation based on the Modified Hybrid Transform. This paper also adopts a smart matrix (Detection Matrix) to detect the missing blocks for the purpose of rebuilding them. We further assess the performance of the Modified Hybrid Transform in the lost-block reconstruction application. The paper also discusses the effect of using the multiwavelet and 3D Radon transforms in lost-block reconstruction.
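The detection-matrix idea above can be illustrated with a simple sketch: a binary mask that flags blocks whose samples were all lost. The 8x8 block size and the modelling of lost pixels as zeros are illustrative assumptions, not details from the paper.

```python
def detection_matrix(image, block=8):
    """Return a binary mask with 1 where an entire block is lost (all zeros)."""
    rows = len(image) // block
    cols = len(image[0]) // block
    mask = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            pixels = [image[r * block + i][c * block + j]
                      for i in range(block) for j in range(block)]
            if all(p == 0 for p in pixels):
                mask[r][c] = 1   # mark block as missing
    return mask

# 16x16 test image with one lost (all-zero) 8x8 block at the top-left.
img = [[0] * 8 + [100] * 8 if r < 8 else [100] * 16 for r in range(16)]
print(detection_matrix(img))  # [[1, 0], [0, 0]]
```

A reconstruction stage would then visit only the blocks flagged in the mask.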
The research deals with Environmental Management and how to develop its programs using Knowledge Management. Environmental programs that are integrated with business processes can add strategic value by improving resource-utilization rates and efficiency, reducing waste, applying risk management, cutting costs, avoiding fines, and reducing insurance costs. All of these activities and processes can be improved through Knowledge Management: making optimal use of an organization's information, employing it where it delivers high value, and sharing it among all members involved in modifying the organization's strategy. Choosing a suitable environmental management information system, developing it, and aligning it with organizational processes can greatly serve the en
Steganography is a technique for concealing secret data within other everyday files of the same or a different type. Hiding data has become essential to digital information security. This work aims to design a stego method that can effectively hide a message inside the images of a video file. A video steganography model is proposed in which a model is trained to hide a video (or images) within another video using convolutional neural networks (CNNs). Using a CNN in this approach achieves two main goals of any steganographic method: increased security (difficulty of being observed and broken by steganalysis programs), which this work achieves because the weights and architecture are randomized. Thus,
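For contrast with the learned CNN embedding described above, the classic least-significant-bit (LSB) scheme is the usual hand-crafted baseline. The sketch below shows plain LSB embedding on a flat list of 8-bit pixel values; it is not the paper's CNN method, only the baseline such methods are measured against.

```python
def embed_bits(pixels, bits):
    """Hide a bit string in the least-significant bits of 8-bit pixels."""
    out = list(pixels)
    for i, b in enumerate(bits):
        out[i] = (out[i] & ~1) | int(b)   # overwrite only the lowest bit
    return out

def extract_bits(pixels, n):
    """Recover the first n hidden bits."""
    return "".join(str(p & 1) for p in pixels[:n])

cover = [120, 33, 200, 7, 90, 14, 255, 64]   # hypothetical cover pixels
stego = embed_bits(cover, "1011")
print(extract_bits(stego, 4))  # "1011"
```

Each pixel changes by at most 1, which is visually imperceptible but, unlike a CNN embedding, leaves a fixed statistical signature that steganalysis tools exploit.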
This work presents the modeling of the electrical response of a monocrystalline photovoltaic module using a five-parameter model based on the manufacturer's datasheet for a solar module measured at standard test conditions (STC): irradiance of 1000 W/m² and cell temperature of 25 °C. The model takes into account the series and parallel (shunt) resistances of the module. This paper details the Matlab modeling of the solar module in a developed Simulink model using the basic equations. The first approach was to estimate the parameters photocurrent Iph, saturation current Is, shunt resistance Rsh, series resistance Rs, and ideality factor A at standard test conditions (STC) by an ite
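The five-parameter single-diode equation underlying such models is implicit in the current, so it is usually solved iteratively. The sketch below uses a simple fixed-point iteration; all parameter values are illustrative placeholders, not the paper's estimated module parameters.

```python
import math

def module_current(V, Iph=8.21, Is=1e-9, Rs=0.2, Rsh=300.0,
                   A=1.3, Ns=60, T=298.15):
    """Solve I = Iph - Is*(exp((V + I*Rs)/(A*Ns*Vt)) - 1) - (V + I*Rs)/Rsh
    by fixed-point iteration. Hypothetical parameters for a 60-cell module."""
    Vt = 1.380649e-23 * T / 1.602176634e-19   # thermal voltage kT/q
    I = Iph                                    # initial guess
    for _ in range(200):
        I_new = (Iph
                 - Is * math.expm1((V + I * Rs) / (A * Ns * Vt))
                 - (V + I * Rs) / Rsh)
        if abs(I_new - I) < 1e-9:
            break
        I = I_new
    return I

print(module_current(0.0))   # near Iph at short circuit
print(module_current(30.0))  # lower current at higher voltage
```

Near short circuit the diode term is negligible and the current is close to Iph; as V rises toward open circuit the exponential diode term dominates and the current falls, reproducing the familiar I-V curve.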
This research addresses the practical side by taking case studies of construction projects across the various Iraqi governorates. It includes a field survey to identify the impact of parametric costs on construction projects, to compare them with what was reached during the analysis, and to assess their validity and accuracy; the approach of personal interviews is also adopted to learn the actual state of construction projects. After comparing the field data with its measurement in construction projects for the public and private sectors, the results showed that the correlation between the expected and actual cost change was 97.8%, which means the data can be adopted in the re
The performance of a gas-solid spouted bed benefits from a uniform solids structure, quantified by a uniformity index (UI). The focus of this work is therefore to maximize UI across the bed based on process variables; hence UI is considered the objective of the optimization process. Three selected process variables affect the objective function; these decision variables are gas velocity, particle density, and particle diameter. Steady-state solids concentration measurements were carried out in a narrow 3-inch cylindrical spouted bed made of Plexiglas with a 60° conical base. The radial concentration of particles (glass and steel beads) at various bed heights and different flow patterns was measured using sophisticated optical probes. Stochastic Genetic
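A genetic-algorithm search over the three decision variables named above can be sketched as follows. The objective below is a made-up smooth surrogate for UI with an interior optimum, not the paper's experimentally fitted model, and the variable bounds are illustrative.

```python
import random

random.seed(0)
# Bounds for gas velocity [m/s], particle density [kg/m^3], diameter [m]
# (illustrative ranges, not from the paper).
BOUNDS = [(0.5, 2.0), (1000.0, 8000.0), (0.5e-3, 3e-3)]

def ui(x):
    """Hypothetical surrogate uniformity index, maximal at an interior point."""
    u, rho, dp = x
    return (1.0 - (u - 1.2) ** 2
                - ((rho - 4000) / 7000) ** 2
                - ((dp - 1.5e-3) / 2.5e-3) ** 2)

def random_ind():
    return [random.uniform(lo, hi) for lo, hi in BOUNDS]

def mutate(x):
    # Gaussian perturbation, clipped to the bounds.
    return [min(hi, max(lo, xi + random.gauss(0, 0.05 * (hi - lo))))
            for xi, (lo, hi) in zip(x, BOUNDS)]

pop = [random_ind() for _ in range(30)]
for _ in range(60):                # generations
    pop.sort(key=ui, reverse=True)
    parents = pop[:10]             # truncation selection (elitist)
    pop = parents + [mutate(random.choice(parents)) for _ in range(20)]
best = max(pop, key=ui)
```

Because the parents are carried over unchanged, the best UI never degrades between generations, and the population converges toward the surrogate's optimum.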
In this research, several estimators are introduced. These estimators are closely related to the hazard function obtained using a nonparametric method, namely the kernel function, for censored data with varying bandwidth and boundary kernels. Two types of bandwidth are used: local bandwidth and global bandwidth. Moreover, four types of boundary kernel are used, namely Rectangle, Epanechnikov, Biquadratic, and Triquadratic, and the proposed function was employed with all kernel functions. Two different simulation techniques are used in two experiments to compare these estimators. In most cases, the results show that the local bandwidth is the best for all the
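As a minimal illustration of the kernel machinery named above, the sketch below evaluates the Epanechnikov kernel and a plain kernel density estimate with a single global bandwidth h. The boundary-corrected and local-bandwidth variants discussed in the text, and the hazard-function construction for censored data, are not reproduced here.

```python
def epanechnikov(u):
    """Epanechnikov kernel: 0.75*(1 - u^2) on [-1, 1], zero elsewhere."""
    return 0.75 * (1 - u * u) if abs(u) <= 1 else 0.0

def kde(x, data, h):
    """Kernel density estimate at x with global bandwidth h."""
    return sum(epanechnikov((x - xi) / h) for xi in data) / (len(data) * h)

sample = [1.0, 1.2, 1.5, 2.0, 2.2]   # made-up observations
print(round(kde(1.5, sample, 1.0), 3))  # 0.588
```

A local-bandwidth estimator replaces the constant h with a value h(x) that adapts to the data density near x, which is what the comparison in the text evaluates.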
A database is characterized as an arrangement of data that is organized and distributed in a way that allows the client to access the stored data simply and conveniently. However, in the era of big data, traditional data-analytics methods may not be able to manage and process such large amounts of data. In order to develop an efficient way of handling big data, this work studies the use of the MapReduce technique to handle big data distributed on the cloud. The approach was evaluated using a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear improvement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r
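The MapReduce pattern referred to above can be sketched in a few lines: map emits (key, value) pairs, the shuffle groups them by key, and reduce aggregates each group. In a real deployment such as the Hadoop setup described, these phases run across cluster nodes; the EEG-like records below are made up for illustration.

```python
from collections import defaultdict

# Hypothetical (channel, amplitude) records standing in for EEG samples.
records = [("ch1", 4.2), ("ch2", 1.1), ("ch1", 3.8), ("ch2", 0.9)]

def map_phase(record):
    channel, amplitude = record
    yield channel, amplitude          # emit (key, value)

def shuffle(pairs):
    groups = defaultdict(list)        # group values by key
    for k, v in pairs:
        groups[k].append(v)
    return groups

def reduce_phase(key, values):
    return key, sum(values) / len(values)   # mean amplitude per channel

pairs = [p for r in records for p in map_phase(r)]
result = dict(reduce_phase(k, vs) for k, vs in shuffle(pairs).items())
print(result)  # {'ch1': 4.0, 'ch2': 1.0}
```

Because map and reduce operate independently per record and per key, Hadoop can parallelize both phases across the cluster, which is the source of the response-time reduction reported above.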