Big data analysis has important applications in many areas, such as sensor networks and connected healthcare. The high volume and velocity of big data bring many challenges to data analysis. One possible solution is to summarize the data and provide a manageable data structure that holds a scalable summarization for efficient and effective analysis. This research extends our previous work on developing an effective technique to create, organize, access, and maintain summarizations of big data, and develops algorithms for Bayes classification and entropy discretization of large data sets using the multi-resolution data summarization structure. Bayes classification and data discretization play essential roles in many learning algorithms, such as decision trees and nearest neighbor search. The proposed method can handle streaming data efficiently and, for entropy discretization, provides the optimal split value.
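As an illustration of the entropy-discretization step only (the paper's multi-resolution summarization structure is not specified here, so this is a minimal sketch of the generic idea, not the authors' method), the following Python scans candidate cut points and keeps the one that minimizes the weighted class entropy:

import numpy as np

def entropy(labels):
    # Shannon entropy of a label array
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def best_entropy_split(values, labels):
    # Exhaustive scan over midpoints between consecutive distinct values;
    # returns the cut that minimizes the weighted entropy of the two parts.
    order = np.argsort(values)
    values, labels = values[order], labels[order]
    best_cut, best_score = None, np.inf
    for i in range(1, len(values)):
        if values[i] == values[i - 1]:
            continue  # no boundary between equal values
        cut = (values[i] + values[i - 1]) / 2.0
        left, right = labels[:i], labels[i:]
        score = (len(left) * entropy(left) + len(right) * entropy(right)) / len(labels)
        if score < best_score:
            best_cut, best_score = cut, score
    return best_cut, best_score

# Toy example: the best cut separates the two classes at 5.5
x = np.array([1.0, 2.0, 3.0, 8.0, 9.0, 10.0])
y = np.array([0, 0, 0, 1, 1, 1])
print(best_entropy_split(x, y))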
Four electrodes based on molecularly imprinted polymers (MIPs) were synthesized. Two MIPs were prepared using diclofenac sodium (DFS) as the template, 2-hydroxyethyl methacrylate (2-HEMA) and 2-vinylpyridine (2-VP) as monomers, and divinylbenzene and benzoyl peroxide as the cross-linker and initiator, respectively. The same compositions were used to prepare non-imprinted polymers (NIPs), but without the template (diclofenac sodium). To prepare the membrane electrodes, different plasticizers were used in a PVC matrix: tris(2-ethylhexyl) phosphate (TEHP), tributyl phosphate (TBP), bis(2-ethylhexyl) adipate (BEHA), and tritolyl phosphate (TTP). The characteristics studied were the slope, detection limit, lifetime, and linearity range of the DFS electrodes.
The Electro-Fenton oxidation process is one of the essential advanced electrochemical oxidation processes used to treat phenol and its derivatives in wastewater. The Electro-Fenton oxidation was carried out at ambient temperature at different current densities (2, 4, 6, and 8 mA/cm2) for up to 6 h. Sodium sulfate at a concentration of 0.05 M was used as the supporting electrolyte, and 0.4 mM of ferrous ions (Fe2+) was used as the catalyst. The electrolytic cell consists of a graphite anode modified by an electrodeposited layer of PbO2 on its surface and a cathode of carbon fiber modified with graphene. The results indicated that the phenol concentration decreases with an increase in current density.
The nucleon momentum distributions (NMD) and elastic electron scattering form factors of the ground state for some 1f-2p-shell nuclei, such as the 58Ni, 60Ni, 62Ni, and 64Ni isotopes, have been calculated in the framework of the coherent fluctuation model (CFM) and expressed in terms of the weight function |f(x)|^2. The weight function (fluctuation function) has been related to the nucleon density distribution (NDD) of the nuclei and determined from theory and experiment. The NDD is derived from a simple method based on the single-particle wave functions of the harmonic oscillator potential and the occupation numbers of the states. The feature of the l
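For reference, the weight function in the CFM is commonly related to the nucleon density distribution as follows (this is the standard relation found in the CFM literature and is quoted here as background; the paper's exact conventions and normalization may differ):

\[
|f(x)|^{2} = -\left.\frac{1}{\rho_{0}(x)}\,\frac{d\rho(r)}{dr}\right|_{r=x},
\qquad \rho_{0}(x) = \frac{3A}{4\pi x^{3}},
\qquad \int_{0}^{\infty} |f(x)|^{2}\,dx = 1,
\]

where A is the mass number and \rho(r) is the NDD.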
Circuits that can display the decimal representation of a binary number are often needed, in this paper specifically on a 7-segment display. A circuit that displays the decimal equivalent of an n-bit binary number is designed, and its behavior is described using the Verilog Hardware Description Language (HDL). This HDL program is then used to configure an FPGA to implement the designed circuit.
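The abstract does not detail the decoder's internal design, so as a hedged illustration only (a Python model of one common approach, not the authors' Verilog), the shift-and-add-3 (double-dabble) algorithm below converts an n-bit binary value into the BCD digits that would drive the 7-segment displays:

def binary_to_bcd(value, n_bits):
    # Shift-and-add-3 (double-dabble) conversion of an n-bit binary
    # number into a list of decimal (BCD) digits, most significant first.
    n_digits = (n_bits + 2) // 3 + 1          # enough BCD digits to hold the result
    bcd = 0
    for i in range(n_bits - 1, -1, -1):
        # add 3 to any BCD digit that is >= 5 before shifting
        for d in range(n_digits):
            digit = (bcd >> (4 * d)) & 0xF
            if digit >= 5:
                bcd += 3 << (4 * d)
        # shift in the next binary bit
        bcd = (bcd << 1) | ((value >> i) & 1)
    return [(bcd >> (4 * d)) & 0xF for d in range(n_digits - 1, -1, -1)]

# 8-bit example: 0b11111111 = 255 -> digits [0, 2, 5, 5]
print(binary_to_bcd(0b11111111, 8))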
The research aims to identify the effect of using the Roundhouse strategy on the achievement of fourth-grade students in the computer subject and their attitudes towards it. The research sample consisted of (61) fourth-grade secondary school students distributed into an experimental group of (31) students who studied the computer subject according to the Roundhouse strategy and a control group of (30) students who followed the traditional method. The researcher designed an achievement test consisting of (30) multiple-choice items. To measure the students' attitudes towards the computer, the researcher designed a questionnaire of (32) items with three alternatives. The results showed that there is a statistically significant difference
Compressing speech reduces data storage requirements, which in turn reduces the time needed to transmit digitized speech over long-haul links such as the internet. To obtain the best performance in speech compression, wavelet transforms require filters that combine a number of desirable properties, such as orthogonality and symmetry. The MCT basis functions are derived from the GHM basis functions using 2D linear convolution. The fast computation algorithms introduced here add desirable features to the current transform. We further assess the performance of the MCT in a speech compression application. This paper discusses the effect of using the DWT and the MCT (one- and two-dimensional) on speech compression. DWT and MCT performances in terms of compression
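As a generic illustration of transform-domain compression (a plain DWT thresholding sketch with PyWavelets, not the paper's MCT; the wavelet, level, and kept-coefficient fraction are arbitrary choices):

import numpy as np
import pywt  # PyWavelets

def dwt_compress(signal, wavelet="db4", level=4, keep=0.10):
    # Decompose, keep only the largest `keep` fraction of coefficients
    # by magnitude, zero the rest, then reconstruct.
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    flat = np.concatenate([np.abs(c) for c in coeffs])
    thresh = np.quantile(flat, 1.0 - keep)          # magnitude cutoff
    compressed = [pywt.threshold(c, thresh, mode="hard") for c in coeffs]
    rec = pywt.waverec(compressed, wavelet)[: len(signal)]
    retained = sum(int(np.count_nonzero(c)) for c in compressed)
    cr = flat.size / max(retained, 1)               # compression ratio
    snr = 10 * np.log10(np.sum(signal**2) / np.sum((signal - rec) ** 2))
    return rec, cr, snr

# Toy "speech" frame: a decaying two-tone burst sampled at 8 kHz
t = np.arange(2048) / 8000.0
frame = np.exp(-8 * t) * (np.sin(2 * np.pi * 300 * t) + 0.5 * np.sin(2 * np.pi * 1200 * t))
_, cr, snr = dwt_compress(frame)
print(f"compression ratio ~{cr:.1f}, SNR ~{snr:.1f} dB")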
The penalized least squares method is a popular approach for dealing with high-dimensional data, where the number of explanatory variables is larger than the sample size. Its desirable properties are high prediction accuracy and the ability to perform estimation and variable selection at once. The penalized least squares method gives a sparse model, that is, a model with few variables, which can be interpreted easily. However, penalized least squares is not robust, meaning it is very sensitive to the presence of outlying observations. To deal with this problem, we can use a robust loss function to obtain a robust penalized least squares method and a robust penalized estimator and
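A minimal Python sketch of this idea, using scikit-learn's SGDRegressor with a Huber loss and an L1 penalty as one possible robust penalized estimator (the paper's specific estimator may differ; the simulated data, outlier pattern, and tuning constants are illustrative only):

import numpy as np
from sklearn.linear_model import Lasso, SGDRegressor

rng = np.random.default_rng(0)
n, p = 100, 200                       # p > n: high-dimensional setting
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:5] = 3.0                        # sparse true coefficients
y = X @ beta + rng.normal(scale=0.5, size=n)
y[:5] += 30.0                         # a few outlying observations

# Ordinary penalized least squares: squared loss with an L1 penalty
lasso = Lasso(alpha=0.1).fit(X, y)

# Robust penalized estimator: Huber loss with an L1 penalty
robust = SGDRegressor(loss="huber", epsilon=1.35, penalty="l1",
                      alpha=0.1, max_iter=5000, tol=1e-4,
                      random_state=0).fit(X, y)

print("nonzero coefficients (lasso): ", np.sum(lasso.coef_ != 0))
print("nonzero coefficients (robust):", np.sum(robust.coef_ != 0))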
Diabetes is one of the increasingly common chronic diseases, affecting millions of people around the world. Diabetes diagnosis, prediction, proper treatment, and management are essential. Machine-learning-based prediction techniques for diabetes data analysis can help in the early detection and prediction of the disease and of its consequences, such as hypo-/hyperglycemia. In this paper, we explored a diabetes dataset collected from the medical records of one thousand Iraqi patients. We applied three classifiers: the multilayer perceptron, KNN, and Random Forest. We ran two experiments: the first used all 12 features of the dataset, where Random Forest outperformed the others with 98.8% accuracy; the second used only five attributes
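A hedged scikit-learn sketch of such a three-classifier comparison; the file name diabetes_iraq.csv, the CLASS label column, the assumption of numeric features, and the hyperparameters are placeholders, not the paper's actual setup:

import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# "diabetes_iraq.csv" and the "CLASS" column are placeholder names.
df = pd.read_csv("diabetes_iraq.csv")
X, y = df.drop(columns=["CLASS"]), df["CLASS"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2,
                                          stratify=y, random_state=42)

models = {
    "MLP": make_pipeline(StandardScaler(), MLPClassifier(max_iter=1000, random_state=42)),
    "KNN": make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5)),
    "Random Forest": RandomForestClassifier(n_estimators=200, random_state=42),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(name, accuracy_score(y_te, model.predict(X_te)))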
Eye detection is used in many applications such as pattern recognition, biometrics, surveillance systems, and many other systems. In this paper, a new method is presented to detect and extract the overall shape of one eye from an image, based on two principles: Helmholtz and Gestalt. According to the Helmholtz principle of perception, any observed geometric shape is perceptually "meaningful" if its number of repetitions in an image with a random distribution is very small. Complementing this, the Gestalt principle states that humans perceive things either by grouping similar elements or by recognizing patterns. In general, according to the Gestalt principle, humans see things through genera
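One standard quantitative form of this Helmholtz criterion is the a contrario number-of-false-alarms test, quoted here only as a hedged illustration since the paper's exact formulation is not shown:

\[
\mathrm{NFA}(e) = N_{\text{tests}} \cdot \Pr\big[e \ \text{under the random model}\big],
\qquad e \ \text{is}\ \varepsilon\text{-meaningful} \iff \mathrm{NFA}(e) \le \varepsilon,
\]

so a shape is accepted only when its expected number of accidental occurrences in a random image is very small.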
As we live in the era of the fourth technological revolution, it has become necessary to use artificial intelligence to generate electric power from sustainable solar energy, especially in Iraq, which has gone through crises and suffers from a severe shortage of electric power because of the wars and calamities it experienced. The impact of that period is still evident in all aspects of the daily life of Iraqis because of the remnants of wars, siege, terrorism, the wrong policies of earlier and later ruling governments, and regional interventions and their consequences, such as the destruction of electric power stations, as well as the population increase, which must be matched by an increase in electric power stations,