Big data analysis has important applications in many areas, such as sensor networks and connected healthcare. The high volume and velocity of big data bring many challenges to data analysis. One possible solution is to summarize the data and provide a manageable data structure that holds a scalable summarization for efficient and effective analysis. This research extends our previous work on developing an effective technique to create, organize, access, and maintain summarizations of big data, and develops algorithms for Bayes classification and entropy discretization of large data sets using the multi-resolution data summarization structure. Bayes classification and data discretization play essential roles in many learning algorithms, such as decision trees and nearest neighbor search. The proposed method can handle streaming data efficiently and, for entropy discretization, provide the optimal split value.
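As an illustration of the entropy discretization step, the sketch below searches candidate boundaries for the binary cut that minimizes the weighted child entropy. The brute-force search over raw values and all names are illustrative; the method described above operates on the multi-resolution summary rather than the raw data.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label sequence."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def best_entropy_split(values, labels):
    """Return the cut point minimizing the weighted child entropy,
    trying each boundary between distinct sorted values."""
    pairs = sorted(zip(values, labels))
    n = len(pairs)
    best_cut, best_score = None, float("inf")
    for i in range(1, n):
        if pairs[i - 1][0] == pairs[i][0]:
            continue  # a cut is only possible between distinct values
        left = [lab for _, lab in pairs[:i]]
        right = [lab for _, lab in pairs[i:]]
        score = (len(left) * entropy(left) + len(right) * entropy(right)) / n
        if score < best_score:
            best_score = score
            best_cut = (pairs[i - 1][0] + pairs[i][0]) / 2
    return best_cut
```

For example, for values 1, 2, 3, 10, 11, 12 with class labels 0, 0, 0, 1, 1, 1, the minimizing cut lies midway between 3 and 10.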
In this study, SnO2 nanoparticles were prepared from low-cost tin chloride (SnCl2·2H2O) and ethanol by adding ammonia solution, using the sol-gel method, one of the lowest-cost and simplest techniques. The SnO2 nanoparticles were dried in a drying oven at 70°C for 7 hours and then calcined in an oven at 200°C for 24 hours. The structural, material, morphological, and optical properties of the synthesized SnO2 nanoparticles were studied using X-ray diffraction. The Scherrer expression was used to compute nanoparticle sizes from the X-ray diffraction data, and the results needed to be scrutinized more closely. The micro-strain indicates the broadening of diffraction peaks for nano
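The Scherrer computation mentioned above can be sketched as follows, computing the crystallite size D = Kλ / (β cos θ) from a diffraction peak. The wavelength (Cu Kα) and shape factor K = 0.9 are typical assumed values, not taken from this study.

```python
import math

def scherrer_size(two_theta_deg, fwhm_deg, wavelength_nm=0.15406, k=0.9):
    """Crystallite size from the Scherrer equation D = K*lambda / (beta*cos(theta)).
    two_theta_deg: peak position 2-theta in degrees.
    fwhm_deg: peak full width at half maximum (beta) in degrees.
    Returns the size in the same units as the wavelength (nm here)."""
    theta = math.radians(two_theta_deg / 2)   # Bragg angle theta, in radians
    beta = math.radians(fwhm_deg)             # FWHM must be in radians
    return k * wavelength_nm / (beta * math.cos(theta))
```

For instance, a peak at 2θ ≈ 26.6° with a FWHM of 0.8° gives a crystallite size of roughly 10 nm.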
The Tau-P linear noise attenuation filter (TPLNA) was applied to the 3D seismic data of the Al-Samawah area, southwest of Iraq, with the aim of attenuating linear noise. TPLNA transforms the data from the time domain to the tau-p domain in order to increase the signal-to-noise ratio. Applying TPLNA produced very good results, considering that 3D data usually have a large amount of linear noise from different sources and in different azimuths and directions. This processing is very important for later interpretation, since the signal was covered by different kinds of noise, of which linear noise takes a large part.
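A naive forward tau-p transform (slant stack) can be sketched as below: for each trial slowness p, each trace is shifted by p·x and the traces are summed, so a linear event aligns at its own slowness. This is only an illustration with nearest-sample shifts; production TPLNA implementations use finer interpolation and an inverse transform after muting the linear events.

```python
import numpy as np

def slant_stack(data, dt, offsets, slownesses):
    """Naive tau-p (slant stack) of a gather.
    data: (n_samples, n_traces) array, sample interval dt (s).
    offsets: trace offsets x (m); slownesses: trial p values (s/m).
    Returns an (n_samples, n_slownesses) tau-p panel."""
    n_samples, n_traces = data.shape
    out = np.zeros((n_samples, len(slownesses)))
    for j, p in enumerate(slownesses):
        for i, x in enumerate(offsets):
            shift = int(round(p * x / dt))  # nearest-sample time shift p*x
            if 0 <= shift < n_samples:
                out[:n_samples - shift, j] += data[shift:, i]
            elif shift < 0 and -shift < n_samples:
                out[-shift:, j] += data[:n_samples + shift, i]
    return out
```

A spike event with moveout p across the gather stacks coherently in the column for that p and spreads out elsewhere, which is what makes muting linear noise in the tau-p domain effective.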
Tectonically, the Al-Ma'aniyah depression area is located far from active boundary zones, so its tectonic features should reflect the original depositional environments, with some horizontal movement due to rearrangement of the basement blocks during different active orogenic movements. Therefore, analysis of aeromagnetic data was used to estimate the thickness and structural pattern of the sedimentary cover sequences of this area. The aeromagnetic data, obtained from the Iraqi GEOSURV for the Al-Ma′aniyah region, were analyzed and processed for qualitative and quantitative interpretations. The processing includes reduction of the aeromagnetic data to the pole (RTP) and separation of the aeromagnetic data into regional an
Our research is related to the projective line over a finite field. The main purpose of this paper is to classify the sets of size K on the projective line PG(1,31), where K = 3,…,7; the number of inequivalent K-sets, together with their stabilizer groups, is computed using the GAP program.
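PG(1, q) has q + 1 points, so PG(1, 31) has 32 points. A minimal sketch enumerating them in normalized homogeneous coordinates is shown below; the actual classification of K-sets and their stabilizer groups described above was carried out in GAP, not here.

```python
def projective_line_points(q):
    """Points of PG(1, q) as normalized homogeneous coordinate pairs:
    (x : 1) for each x in GF(q), plus the point at infinity (1 : 0).
    Valid as written when q is prime, so GF(q) is just the integers mod q."""
    return [(x, 1) for x in range(q)] + [(1, 0)]

points = projective_line_points(31)  # the 32 points of PG(1, 31)
```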
In this research, a group of gray texture images from the Brodatz database was studied by building a feature database of the images using the gray level co-occurrence matrix (GLCM), where the distance between pixels was one unit, for four angles (0, 45, 90, 135). The k-means classifier was used to classify the images into groups of classes, ranging from two to eight classes, for all angles used in the co-occurrence matrix. The distribution of the images over the classes was examined by comparing every two methods (projecting one class onto another), where the distribution of images was uneven, with one class being dominant. The classification results were studied for all cases using the confusion matrix between every
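A minimal sketch of the GLCM construction described above, with the contrast feature as one example, is given below. The offset convention (0° → (dx, dy) = (1, 0), 45° → (1, −1), 90° → (0, −1), 135° → (−1, −1)) and the single feature shown are illustrative; the study's full feature database is not reproduced here.

```python
import numpy as np

def glcm(image, dx, dy, levels):
    """Gray level co-occurrence matrix for offset (dx, dy):
    counts co-occurring pairs image[y, x] -> image[y+dy, x+dx],
    then normalizes to joint probabilities."""
    h, w = image.shape
    m = np.zeros((levels, levels))
    for y in range(max(0, -dy), min(h, h - dy)):
        for x in range(max(0, -dx), min(w, w - dx)):
            m[image[y, x], image[y + dy, x + dx]] += 1
    return m / m.sum()

def contrast(p):
    """GLCM contrast feature: sum over (i, j) of p(i, j) * (i - j)^2."""
    i, j = np.indices(p.shape)
    return float((p * (i - j) ** 2).sum())
```

Features such as contrast, energy, homogeneity, and correlation computed from GLCMs at the four angles would then form the feature vectors fed to the k-means classifier.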
Medical images play a crucial role in the classification of various diseases and conditions. One imaging modality is X-rays, which provide valuable visual information that helps in the identification and characterization of various medical conditions. Chest radiograph (CXR) images have long been used to examine and monitor numerous lung disorders, such as tuberculosis, pneumonia, atelectasis, and hernia. COVID-19 detection can be accomplished using CXR images as well. COVID-19, a virus that causes infections in the lungs and the airways of the upper respiratory tract, was first discovered in 2019 in Wuhan, China, and has since been thought to cause substantial airway damage, badly impacting the lungs of affected persons.
In the present work, different remote sensing techniques have been used to analyze remote sensing data spectrally using ENVI software. The majority of algorithms used in spectral processing can be organized into target detection, change detection, and classification. In this paper, several methods of target detection have been studied, such as the matched filter and constrained energy minimization.
The water body mapping has been obtained, and the results showed changes in the study area over the period 1995-2000. The results obtained from applying constrained energy minimization were also more accurate than those of the other method when compared with the real situation.
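The constrained energy minimization detector mentioned above computes a filter w = R⁻¹d / (dᵀR⁻¹d), where d is the target spectrum and R the sample correlation matrix of the scene, so that the output is exactly 1 on the target spectrum while the average output energy is minimized. A hedged sketch (variable names are illustrative, and R is the simple sample correlation matrix):

```python
import numpy as np

def cem_filter(pixels, target):
    """Constrained energy minimization filter.
    pixels: (bands, n_pixels) array of scene spectra.
    target: (bands,) target signature d.
    Returns w = R^-1 d / (d' R^-1 d); the detector output for a
    pixel x is then w @ x, which equals 1 exactly at x = d."""
    R = pixels @ pixels.T / pixels.shape[1]   # sample correlation matrix
    Rinv = np.linalg.inv(R)
    return Rinv @ target / (target @ Rinv @ target)
```

Scoring every pixel with `w @ pixel` and thresholding the result yields the target (e.g., water body) map.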
Data compression offers an attractive approach to reducing communication costs by using available bandwidth effectively, so it makes sense to pursue research on developing algorithms that can use the available network most effectively. It is also important to consider the security aspect, since the data being transmitted is vulnerable to attacks. The basic aim of this work is to develop a module that combines compression and encryption on the same set of data, performing these two operations simultaneously. This is achieved by embedding encryption into compression algorithms, since both cryptographic ciphers and entropy coders bear a certain resemblance in the sense of secrecy. First, in the secure compression module, the given text is p
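A hedged sketch of pairing the two operations: compress first (ciphertext is incompressible, so the order matters), then apply a stream cipher. The SHA-256 counter-mode keystream below is a toy for illustration only and is not the module developed in this work; a vetted authenticated cipher (e.g., AES-GCM) should be used in practice.

```python
import zlib
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    """Toy keystream: SHA-256 of key || counter, in counter mode.
    Illustration only -- not a vetted cipher."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def compress_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """Compress, then XOR with the keystream."""
    compressed = zlib.compress(plaintext)
    ks = keystream(key, len(compressed))
    return bytes(a ^ b for a, b in zip(compressed, ks))

def decrypt_decompress(ciphertext: bytes, key: bytes) -> bytes:
    """Reverse: XOR with the same keystream, then decompress."""
    ks = keystream(key, len(ciphertext))
    return zlib.decompress(bytes(a ^ b for a, b in zip(ciphertext, ks)))
```

The round trip recovers the original text, while the transmitted payload is both smaller than the plaintext (for compressible data) and unreadable without the key.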
In this study, we briefly review the ARIMA(p, d, q), EWMA, and DLM (dynamic linear modelling) procedures in order to accommodate the autocorrelation structure of the data. We consider recursive estimation and prediction algorithms based on Bayes and KF (Kalman filtering) techniques for correlated observations. We investigate the effect on the MSE of these procedures and compare them using generated data.
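The recursive Kalman filtering step can be sketched for the simplest DLM, the local level model y_t = μ_t + v_t, μ_t = μ_{t−1} + w_t. Parameter names are illustrative; in the steady state the gain becomes constant, at which point the filter's update coincides with an EWMA of the observations.

```python
def kalman_local_level(ys, q, r, m0=0.0, c0=1e6):
    """Scalar Kalman filter for the local level DLM:
    y_t = mu_t + v_t, v_t ~ N(0, r);  mu_t = mu_{t-1} + w_t, w_t ~ N(0, q).
    Returns the filtered means E[mu_t | y_1..y_t]."""
    m, c = m0, c0                  # prior mean and variance
    filtered = []
    for y in ys:
        c_pred = c + q             # predict: variance grows by q
        k = c_pred / (c_pred + r)  # Kalman gain
        m = m + k * (y - m)        # update mean with the innovation
        c = (1 - k) * c_pred       # update variance
        filtered.append(m)
    return filtered
```

With a diffuse prior (large c0) the first update essentially adopts the first observation; thereafter the filtered mean tracks the level with an exponentially weighted memory of past data, which is the link to the EWMA procedure compared in the study.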