Visual analytics has become an important approach for discovering patterns in big data. Visualization already struggles with the high dimensionality of data, and issues such as concept hierarchies on each dimension add further difficulty, making visualization a prohibitive task. A data cube offers multi-perspective aggregated views of large data sets and has important applications in business and many other areas. It has high dimensionality, concept hierarchies, a vast number of cells, and special exploration operations such as roll-up, drill-down, slicing and dicing. All of these issues make data cubes very difficult to explore visually. Most existing approaches visualize a data cube in 2D space and require preprocessing steps. In this paper, we propose a visualization technique for visual analytics of data cubes using parallel coordinates. The proposed technique extends parallel coordinates to a 3D space to reflect concept hierarchy …
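For orientation only, the sketch below draws a standard 2D parallel-coordinates plot with pandas and matplotlib; it is the conventional baseline that the proposed 3D, hierarchy-aware extension builds on, not the paper's technique, and the column names and values are made up.

# Minimal 2D parallel-coordinates sketch over hypothetical aggregated cube cells.
import pandas as pd
import matplotlib.pyplot as plt
from pandas.plotting import parallel_coordinates

# Each row stands for one aggregated cell, labelled by region (illustrative data).
df = pd.DataFrame({
    "region":   ["North", "North", "South", "South"],
    "sales":    [120, 95, 150, 80],
    "quantity": [30, 25, 40, 20],
    "profit":   [12, 8, 20, 5],
})

parallel_coordinates(df, class_column="region", colormap="tab10")
plt.title("Parallel coordinates over data-cube cells (illustrative)")
plt.show()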
Data hiding is the process of encoding extra information in an image by making small modifications to its pixels. To be practical, the hidden data must be perceptually invisible yet robust to common signal processing operations. This paper introduces a scheme for hiding a signature image that can be as large as 25% of the host image data and can therefore be used in digital watermarking as well as image/data hiding. The proposed algorithm applies an orthogonal discrete wavelet transform with two zero moments and improved time localization, called the discrete slantlet transform, to both the host and the signature image. A scaling factor α in the frequency domain controls the quality of the watermarked images. Experimental results of signature image
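As a rough sketch of the general transform-domain embedding idea, the code below adds scaled signature coefficients to the host's coefficients and inverts the transform. It uses a Haar wavelet from PyWavelets only as a stand-in for the slantlet transform, and the function name embed and the value of alpha are assumptions, not the paper's exact algorithm.

# Additive transform-domain embedding sketch (Haar DWT standing in for the slantlet transform).
import numpy as np
import pywt

def embed(host, signature, alpha=0.1, wavelet="haar"):
    # Transform both images, add scaled signature coefficients to the host's,
    # then reconstruct the watermarked image with the inverse transform.
    hA, (hH, hV, hD) = pywt.dwt2(host.astype(float), wavelet)
    sA, (sH, sV, sD) = pywt.dwt2(signature.astype(float), wavelet)
    marked = (hA + alpha * sA,
              (hH + alpha * sH, hV + alpha * sV, hD + alpha * sD))
    return pywt.idwt2(marked, wavelet)

host = np.random.randint(0, 256, (64, 64))
signature = np.random.randint(0, 256, (64, 64))
watermarked = embed(host, signature, alpha=0.05)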
Data compression offers an attractive approach to reducing communication costs by using the available bandwidth effectively. It therefore makes sense to pursue research on developing algorithms that can use the available network capacity most effectively. It is also important to consider the security aspect, since the data being transmitted is vulnerable to attacks. The basic aim of this work is to develop a module that combines compression and encryption on the same set of data so that the two operations are performed simultaneously. This is achieved by embedding encryption into compression algorithms, since cryptographic ciphers and entropy coders bear a certain resemblance in the sense of secrecy. First, in the secure compression module, the given text is p
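As a rough illustration of pairing the two operations, the sketch below compresses with zlib and then encrypts with Fernet from the cryptography package. It is a sequential compress-then-encrypt stand-in, not the integrated secure compression module the abstract describes, and the function names are invented for the example.

# Compress-then-encrypt sketch: zlib for entropy coding, Fernet as the cipher.
import zlib
from cryptography.fernet import Fernet

key = Fernet.generate_key()
cipher = Fernet(key)

def secure_compress(text: str) -> bytes:
    # Compress first (plaintext is more compressible than ciphertext), then encrypt.
    return cipher.encrypt(zlib.compress(text.encode("utf-8")))

def secure_decompress(blob: bytes) -> str:
    return zlib.decompress(cipher.decrypt(blob)).decode("utf-8")

payload = secure_compress("some text to be sent over the network " * 10)
assert secure_decompress(payload).startswith("some text")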
A lossless (reversible) data hiding (embedding) method inside an image (the carrier medium) is presented in this work using the LSB (least significant bit) technique. It enables us to convey data through a host image, using a secret key, so that the data remains undetectable and nothing is lost, without changing the size or the visible appearance of the image. The hidden data can then be extracted without loss by reversing
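A minimal sketch of keyed LSB hiding is given below: message bits overwrite the least significant bits of pixels chosen by a key-seeded permutation. The function names and the use of a permutation as the keying mechanism are assumptions for illustration; plain LSB overwriting as shown is not reversible, whereas the paper's scheme recovers the original host exactly.

# Keyed LSB embed/extract sketch (illustrative, not the paper's reversible scheme).
import numpy as np

def embed_lsb(image: np.ndarray, message: bytes, key: int) -> np.ndarray:
    bits = np.unpackbits(np.frombuffer(message, dtype=np.uint8))
    flat = image.flatten()
    # Key-seeded permutation selects which pixels carry the message bits.
    order = np.random.default_rng(key).permutation(flat.size)[:bits.size]
    flat[order] = (flat[order] & 0xFE) | bits   # overwrite the LSBs
    return flat.reshape(image.shape)

def extract_lsb(stego: np.ndarray, n_bytes: int, key: int) -> bytes:
    order = np.random.default_rng(key).permutation(stego.size)[:n_bytes * 8]
    bits = stego.flatten()[order] & 1
    return np.packbits(bits).tobytes()

img = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
stego = embed_lsb(img, b"secret", key=42)
assert extract_lsb(stego, 6, key=42) == b"secret"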
In this study, we briefly review the ARIMA(p, d, q), EWMA and DLM (dynamic linear modelling) procedures in order to accommodate the autocorrelation structure of the data. We consider recursive estimation and prediction algorithms based on Bayesian and Kalman filtering (KF) techniques for correlated observations. We investigate the effect of these procedures on the MSE and compare them using generated data.
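A minimal sketch of the Kalman filter recursion for a local-level DLM is shown below; the observation and state variances V and W, and the simulated series, are illustrative stand-ins for the recursive estimation algorithms the abstract refers to.

# Kalman filter for a local-level DLM: y_t = theta_t + v_t, theta_t = theta_{t-1} + w_t.
import numpy as np

def local_level_kf(y, V=1.0, W=0.1, m0=0.0, C0=10.0):
    m, C = m0, C0
    filtered = []
    for obs in y:
        a, R = m, C + W              # predict the state
        f, Q = a, R + V              # predict the observation
        K = R / Q                    # Kalman gain
        m, C = a + K * (obs - f), (1 - K) * R   # update with the new observation
        filtered.append(m)
    return np.array(filtered)

rng = np.random.default_rng(0)
y = np.cumsum(rng.normal(0, 0.3, 100)) + rng.normal(0, 1.0, 100)  # correlated series
estimates = local_level_kf(y)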
Blockchain technology relies on cryptographic techniques that provide various advantages, such as trustworthiness, collaboration, organization, identification, integrity, and transparency. Meanwhile, data analytics refers to the process of utilizing techniques to analyze big data and comprehend the relationships between data points to draw meaningful conclusions. The field of data analytics in Blockchain is relatively new, and few studies have been conducted to examine the challenges involved in Blockchain data analytics. This article presents a systematic analysis of how data analytics affects Blockchain performance, with the aim of investigating the current state of Blockchain-based data analytics techniques in research fields and
The current study aims to compare estimates of the Rasch model's parameters for missing and complete data under various ways of processing the missing data. To achieve this aim, the researcher followed these steps: preparing the Philip Carter test of spatial ability, which consists of (20) items, administered to a group of (250) sixth scientific stage students in the directorates of Baghdad Education at Al–Rusafa (1st, 2nd and 3rd) for the academic year (2018-2019). The researcher then relied on a single-parameter (Rasch) model to analyze the data and used the Bilog-mg3 program to check the hypotheses and the data and to match them with the model. In addition
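For reference, the single-parameter (Rasch) model gives the probability of a correct response from person ability theta and item difficulty b. The sketch below is a generic statement of that formula, not the Bilog-mg3 estimation procedure, and the example values are arbitrary.

# Rasch (one-parameter logistic) model response probability.
import math

def rasch_probability(theta: float, b: float) -> float:
    # P(correct) = exp(theta - b) / (1 + exp(theta - b))
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# Illustrative case: an average-ability examinee on an easy item.
print(rasch_probability(theta=0.0, b=-1.0))  # ~0.73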
The electron correlation effect for the inter-shell has been analysed in terms of the Fermi hole and the partial Fermi hole for the Li atom in the excited states (1s² 3p) and (1s² 3d) using the Hartree-Fock (HF) approximation. The Fermi hole Δf(r₁₂) and the partial Fermi hole Δg(r₁₂, r₁) were determined in position space. Each plot of the physical properties in this work is normalized to unity. The calculations were performed using the Mathcad 14 program.
The first flow injection spectrophotometric method, characterized by its speed and sensitivity, has been developed for the determination of promethazine-HCl in pure form and in pharmaceutical preparations. It is based on the in situ detection of colored cationic radicals formed via oxidation of the drug with sodium persulphate to a pinkish-red species, which was determined using a homemade Ayah 3SX3-3D solar flow injection photometer. Optimum conditions were obtained using a high-intensity green light-emitting diode as the source. The linear dynamic range of absorbance versus promethazine-HCl concentration was 0-7 mmol·L⁻¹, with a correlation coefficient (r) of 0.9904 and a percentage linearity (r²%) of 98.09%. The L.
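A generic sketch of how a calibration line, the correlation coefficient r, and the percentage linearity r²% are obtained from an absorbance-versus-concentration series is given below; the concentration and absorbance values are made up for illustration and are not the paper's data.

# Least-squares calibration line and correlation coefficient (illustrative numbers).
import numpy as np

conc = np.array([0, 1, 2, 3, 4, 5, 6, 7])                        # mmol/L (illustrative)
absorbance = np.array([0.02, 0.11, 0.20, 0.31, 0.40, 0.52, 0.61, 0.70])

slope, intercept = np.polyfit(conc, absorbance, 1)
r = np.corrcoef(conc, absorbance)[0, 1]

print(f"A = {slope:.4f}*C + {intercept:.4f}")
print(f"r = {r:.4f}, linearity r^2% = {100 * r**2:.2f}%")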