Visual analytics has become an important approach for discovering patterns in big data. While visualization already struggles with the high dimensionality of data, issues such as concept hierarchies on each dimension add further difficulty and can make visualization a prohibitive task. A data cube offers multi-perspective aggregated views of large data sets and has important applications in business and many other areas. It has high dimensionality, concept hierarchies, a vast number of cells, and comes with special exploration operations such as roll-up, drill-down, slicing, and dicing. All of these issues make data cubes very difficult to explore visually. Most existing approaches visualize a data cube in 2D space and require preprocessing steps. In this paper, we propose a visualization technique for visual analytics of data cubes using parallel coordinates. The proposed technique extends parallel coordinates to a 3D space to reflect concept hierarchies …
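A minimal sketch of the underlying idea (not the paper's 3D extension): roll up a toy fact table along one level of a concept hierarchy and render the aggregated cube cells with conventional 2D parallel coordinates. The column names, hierarchy levels, and values are hypothetical examples.

```python
# Sketch: roll up a toy fact table along a concept hierarchy and view the
# aggregated cube cells with ordinary 2D parallel coordinates.
# Column names ("region", "category", ...) are hypothetical examples.
import pandas as pd
import matplotlib.pyplot as plt
from pandas.plotting import parallel_coordinates

sales = pd.DataFrame({
    "region":   ["EU", "EU", "US", "US", "US", "EU"],
    "category": ["food", "tech", "food", "tech", "food", "tech"],
    "units":    [120, 80, 200, 150, 90, 60],
    "revenue":  [1.2, 3.4, 2.0, 6.1, 0.9, 2.5],
})

# Roll-up: aggregate away the "category" level of the product hierarchy.
cube_cells = sales.groupby("region", as_index=False).sum(numeric_only=True)

# One polyline per aggregated cell, one axis per remaining measure.
parallel_coordinates(cube_cells, class_column="region", cols=["units", "revenue"])
plt.title("Rolled-up cube cells as parallel coordinates")
plt.show()
```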
In this article, we design an optimal neural network based on a new LM training algorithm. The traditional LM algorithm requires high memory, storage, and computational overhead because it must update the Hessian approximation in each iteration. The suggested design converts the original problem into a minimization problem and uses a feed-forward network to solve non-linear 3D PDEs. An optimal design is also obtained by computing the learning parameters with high precision. Examples are provided to portray the efficiency and applicability of this technique, and comparisons with other designs are conducted to demonstrate the accuracy of the proposed design.
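A minimal sketch of a single Levenberg-Marquardt weight update (assuming LM here denotes Levenberg-Marquardt), which is where the per-iteration Hessian approximation J^T J and its inversion create the memory and compute overhead mentioned above. The residual function, Jacobian, and damping factor are illustrative placeholders, not the paper's network or PDE formulation.

```python
# Sketch of one Levenberg-Marquardt step for a small least-squares problem.
# The residual function and damping factor mu are illustrative placeholders.
import numpy as np

def lm_step(w, residuals, jacobian, mu=1e-2):
    """One LM update: dw = -(J^T J + mu*I)^{-1} J^T r."""
    r = residuals(w)                      # residual vector, shape (m,)
    J = jacobian(w)                       # Jacobian dr/dw, shape (m, n)
    H_approx = J.T @ J                    # Gauss-Newton Hessian approximation
    g = J.T @ r                           # gradient of 0.5*||r||^2
    dw = np.linalg.solve(H_approx + mu * np.eye(w.size), -g)
    return w + dw

# Toy usage: fit w so that f(x; w) = w0 + w1*x matches the targets.
x = np.linspace(0.0, 1.0, 20)
y = 2.0 + 3.0 * x
residuals = lambda w: (w[0] + w[1] * x) - y
jacobian  = lambda w: np.stack([np.ones_like(x), x], axis=1)

w = np.zeros(2)
for _ in range(20):
    w = lm_step(w, residuals, jacobian)
print(w)                                  # approaches [2, 3]
```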
In today's digital era, the importance of securing information has reached critical levels. Steganography is one of the methods used for this purpose, hiding sensitive data within other files. This study introduces an approach that utilizes a chaotic dynamic system as a random key generator, governing both the selection of hiding locations within an image and the amount of data concealed in each location. The security of the steganography approach is considerably improved by this random procedure. A 3D dynamic system with nine parameters influencing its behavior was carefully chosen, and for each parameter suitable interval values were determined to guarantee the system's chaotic behavior. An analysis of the chaotic performance is given using the …
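A minimal sketch of the general idea, using the well-known logistic map as a stand-in for the paper's nine-parameter 3D system (which is not reproduced here): a chaotic orbit, seeded by the key, selects the pixel positions used for LSB embedding.

```python
# Sketch: chaotic-key-driven selection of LSB embedding positions.
# The logistic map below is only a stand-in for the paper's 3D system.
import numpy as np

def chaotic_positions(n_bits, n_pixels, x0=0.61, r=3.99):
    """Generate n_bits distinct pixel indices from a logistic-map orbit."""
    x, chosen, seen = x0, [], set()
    while len(chosen) < n_bits:
        x = r * x * (1.0 - x)             # logistic map, chaotic for r ~ 3.99
        idx = int(x * n_pixels) % n_pixels
        if idx not in seen:               # avoid reusing a pixel
            seen.add(idx)
            chosen.append(idx)
    return np.array(chosen)

def embed_bits(image, bits, key=(0.61, 3.99)):
    """Hide a bit array in the LSBs of chaotically selected pixels."""
    flat = image.flatten().copy()
    pos = chaotic_positions(len(bits), flat.size, *key)
    flat[pos] = (flat[pos] & 0xFE) | bits  # overwrite least significant bit
    return flat.reshape(image.shape), pos

cover = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)
secret = np.array([1, 0, 1, 1, 0, 0, 1, 0], dtype=np.uint8)
stego, positions = embed_bits(cover, secret)
print(stego.flatten()[positions] & 1)      # recovers the secret bits
```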
There is much research on the syntax-semantics and the syntax-phonology interaction. However, the exact relation between prosodic patterns and information structure (as part of pragmatics) is still to be investigated. In this empirical study, we challenge the view that prosody and pragmatics are two autonomous levels of grammar. This paper analyses the narrative poem ‘Mending Wall’, recited by Robert Frost, to explore its prosodic features and the associated pragmatic meanings. It is proposed that a set of intentionally manipulated suprasegmental features forms a prosodic grammar that works in line with syntax and lexical choices to build the narrative discourse and achieve pragmatic meanings. The paper shows that the …
A major goal of next-generation wireless communication systems is the development of reliable, high-speed wireless communication that supports high user mobility. These systems must focus on increasing link throughput and network capacity. In this paper, a novel, spectrally efficient system is proposed for generating and transmitting two-dimensional (2-D) orthogonal frequency division multiplexing (OFDM) symbols through a 2-D inter-symbol interference (ISI) channel. Instead of conventional data mapping techniques, the discrete finite Radon transform (FRAT) is used as the data mapping technique because of the increased orthogonality it offers. As a result, the proposed structure gives a significant improvement in bit error rate (BER) performance. The …
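A minimal sketch of generating a 2-D OFDM symbol with a 2-D inverse FFT over a grid of mapped data symbols. Conventional QPSK mapping and an ideal (no-ISI) channel are used here purely for illustration; the paper's FRAT-based mapping and 2-D ISI channel are not reproduced.

```python
# Sketch: 2-D OFDM symbol generation via a 2-D IFFT over a subcarrier grid.
# QPSK mapping stands in for the paper's FRAT-based mapping.
import numpy as np

N1, N2 = 8, 8                                   # 2-D subcarrier grid size
bits = np.random.randint(0, 2, size=(N1, N2, 2))

# Gray-coded QPSK mapping: each subcarrier carries two bits.
qpsk = ((1 - 2 * bits[..., 0]) + 1j * (1 - 2 * bits[..., 1])) / np.sqrt(2)

# 2-D OFDM modulation: inverse 2-D FFT spreads symbols over both dimensions.
tx_symbol = np.fft.ifft2(qpsk) * np.sqrt(N1 * N2)

# Receiver (ideal, no ISI channel): forward 2-D FFT recovers the grid.
rx_grid = np.fft.fft2(tx_symbol) / np.sqrt(N1 * N2)
print(np.allclose(rx_grid, qpsk))               # True
```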
This research aims to analyze and simulate real biochemical test data to uncover the relationships among the tests and how each of them impacts the others. The data were acquired from an Iraqi private biochemical laboratory; however, they have many dimensions, a high rate of null values, and a large number of patients. Several experiments were applied to these data, beginning with unsupervised techniques such as hierarchical clustering and k-means, but the results were not clear. A preprocessing step was then performed to make the dataset analyzable by supervised techniques such as Linear Discriminant Analysis (LDA), Classification and Regression Tree (CART), Logistic Regression (LR), K-Nearest Neighbor (K-NN), and Naïve Bayes (NB) …
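A minimal sketch of the kind of workflow described above, using scikit-learn: impute the many null values, run k-means as an unsupervised pass, then compare the listed supervised classifiers with cross-validation. The file name, feature columns, and target column are hypothetical, not the laboratory's actual tests.

```python
# Sketch of the described workflow: impute nulls, try k-means, then compare
# the listed supervised models. File, column, and target names are hypothetical.
import pandas as pd
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
from sklearn.model_selection import cross_val_score
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB

df = pd.read_csv("biochem_tests.csv")           # hypothetical data file
X_raw, y = df.drop(columns=["diagnosis"]), df["diagnosis"]

# Preprocessing: fill null values and standardize the test results.
X = StandardScaler().fit_transform(
    SimpleImputer(strategy="median").fit_transform(X_raw))

# Unsupervised pass: k-means clustering of the patients.
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

# Supervised pass: cross-validated accuracy of the listed classifiers.
models = {
    "LDA":  LinearDiscriminantAnalysis(),
    "CART": DecisionTreeClassifier(random_state=0),
    "LR":   LogisticRegression(max_iter=1000),
    "K-NN": KNeighborsClassifier(),
    "NB":   GaussianNB(),
}
for name, model in models.items():
    print(name, cross_val_score(model, X, y, cv=5).mean())
```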