Big data analysis has important applications in many areas such as sensor networks and connected healthcare. The high volume and velocity of big data pose many challenges to data analysis. One possible solution is to summarize the data, providing a manageable, scalable structure that holds a summarization of the data for efficient and effective analysis. This research extends our previous work on developing an effective technique to create, organize, access, and maintain summarizations of big data, and develops algorithms for Bayes classification and entropy discretization of large data sets using the multi-resolution data summarization structure. Bayes classification and data discretization play essential roles in many learning algorithms, such as decision trees and nearest neighbor search. The proposed method can handle streaming data efficiently and, for entropy discretization, provides the optimal split value.
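To make the entropy-discretization task concrete, the sketch below finds the split value of a numeric attribute that minimizes the weighted child entropy. This is a plain single-pass search over candidate midpoints, not the paper's multi-resolution summarization structure; function names are illustrative.

```python
import math

def entropy(labels):
    """Shannon entropy (bits) of a list of class labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def best_entropy_split(values, labels):
    """Return (split, score): the cut point minimizing weighted child entropy.

    Candidate splits are midpoints between consecutive distinct sorted values.
    """
    pairs = sorted(zip(values, labels))
    n = len(pairs)
    best_split, best_score = None, float("inf")
    for i in range(1, n):
        if pairs[i - 1][0] == pairs[i][0]:
            continue  # no boundary between equal values
        split = (pairs[i - 1][0] + pairs[i][0]) / 2
        left = [y for x, y in pairs if x <= split]
        right = [y for x, y in pairs if x > split]
        score = (len(left) * entropy(left) + len(right) * entropy(right)) / n
        if score < best_score:
            best_split, best_score = split, score
    return best_split, best_score
```

On perfectly separable data, e.g. values `[1, 2, 3, 10, 11, 12]` with labels `a, a, a, b, b, b`, the search returns the midpoint 6.5 with weighted entropy 0.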
Traffic classification refers to the task of categorizing traffic flows into application-aware classes such as chat, streaming, VoIP, etc. Most network traffic identification systems are feature-based; these features may be static signatures, port numbers, statistical characteristics, and so on. Although current data flow classification methods are effective, they still lack inventive approaches that meet vital requirements such as real-time traffic classification, low power consumption, low Central Processing Unit (CPU) utilization, etc. Our novel Fast Deep Packet Header Inspection (FDPHI) traffic classification proposal employs a 1-Dimensional Convolutional Neural Network (1D-CNN) to automatically learn more representational c
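The abstract does not give the FDPHI architecture, but the core operation of any 1D-CNN layer applied to raw packet-header bytes can be sketched as a sliding dot product followed by a nonlinearity. The header bytes and filter weights below are made-up toy values, not learned parameters from the paper.

```python
import numpy as np

def conv1d(signal, kernel, stride=1):
    """Valid-mode 1D cross-correlation: the core op of a 1D-CNN layer."""
    k = len(kernel)
    out_len = (len(signal) - k) // stride + 1
    return np.array([np.dot(signal[i * stride:i * stride + k], kernel)
                     for i in range(out_len)])

def relu(x):
    """Rectified linear unit, the usual activation after a conv layer."""
    return np.maximum(x, 0)

# Toy packet header: first bytes of a flow, scaled to [0, 1].
header = np.array([0x45, 0x00, 0x00, 0x3C, 0x1C, 0x46, 0x40, 0x00],
                  dtype=float) / 255.0
# Hypothetical filter of width 3 (weights invented for illustration).
kernel = np.array([0.5, -1.0, 0.5])
features = relu(conv1d(header, kernel))  # 6 feature values for an 8-byte input
```

In a full network, many such filters are learned jointly from labeled flows, and their feature maps feed pooling and dense layers that output the application class.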
Diffie-Hellman is a key exchange protocol that provides a way to establish shared secret keys between two parties, even if those parties have never communicated before. This paper suggests a new way to transfer keys over public or non-secure channels, based on video files sent over the channel from which keys are then extracted. The proposed key generation method depends on the video file content, using the entropy values of the video frames. The proposed system addresses weaknesses of the Diffie-Hellman key exchange algorithm, namely MIMA (Man-in-the-Middle Attack) and DLA (Discrete Logarithm Attack). When the method used high-definition videos with a vast amount of data, the keys generated with a large number up to 5
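For context, the textbook Diffie-Hellman exchange that the paper builds on can be sketched as follows. The prime here is deliberately tiny for illustration; real deployments use 2048-bit-plus groups. The paper's contribution, deriving key material from video-frame entropy, is not shown.

```python
import secrets

# Demo group parameters: a small 32-bit prime and generator
# (illustration only; production code uses standardized large groups).
p = 0xFFFFFFFB  # 2**32 - 5, prime
g = 5

def keypair():
    """Pick a random private exponent and compute the public value."""
    priv = secrets.randbelow(p - 2) + 2   # private exponent in [2, p-1]
    pub = pow(g, priv, p)                 # public value g^priv mod p
    return priv, pub

a_priv, a_pub = keypair()   # Alice
b_priv, b_pub = keypair()   # Bob

# Each side combines its own private key with the other's public value;
# both arrive at g^(a*b) mod p without ever sending a secret.
shared_a = pow(b_pub, a_priv, p)
shared_b = pow(a_pub, b_priv, p)
assert shared_a == shared_b
```

The classic weakness the paper targets is visible here: an active attacker who can replace `a_pub` and `b_pub` in transit (MIMA) negotiates separate keys with each side, since the public values are not authenticated.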
The goal of this research is to introduce the concepts of Large-small submodule and Large-hollow module and to consider some of their properties. A proper submodule N of an R-module M is said to be a Large-small submodule if, whenever N + K = M for a submodule K of M, then K is an essential submodule of M (K ≤e M). An R-module M is called a Large-hollow module if every proper submodule of M is a Large-small submodule of M.
In this paper, a discretization of a three-dimensional fractional-order prey-predator model with Holling type III functional response has been investigated. All of its fixed points are determined, and their local stability is investigated. We extend the discretized system to an optimal control problem to obtain the optimal harvesting amount; for this, the discrete-time Pontryagin's maximum principle is used. Finally, numerical simulation results are given to confirm the theoretical outputs as well as to solve the optimality problem.
In this paper we study the relation between the resolution of the Weyl module F(4,4,3)(K) in the characteristic-free mode and in the Lascoux mode (characteristic zero); more precisely, we obtain the Lascoux resolution of F(4,4,3)(K) in characteristic zero as an application of the characteristic-free resolution of F(4,4,3)(K).
Keywords: resolution, Weyl module, Lascoux mode, divided power, characteristic-free.
Autism is a lifelong developmental condition that affects how people perceive the world and interact with each other. An estimated one person in more than 100 has autism, and it affects almost four times as many boys as girls. The tools commonly used for analyzing autism datasets are fMRI, EEG, and, more recently, eye tracking. A preliminary study of the eye-tracking trajectories of patients showed that even a rudimentary statistical analysis (principal component analysis) provides interesting results on the statistical parameters studied, such as the time spent in a region of interest. Another study, applying tools from Euclidean and non-Euclidean geometry to patients' eye trajectories, also showed interesting results. In this