Big data analysis has important applications in many areas, such as sensor networks and connected healthcare. The high volume and velocity of big data pose many challenges to data analysis. One possible solution is to summarize the data and provide a manageable data structure that holds a scalable summarization of the data for efficient and effective analysis. This research extends our previous work on developing an effective technique to create, organize, access, and maintain summarizations of big data, and develops algorithms for Bayes classification and entropy discretization of large data sets using the multi-resolution data summarization structure. Bayes classification and data discretization play essential roles in many learning algorithms, such as decision trees and nearest neighbor search. The proposed method can handle streaming data efficiently and, for entropy discretization, provide the optimal split value.
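As background for the entropy-discretization task the abstract describes, the following is a minimal sketch of classical entropy-based splitting on raw data: every midpoint between consecutive distinct values is a candidate split, and the one minimizing the weighted child entropy is chosen. This is an illustrative baseline only; the paper's actual contribution computes the split over a multi-resolution summarization structure rather than over the raw points, which this toy does not model, and all function names here are hypothetical.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def best_entropy_split(values, labels):
    """Return the split value minimizing weighted child entropy.

    Candidate splits are midpoints between consecutive distinct values,
    the standard choice in entropy-based discretization.
    """
    pairs = sorted(zip(values, labels))
    best_split, best_h = None, float("inf")
    for i in range(1, len(pairs)):
        if pairs[i - 1][0] == pairs[i][0]:
            continue  # no boundary between equal values
        split = (pairs[i - 1][0] + pairs[i][0]) / 2
        left = [lab for v, lab in pairs if v <= split]
        right = [lab for v, lab in pairs if v > split]
        h = (len(left) * entropy(left) + len(right) * entropy(right)) / len(pairs)
        if h < best_h:
            best_split, best_h = split, h
    return best_split

values = [1.0, 2.0, 3.0, 10.0, 11.0, 12.0]
labels = ["a", "a", "a", "b", "b", "b"]
print(best_entropy_split(values, labels))  # 6.5 (perfectly separates the classes)
```

The exhaustive scan above is O(n²) in this naive form; the appeal of a summarization structure, as the abstract suggests, is that candidate evaluation can be done over a compact summary instead of every point.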
Target tracking is a significant application of wireless sensor networks (WSNs), one that requires the deployment of self-organizing and energy-efficient algorithms. Tracking accuracy increases as more sensor nodes are activated around the target, but more energy is consumed. Thus, in this study, we focus on limiting the number of active sensors by forming an ad-hoc network that operates autonomously. This reduces energy consumption and prolongs the sensor network lifetime. In this paper, we propose a fully distributed algorithm, an Endocrine-inspired Sensor Activation Mechanism for multi-target tracking (ESAM), which reflects the properties of a real-life sensor activation system based on the information-circulating principle in the endocr…
Most Internet of Vehicles (IoV) applications are delay-sensitive and require resources for data storage and task processing, which vehicles can rarely afford. Such tasks are often offloaded to more powerful entities, like cloud and fog servers. Fog computing is a decentralized infrastructure located between the data sources and the cloud; it supplies several benefits that make it a non-trivial extension of the cloud. The high volume of data generated by vehicles' sensors, together with the limited computation capabilities of vehicles, has imposed several challenges on VANET systems. Therefore, VANETs are integrated with fog computing to form a paradigm named Vehicular Fog Computing (VFC), which provides low-latency services to mo…
Objective: Peri-implantitis is one of the complications of implant treatment. Dentists have failed to restore damaged periodontium using conventional therapies. Tissue engineering (stem cells, scaffolds, and growth factors) aims to reconstruct natural tissues. This paper aimed to isolate both periodontal ligament stem cells (PDLSCs) and bone marrow mesenchymal stem cells (BMMSCs) and use them in a co-culture method to create three-layered cell sheets for reconstructing natural periodontal ligament (PDL) tissue. Materials and methods: BMMSCs were isolated from rabbit tibia and femur, and a PDLSC culture was established from the lower right incisor. The cells were co-cultured to induce BMMSC differentiation into PDL cells. Cell morphology, stem cel…
Cloud computing provides a huge amount of storage space for data, but as the number of users and the size of their data grow, the cloud storage environment faces serious problems such as saving storage space, managing this large volume of data, and preserving data security and privacy. One important method for saving space in cloud storage is data deduplication, a compression technique that keeps only one copy of the data and eliminates the extra copies. To offer security and privacy for sensitive data while still supporting deduplication, this work identifies attacks that exploit hybrid-cloud deduplication, in which an attacker gains access to the files of other users based on very small hash signatures of…
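To make the deduplication mechanism concrete, here is a minimal sketch of content-hash-based deduplication: each upload is keyed by a hash of its contents, so identical files occupy one physical copy. SHA-256 and the class/method names are illustrative assumptions; the truncated abstract does not specify the hash function or storage design used in the paper.

```python
import hashlib

class DedupStore:
    """Toy content-addressed store: one physical copy per unique content."""

    def __init__(self):
        self._blocks = {}  # hash digest -> file bytes (one physical copy)
        self._refs = {}    # hash digest -> number of logical owners

    def upload(self, data: bytes) -> str:
        digest = hashlib.sha256(data).hexdigest()
        if digest in self._blocks:
            self._refs[digest] += 1        # duplicate: store nothing new
        else:
            self._blocks[digest] = data    # first copy: store the bytes
            self._refs[digest] = 1
        return digest

    def physical_copies(self) -> int:
        return len(self._blocks)

store = DedupStore()
store.upload(b"report.pdf contents")
store.upload(b"report.pdf contents")  # deduplicated against the first upload
store.upload(b"other file")
print(store.physical_copies())  # 2
```

The attack class the abstract alludes to arises when possession of the short hash alone is accepted as proof of ownership: an attacker holding only a small hash signature could then claim, and retrieve, another user's file. Proof-of-ownership protocols are the usual countermeasure.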
The aesthetic content of data visualization is one of the contemporary areas through which data scientists and designers have been able to link data to humans. Yet even after successful attempts to model data visualization, it was not clear how aesthetic content came to be chosen as an input for humanizing these models. The goal of the current research is therefore to use the descriptive analytical approach to identify the aesthetic content in data visualization, which the researchers interpret through pragmatic philosophy and Kantian philosophy, and to analyze a sample of data visualization models to reveal their aesthetic entry points and explain how to humanize them. The two researchers reached seve…
Information systems and data exchange between government institutions are growing rapidly around the world, and with them the threats to information within government departments. In recent years, research into the development and construction of secure information systems in government institutions has proven very effective. Based on information system principles, this study proposes a model for providing and evaluating security for all departments of government institutions. The requirements of any information system begin with the organization's surroundings and objectives. Most prior techniques did not take into account the organizational component on which the information system runs, despite the relevance of…
Cloud computing is an interesting technology that gives customers convenient, on-demand network access based on their needs, with minimal maintenance and interaction with cloud providers. Security has arisen as a serious concern, particularly in cloud computing, where data is stored in, and accessed from, a third-party storage system via the Internet. It is critical to ensure that data is accessible only to the appropriate individuals and is not retained in third-party locations. Because third-party services frequently make backup copies of uploaded data for reliability reasons, deleting the data the owner submitted does not guarantee its removal from the cloud. Cloud data storag…
A skip list data structure is essentially a simulation of a binary search tree. Skip list algorithms are simpler, faster, and use less space. This data structure conceptually uses parallel sorted linked lists. Searching in a skip list is faster than searching in a regular sorted linked list. Because a skip list is a two-dimensional data structure, it is implemented using a two-dimensional network of nodes with four pointers. The search, insert, and delete operations each take expected O(log n) time. The skip list can be modified to implement the order-statistic operations RANK and SEARCH BY RANK while maintaining the same expected time. Keywords: skip list, parallel linked list, randomized algorithm, rank.
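The parallel-sorted-lists idea above can be sketched with the standard probabilistic skip list: each node is promoted to higher levels with probability 1/2, and a search walks each level as far as possible before dropping down, giving expected O(log n) search and insert. This sketch uses one forward pointer per level rather than the four-pointer node layout the abstract describes, so it illustrates the principle, not the paper's exact implementation.

```python
import random

class Node:
    def __init__(self, key, level):
        self.key = key
        self.forward = [None] * (level + 1)  # one successor pointer per level

class SkipList:
    MAX_LEVEL = 16
    P = 0.5  # probability of promoting a node one level higher

    def __init__(self):
        self.level = 0
        self.head = Node(None, self.MAX_LEVEL)  # sentinel head spans all levels

    def _random_level(self):
        lvl = 0
        while random.random() < self.P and lvl < self.MAX_LEVEL:
            lvl += 1
        return lvl

    def search(self, key):
        node = self.head
        for i in range(self.level, -1, -1):           # from top level down
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]                # skip ahead on this level
        node = node.forward[0]
        return node is not None and node.key == key

    def insert(self, key):
        update = [self.head] * (self.MAX_LEVEL + 1)   # predecessors per level
        node = self.head
        for i in range(self.level, -1, -1):
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]
            update[i] = node
        lvl = self._random_level()
        self.level = max(self.level, lvl)
        new = Node(key, lvl)
        for i in range(lvl + 1):                      # splice in at each level
            new.forward[i] = update[i].forward[i]
            update[i].forward[i] = new

sl = SkipList()
for k in [3, 7, 1, 9]:
    sl.insert(k)
print(sl.search(7), sl.search(5))  # True False
```

The RANK and SEARCH BY RANK operations mentioned in the abstract would additionally store, at each pointer, the number of level-0 nodes it skips, so a rank can be accumulated during the same top-down walk.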
Thermal energy storage is an important component of energy units, narrowing the gap between energy supply and demand. Free convection and the locations of the tubes carrying the heat-transfer fluid (HTF) have a significant influence on both the energy-discharging potential and the buoyancy effect during the solidification mode. In the present study, the impact of tube position was examined during the discharging process. Liquid-fraction evolution and the energy removal rate, together with thermo-fluid contour profiles, were used to examine the performance of the unit. Heat-exchanger tubes are proposed with different numbers and positions in the unit for various cases, including uniform and non-uniform tube distributions. The results show that…