Big data of different types, such as text and images, is generated rapidly by the internet and other applications. Handling this data with traditional methods is impractical because it varies widely in volume, type, and processing-speed requirements. Data analytics has therefore become an essential tool for big data applications, since it extracts only the meaningful information from the raw data. This paper presents several innovative methods that use data analytics techniques to improve the analysis process and data management. It also discusses how advances in data analytics based on artificial intelligence algorithms might benefit many applications. In addition, critical challenges and research issues are identified from the limitations reported in the published papers, helping researchers distinguish between analytics techniques and develop consistent, logical, and information-rich analyses based on valuable features. The findings of this paper may also be used to identify the best methods in each sector covered by these publications, to assist future researchers in conducting more systematic and comprehensive analyses, and to identify areas for developing novel or hybrid data-analysis techniques.
This study was conducted to evaluate the efficacy of different techniques for the extraction and purification of Tomato yellow leaf curl virus (TYLCV). An isolate of the virus, free of possible contamination with other viruses infecting the same host and transmitted by the same vector, Bemisia tabaci Genn., was obtained. This was verified using indicator plants and the incubation period in the vector. The results revealed that the virus infects Nicotiana glutinosa without visible symptoms, while Nicotiana tabacum var. White Burley was not susceptible to the virus. The incubation period of the virus in the vector was found to be 21 hrs. These results indicate that the virus is TYLCV. Results showed that butanol was more effective in clarifying the
The normalized difference vegetation index (NDVI) is an effective graphical indicator derived from space-borne remote sensing measurements that can be used to investigate the trend of live green vegetation in an observed target. In this research, change detection of vegetation in Babylon city was performed by tracing the NDVI for temporal Landsat satellite images acquired on two dates: 19 March 2015 and 5 March 2020. The ArcGIS program (ver. 10.7) was adopted to analyze the collected data. The final results indicate a spatial variation in the NDVI, where it increases from 1666.91 km² in 2015 to 1697.01 km² in 2020 between the t
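The study carries out the NDVI change detection inside ArcGIS, but the index itself is a simple band ratio. The following is a minimal NumPy sketch of that calculation; the array names `red` and `nir`, the 0.2 vegetation threshold, and the 30 m Landsat pixel size are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def ndvi(red: np.ndarray, nir: np.ndarray) -> np.ndarray:
    """Compute NDVI = (NIR - Red) / (NIR + Red), guarding against division by zero."""
    red = red.astype(np.float64)
    nir = nir.astype(np.float64)
    denom = nir + red
    out = np.zeros_like(denom)
    np.divide(nir - red, denom, out=out, where=denom != 0)
    return out  # values range from -1 (water/bare soil) to +1 (dense vegetation)

# Hypothetical area estimate from an NDVI threshold and a 30 m pixel size:
# pixel_area_km2 = (30 * 30) / 1e6
# vegetated_km2 = np.count_nonzero(ndvi(red_band, nir_band) > 0.2) * pixel_area_km2
```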
Background: The marginal seal is essential for sealant success because penetration of bacteria under the sealant might allow caries onset or progression. The aim of the present study was to estimate and compare the microleakage of pit and fissure sealant after various methods of occlusal surface preparation. Materials and methods: Thirty non-carious premolars extracted for orthodontic reasons were equally divided into three groups. In group one, the occlusal fissures were opened with a round carbide bur; in group two, the occlusal surfaces of the teeth were cleaned with a dry pointed bristle brush; and the samples of group three were cleaned with a slurry of fine flour of pumice in water using a rubber cup. The fissures of all teeth were then etched using 35% p
Impressed current cathodic protection controlled by a computer offers an effective solution to changing environmental factors and long-term coating degradation. The protection potential distribution achieved and the current demand on the anode can be regulated against the protection criteria to achieve effective protection of the system.
In this paper, the cathodic protection of an above-ground steel storage tank was investigated using an impressed current system with potential control to manage the variation in soil resistivity. A corrosion controller was implemented for the above-ground tank in LabVIEW, where the tank-bottom-to-soil potential was manipulated to the desired set point
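The corrosion controller in the paper is built in LabVIEW; the sketch below only illustrates, in Python, the idea of manipulating the impressed current so that the tank-to-soil potential is driven toward a protection set point. The proportional gain, current limits, the -0.85 V criterion, and the I/O functions `read_tank_potential()` and `set_rectifier_current()` are all hypothetical and not taken from the paper.

```python
import time

SETPOINT_V = -0.850      # common protection criterion vs. Cu/CuSO4 (assumed, not from the paper)
KP = 5.0                 # proportional gain, amperes per volt of error (illustrative tuning)
I_MIN, I_MAX = 0.0, 50.0 # rectifier output limits in amperes (assumed hardware limits)

def control_step(current_output: float, measured_potential: float) -> float:
    """One proportional-control update of the impressed current.

    A potential less negative than the set point means the tank is under-protected,
    so the current is raised; a more negative potential lowers the current.
    """
    error = measured_potential - SETPOINT_V          # positive error -> under-protected
    new_output = current_output + KP * error
    return max(I_MIN, min(I_MAX, new_output))        # clamp to rectifier limits

# Hypothetical main loop; read_tank_potential() and set_rectifier_current()
# stand in for the data-acquisition and rectifier interfaces.
# output = 0.0
# while True:
#     potential = read_tank_potential()
#     output = control_step(output, potential)
#     set_rectifier_current(output)
#     time.sleep(1.0)
```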
Cloud storage provides scalable and low-cost resources featuring economies of scale based on a cross-user architecture, and data storage is its most important service. As the amount of outsourced data grows explosively, data deduplication, a technique that eliminates data redundancy, becomes essential. To protect the privacy of data owners, data are stored in the cloud in encrypted form. However, encrypted data introduce new challenges for cloud data deduplication, which becomes crucial for data storage: traditional deduplication schemes cannot work on encrypted data, and existing solutions for encrypted data deduplication suffer from security weaknesses. This paper proposes a combined compressive sensing and video deduplication scheme to maximize
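The abstract does not describe the internals of the proposed scheme, so the sketch below only illustrates the generic cross-user deduplication idea such schemes build on: the server keeps an index of content fingerprints and stores a blob only when its fingerprint is new, so identical data uploaded by different users is kept once. The class and function names are hypothetical, and the encryption step is omitted for brevity.

```python
import hashlib

class DedupStore:
    """Toy server-side deduplication index keyed by content fingerprints (illustrative only)."""

    def __init__(self):
        self.index = {}   # fingerprint -> stored blob

    def upload(self, tag: str, blob: bytes) -> bool:
        """Store the blob only if its fingerprint is unseen; return True if newly stored."""
        if tag in self.index:
            return False          # duplicate: only a reference would be recorded, no new storage used
        self.index[tag] = blob
        return True

def fingerprint(plaintext: bytes) -> str:
    """Convergent-style tag: identical plaintexts yield identical tags across users."""
    return hashlib.sha256(plaintext).hexdigest()

# Usage sketch: two users uploading the same video chunk produce the same tag,
# so the second upload is deduplicated.
store = DedupStore()
chunk = b"example video chunk"
tag = fingerprint(chunk)
print(store.upload(tag, chunk))   # True  -> stored
print(store.upload(tag, chunk))   # False -> deduplicated
```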
The transmission and reception of data consume the most resources in Wireless Sensor Networks (WSNs). The energy supplied by the battery is the most important resource affecting a WSN's lifespan at the sensor node, and because sensor nodes run on limited batteries, energy saving is necessary. Data aggregation is a procedure applied to eliminate redundant transmissions and provide fused information to the base stations, which improves energy effectiveness and increases the lifespan of energy-constrained WSNs. In this paper, a Perceptually Important Points Based Data Aggregation (PIP-DA) method for Wireless Sensor Networks is suggested to reduce redundant data before sending them to the
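The abstract does not detail how the perceptually important points are selected, so the following is a minimal sketch of the classic PIP procedure such a method typically relies on: keep the endpoints, then repeatedly add the sample farthest, in vertical distance, from the line joining its neighbouring PIPs. The function names, the distance measure, and the example readings are assumptions.

```python
def vertical_distance(i, left, right, series):
    """Vertical distance of sample i from the line joining PIPs at indices left and right."""
    y1, y2 = series[left], series[right]
    interp = y1 + (y2 - y1) * (i - left) / (right - left)   # line value at position i
    return abs(series[i] - interp)

def select_pips(series, k):
    """Return the indices of k perceptually important points of a 1-D series (k >= 2)."""
    pips = [0, len(series) - 1]                   # the endpoints are always kept
    while len(pips) < k:
        best_i, best_d = None, -1.0
        for left, right in zip(pips, pips[1:]):   # examine every gap between adjacent PIPs
            for i in range(left + 1, right):
                d = vertical_distance(i, left, right, series)
                if d > best_d:
                    best_i, best_d = i, d
        if best_i is None:                        # no interior points remain
            break
        pips.append(best_i)
        pips.sort()
    return pips

# Usage sketch: keep 4 representative readings out of 10 before transmission.
readings = [20.1, 20.3, 25.7, 25.9, 26.0, 21.2, 20.8, 20.7, 30.5, 20.9]
print(select_pips(readings, 4))   # -> [0, 7, 8, 9]
```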
A skip list data structure is essentially a simulation of a binary search tree. Skip list algorithms are simpler, faster, and use less space. Conceptually, this data structure uses parallel sorted linked lists. Searching in a skip list is more involved than searching in a regular sorted linked list. Because a skip list is a two-dimensional data structure, it is implemented using a two-dimensional network of nodes with four pointers. The implementation of the search, insert, and delete operations takes expected time of up to O(log n). The skip list can be modified to implement the order-statistic operations RANK and SEARCH-BY-RANK while maintaining the same expected time. Keywords: skip list, parallel linked list, randomized algorithm, rank.
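The four-pointer node layout mentioned in the abstract is not shown, so the sketch below uses the more common forward-pointer-array formulation of a skip list to illustrate the expected O(log n) search and insert. The names, MAX_LEVEL, and the promotion probability P are assumptions, not details from the paper.

```python
import random

MAX_LEVEL = 16
P = 0.5                     # probability of promoting a node one level (common choice)

class Node:
    def __init__(self, key, level):
        self.key = key
        self.forward = [None] * (level + 1)   # one forward pointer per level

class SkipList:
    def __init__(self):
        self.level = 0
        self.head = Node(None, MAX_LEVEL)

    def _random_level(self):
        lvl = 0
        while random.random() < P and lvl < MAX_LEVEL:
            lvl += 1
        return lvl

    def search(self, key):
        """Expected O(log n): drop down a level whenever the next key would overshoot."""
        node = self.head
        for lvl in range(self.level, -1, -1):
            while node.forward[lvl] is not None and node.forward[lvl].key < key:
                node = node.forward[lvl]
        node = node.forward[0]
        return node is not None and node.key == key

    def insert(self, key):
        update = [self.head] * (MAX_LEVEL + 1)   # last node visited on each level
        node = self.head
        for lvl in range(self.level, -1, -1):
            while node.forward[lvl] is not None and node.forward[lvl].key < key:
                node = node.forward[lvl]
            update[lvl] = node
        lvl = self._random_level()
        if lvl > self.level:
            self.level = lvl
        new = Node(key, lvl)
        for i in range(lvl + 1):
            new.forward[i] = update[i].forward[i]   # splice the new node into each level
            update[i].forward[i] = new

# Usage sketch
sl = SkipList()
for k in [7, 3, 11, 5]:
    sl.insert(k)
print(sl.search(5), sl.search(8))   # True False
```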