Data generated from the internet and modern applications is extensive and rapidly expanding, so all modern applications must perform their tasks successfully over this massive data. Therefore, one of the significant factors in the success of any application is understanding and extracting meaningful information using digital analytics tools, which positively impacts the application's performance and helps it deal with the challenges it may encounter. Cloud computing, on the other hand, is simply an environment comprising a collection of high-performance services from various vendors. These services can access and process massive amounts of data faster than a traditional computer. One of these services is cloud analytics, which appli
Blockchain technology relies on cryptographic techniques that provide various advantages, such as trustworthiness, collaboration, organization, identification, integrity, and transparency. Meanwhile, data analytics refers to the process of utilizing techniques to analyze big data and comprehend the relationships between data points to draw meaningful conclusions. The field of data analytics in Blockchain is relatively new, and few studies have been conducted to examine the challenges involved in Blockchain data analytics. This article presents a systematic analysis of how data analytics affects Blockchain performance, with the aim of investigating the current state of Blockchain-based data analytics techniques in research fields and
The increase in population and in the number of kindergartens, especially in urban areas, has made it difficult to manage waste properly. Thus, this paper proposes a system dedicated to kindergartens for managing waste disposal, called smart garbage based on the internet of things (SGI). To ensure a healthy environment and an intelligent, integrated waste management system for kindergartens supported by the internet of things (IoT), we present the system in detail: SGI includes a display system, an automatic lid system, and a communication system. This system provides the capability to monitor the status of waste continuously, and the IoT website can show the pe
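As a rough illustration of the communication side of such a system, the hedged sketch below (the bin identifier, sensor stub, and payload fields are illustrative assumptions, not the SGI implementation) assembles the kind of status message a bin controller could report for display on an IoT dashboard.

```python
# Hypothetical sketch: build a bin-status message that the communication
# system of a smart-garbage setup could publish to a monitoring website.
# The sensor reading is stubbed; field names and the bin id are assumptions.
import json
import random
from datetime import datetime, timezone

def read_fill_level_percent() -> int:
    """Stub for an ultrasonic-sensor reading of how full the bin is."""
    return random.randint(0, 100)

payload = {
    "bin_id": "kindergarten-bin-01",                       # hypothetical id
    "fill_percent": read_fill_level_percent(),
    "lid_open": False,                                     # automatic lid state
    "timestamp": datetime.now(timezone.utc).isoformat(),
}

# In a deployed system this JSON would be sent by the communication
# module to the IoT website; here we simply print it.
print(json.dumps(payload, indent=2))
```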
Pathology reports are necessary for specialists to make an appropriate diagnosis of diseases in general and blood diseases in particular. Therefore, specialists check blood cells and other blood details. Thus, to diagnose a disease, specialists must analyze the factors of the patient's blood and medical history. Generally, doctors have tended to use intelligent agents to help them with complete blood count (CBC) analysis. However, these agents need analytical tools to extract the CBC parameters employed in predicting the development of life-threatening bacteremia and in offering prognostic data. Therefore, this paper proposes an enhancement to the Rabin–Karp algorithm and then mixes it with the fuzzy ratio to make this algorithm suitable
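To make the idea concrete, here is a minimal, hedged sketch (not the paper's enhanced algorithm) of how a Rabin–Karp rolling hash can be combined with a fuzzy similarity ratio to locate approximate occurrences of a CBC parameter name in free-text report lines; the threshold and example strings are illustrative assumptions.

```python
# Minimal sketch: Rabin-Karp rolling hash plus a fuzzy ratio
# (difflib.SequenceMatcher) to flag approximate matches of a CBC
# parameter name in report text. Threshold and strings are illustrative.
from difflib import SequenceMatcher

BASE, MOD = 256, 1_000_000_007

def rabin_karp_candidates(text: str, pattern: str):
    """Yield start indices whose rolling hash equals the pattern hash."""
    n, m = len(text), len(pattern)
    if m == 0 or n < m:
        return
    high = pow(BASE, m - 1, MOD)
    p_hash = t_hash = 0
    for i in range(m):
        p_hash = (p_hash * BASE + ord(pattern[i])) % MOD
        t_hash = (t_hash * BASE + ord(text[i])) % MOD
    for i in range(n - m + 1):
        if p_hash == t_hash:
            yield i
        if i < n - m:  # roll the window forward by one character
            t_hash = ((t_hash - ord(text[i]) * high) * BASE + ord(text[i + m])) % MOD

def fuzzy_find(text: str, pattern: str, threshold: float = 0.8):
    """Exact hash hits plus windows whose fuzzy ratio passes the threshold."""
    m = len(pattern)
    hits = set(rabin_karp_candidates(text.lower(), pattern.lower()))
    for i in range(len(text) - m + 1):
        window = text[i:i + m].lower()
        if i not in hits and SequenceMatcher(None, window, pattern.lower()).ratio() >= threshold:
            hits.add(i)
    return sorted(hits)

print(fuzzy_find("WBC count elevated; haemoglobin low", "hemoglobin"))
```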
Big data of different types, such as texts and images, are rapidly generated from the internet and other applications. Dealing with this data using traditional methods is not practical, since it comes in various sizes and types and with different processing-speed requirements. Therefore, data analytics has become an important tool, because it analyzes and extracts only the meaningful information, which is essential for big data applications. This paper presents several innovative methods that use data analytics techniques to improve the analysis process and data management. Furthermore, this paper discusses how the revolution of data analytics based on artificial intelligence algorithms might provide
Human skin detection, which is usually performed before further image processing, is the method of discovering skin-colored pixels and regions that may belong to human faces or limbs in videos or photos. Many computer vision approaches have been developed for skin detection. A skin detector usually transforms a given pixel into a suitable color space and then uses a skin classifier to mark the pixel as a skin or a non-skin pixel. A skin classifier defines the decision boundary of the skin-color class in that color space based on skin-colored pixels. The purpose of this research is to build a skin detection system that distinguishes between skin and non-skin pixels in colored still pictures. This is performed by introducing a metric that measu
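For illustration, the hedged sketch below shows the classical pipeline described above: transform RGB pixels into the YCbCr color space and mark a pixel as skin when its Cb/Cr values fall inside a commonly cited box. The thresholds are widely used illustrative values, not the decision boundary or metric proposed in this research.

```python
# Minimal rule-based skin classifier sketch: RGB -> YCbCr conversion,
# then a Cb/Cr box test. Thresholds are illustrative assumptions.
import numpy as np

def rgb_to_ycbcr(img: np.ndarray) -> np.ndarray:
    """img: H x W x 3 uint8 RGB array -> H x W x 3 float YCbCr."""
    r, g, b = (img[..., c].astype(float) for c in range(3))
    y  =        0.299    * r + 0.587    * g + 0.114    * b
    cb = 128 -  0.168736 * r - 0.331264 * g + 0.5      * b
    cr = 128 +  0.5      * r - 0.418688 * g - 0.081312 * b
    return np.stack([y, cb, cr], axis=-1)

def skin_mask(img: np.ndarray) -> np.ndarray:
    """Return a boolean mask where True marks pixels classified as skin."""
    ycbcr = rgb_to_ycbcr(img)
    cb, cr = ycbcr[..., 1], ycbcr[..., 2]
    return (cb >= 77) & (cb <= 127) & (cr >= 133) & (cr <= 173)

demo = np.full((2, 2, 3), [210, 150, 120], dtype=np.uint8)  # a skin-like color
print(skin_mask(demo))
```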
Information is an essential and valuable object in all systems. The more information you have about your issue, the better you can conform to the world around you. Moreover, information distinguishes companies and provides the influence that helps one company be more effective than another. So, protecting this information using better security controls and providing a high level of access to authorized parties becomes an urgent need. As a result, many algorithms and encryption techniques have been developed to provide a high level of protection for system information. Therefore, this paper presents an enhancement to the Blowfish algorithm, one of the cryptography techniques. It then proposes an enhancement for increasing efficiency
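For context, the hedged sketch below shows baseline Blowfish encryption and decryption using the PyCryptodome library (assumed to be installed); it illustrates the standard algorithm only, not the enhancement proposed in this paper.

```python
# Baseline Blowfish encrypt/decrypt sketch using PyCryptodome (assumed
# available). Key and message are illustrative; Blowfish uses 8-byte
# blocks and accepts keys of 4 to 56 bytes.
from Crypto.Cipher import Blowfish
from Crypto.Util.Padding import pad, unpad

key = b"an example 16B k"                 # illustrative 16-byte key
plaintext = b"system information to protect"

cipher = Blowfish.new(key, Blowfish.MODE_CBC)
ciphertext = cipher.encrypt(pad(plaintext, Blowfish.block_size))

decipher = Blowfish.new(key, Blowfish.MODE_CBC, iv=cipher.iv)
recovered = unpad(decipher.decrypt(ciphertext), Blowfish.block_size)
assert recovered == plaintext
```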
This work aimed at the design and testing of a computer program based on eyeQ improvement, photographic memory enhancement, and speed reading, in order to match the reading speed of 150–250 words per minute (WPM) with the mind's processing ability and the eye's snapshot capability of 5000 WPM. The package was designed in Visual Basic 6. The efficiency of the designed program was tested on 10 persons of different ages and levels of education, and the results show an increase in their reading speed of approximately 25% in the first month of training, with a noticeable enhancement in memory as well as an increase in the ability to read for a longer time without feeling nervous or bored; a nonlinear, continuous increase in reading speed is assured after the first mo
Data generated from modern applications and the internet in healthcare is extensive and rapidly expanding. Therefore, one of the significant success factors for any application is understanding and extracting meaningful information using digital analytics tools. These tools positively impact the application's performance and help handle the challenges that may be faced in creating highly consistent, logical, and information-rich summaries. This paper has three main objectives. First, it provides several analytics methodologies that help analyze datasets and extract useful information from them, as preprocessing steps for any classification model, in order to determine the dataset's characteristics. Also, this paper provides a comparative st
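As a small, hedged illustration (not the paper's pipeline) of what such preprocessing-stage dataset characterization can look like, the sketch below reports basic statistics, missing values, and class balance with pandas; the toy records and column names are assumptions.

```python
# Illustrative dataset-characterization step before classification:
# shape, missing values, numeric summary, and class balance.
# The toy records and the "label" column name are assumptions.
import numpy as np
import pandas as pd

df = pd.DataFrame({
    "age":     [34, 51, 29, np.nan, 62],
    "glucose": [5.4, 7.1, 4.9, 6.2, np.nan],
    "label":   [0, 1, 0, 1, 1],            # hypothetical target column
})

print("shape:", df.shape)
print("missing values per column:\n", df.isna().sum())
print("numeric summary:\n", df.describe())
print("class balance:\n", df["label"].value_counts(normalize=True))
```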
Color image compression is a good way to encode digital images by decreasing the number of bits needed to represent the image. The main objectives are to reduce storage space, reduce transmission costs, and maintain good quality. In the current research work, a simple and effective methodology is proposed for compressing color art digital images and obtaining a low bit rate by compressing the matrix resulting from the scalar quantization process (reducing the number of bits from 24 to 8 bits) using displacement coding and then compressing the remainder using the Lempel–Ziv–Welch (LZW) algorithm. The proposed methodology maintains the quality of the reconstructed image. Macroscopic and
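For reference, here is a minimal, hedged sketch of the LZW stage mentioned above: it encodes a byte sequence (standing in for the quantized stream) into a list of dictionary codes. The scalar-quantization and displacement-coding stages of the proposed methodology are not reproduced here.

```python
# Minimal LZW encoder sketch: repeated byte patterns are replaced by
# codes from a dictionary that grows as the data is scanned.
def lzw_encode(data: bytes) -> list[int]:
    """Return LZW codes for the input bytes using a growing dictionary."""
    dictionary = {bytes([i]): i for i in range(256)}
    next_code = 256
    w = b""
    codes = []
    for byte in data:
        wc = w + bytes([byte])
        if wc in dictionary:
            w = wc                      # keep extending the current phrase
        else:
            codes.append(dictionary[w]) # emit the longest known phrase
            dictionary[wc] = next_code  # learn the new phrase
            next_code += 1
            w = bytes([byte])
    if w:
        codes.append(dictionary[w])
    return codes

print(lzw_encode(b"ABABABAB"))  # repeated patterns map to fewer codes
```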
Cloud computing is a newly developed concept that aims to provide computing resources in the most effective and economical manner. The fundamental idea of cloud computing is to share computing resources among a user group. Cloud computing security is a collection of control-based techniques and strategies that intends to comply with regulatory compliance rules and protect cloud computing-related information, data, apps, and infrastructure. On the other hand, data integrity is a guarantee that digital data are not corrupted and that only authorized people can access or modify them (i.e., maintain data consistency, accuracy, and confidence). This review presents an overview of cloud computing concepts, its importance in many
The proliferation of many editing programs based on artificial intelligence techniques has contributed to the emergence of deepfake technology. Deepfakes fabricate and falsify facts by making a person appear to do actions or say words that he never did or said, so developing an algorithm for deepfake detection is very important to discriminate real from fake media. Convolutional neural networks (CNNs) are among the most complex classifiers, but choosing the nature of the data fed to these networks is extremely important. For this reason, we capture fine texture details of the input data frames using 16 Gabor filters in different directions and then feed them to a binary CNN classifier instead of using the red-green-blue
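As a hedged sketch of this preprocessing idea (assuming OpenCV is available; kernel parameters are illustrative assumptions, not the paper's settings), the code below builds a bank of 16 Gabor filters at evenly spaced orientations and stacks their responses as the tensor that would be fed to a binary CNN instead of raw color channels.

```python
# Build 16 Gabor filters at different orientations and stack their
# responses for one grayscale frame. Kernel parameters (size, sigma,
# wavelength, aspect ratio, phase) are illustrative assumptions.
import numpy as np
import cv2

def gabor_bank(n_orientations: int = 16, ksize: int = 31):
    thetas = np.linspace(0, np.pi, n_orientations, endpoint=False)
    # getGaborKernel(ksize, sigma, theta, lambd, gamma, psi)
    return [cv2.getGaborKernel((ksize, ksize), 4.0, t, 10.0, 0.5, 0) for t in thetas]

def gabor_features(gray_frame: np.ndarray) -> np.ndarray:
    """Return an H x W x 16 stack of filter responses for one frame."""
    responses = [cv2.filter2D(gray_frame, cv2.CV_32F, k) for k in gabor_bank()]
    return np.stack(responses, axis=-1)

frame = np.random.randint(0, 256, (128, 128), dtype=np.uint8)  # stand-in frame
features = gabor_features(frame)
print(features.shape)  # (128, 128, 16) -- input tensor for the binary CNN
```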
Artificial intelligence techniques are reaching us in several forms, some of which are useful but can be exploited in a way that harms us. One of these forms is called deepfakes. Deepfakes are used to completely modify video (or image) content to display something that was not in it originally. The danger of deepfake technology lies in its impact on society through the loss of confidence in everything that is published. Therefore, in this paper, we focus on deepfake detection technology from the perspective of two concepts: deep learning and forensic tools. The purpose of this survey is to give the reader a deeper overview of i) the environment of deepfake creation and detection, and ii) how deep learning and forensic tools contributed to the detection
Deepfakes have become possible using artificial intelligence techniques, replacing one person’s face with another person’s face (primarily a public figure), making the latter do or say things he would not have done. Therefore, contributing to a solution for video credibility has become a critical goal that we address in this paper. Our work exploits the visible artifacts (blur inconsistencies) that are generated by the manipulation process. We analyze focus quality and its ability to detect these artifacts. The focus measure operators in this paper include the image Laplacian and image gradient groups, which are very fast to compute and do not need a large dataset for training. The results showed that i) the Laplacian
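For illustration, the hedged sketch below computes two fast focus measures of the kind named above: variance of the image Laplacian and mean gradient magnitude. Comparing such scores between face and background regions is one way to expose blur inconsistencies; the test data and any thresholds here are illustrative assumptions, not the paper's evaluation.

```python
# Two simple focus measures: variance of the Laplacian and mean gradient
# magnitude. Sharper regions score higher; a crude blur lowers the score.
import numpy as np
from scipy.ndimage import laplace

def laplacian_focus(gray: np.ndarray) -> float:
    """Variance of the Laplacian: higher means sharper."""
    return float(laplace(gray.astype(float)).var())

def gradient_focus(gray: np.ndarray) -> float:
    """Mean gradient magnitude: higher means sharper."""
    gy, gx = np.gradient(gray.astype(float))
    return float(np.hypot(gx, gy).mean())

sharp = np.random.rand(64, 64)                       # stand-in for a sharp patch
blurred = 0.25 * (sharp + np.roll(sharp, 1, 0) + np.roll(sharp, 1, 1)
                  + np.roll(np.roll(sharp, 1, 0), 1, 1))  # crude box blur
print(laplacian_focus(sharp) > laplacian_focus(blurred))  # expected: True
print(gradient_focus(sharp) > gradient_focus(blurred))    # expected: True
```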
The article emphasizes that the asymptotic stability of a 3D stochastic positive linear system with delays depends on the sum of the system matrices and is, at the same time, independent of the values and numbers of the delays. Moreover, the asymptotic stability test of this system with delays can be reduced to checking its corresponding 2D stochastic positive linear system without delays. Several theorems were applied to prove that asymptotic stability for 3D stochastic positive linear systems with delays is equivalent to that of 2D stochastic positive linear systems without delays. The efficiency of the given methods is illustrated on some numerical examples. HIGHLIGHTS: Various theorems were applied to prove the asymptoti
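As a hedged illustration of the underlying principle (stated here for an ordinary positive discrete-time system rather than the article's 3D/2D stochastic models), stability of a positive system with delays depends only on the sum of its matrices:

```latex
% Illustrative statement for a standard positive discrete-time system
% with delays (not the article's 3D stochastic model): stability depends
% only on the sum of the system matrices, not on the delays themselves.
\[
  x_{i+1} \;=\; \sum_{k=0}^{q} A_k \, x_{i-k},
  \qquad A_k \in \mathbb{R}^{n \times n}_{+},
\]
\[
  \text{the system is asymptotically stable}
  \iff
  \rho\!\Bigl( \textstyle\sum_{k=0}^{q} A_k \Bigr) < 1,
\]
% i.e., the check reduces to the spectral radius of the delay-free sum,
% mirroring the article's reduction from 3D with delays to 2D without delays.
```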
Deaf and dumb people suffer difficulties most of the time in communicating with society. They use sign language to communicate with each other and with normal people. However, normal people find it difficult to understand the sign language and gestures made by deaf and dumb people. Therefore, many techniques have been employed to tackle this problem by converting sign language to text or voice and vice versa. In recent years, research has progressed steadily with regard to the use of computers to recognize and translate sign language. This paper reviews significant projects in the field, beginning with the important steps of sign language translation. These projects can b