Data generated by the internet and modern applications is extensive and rapidly expanding, so modern applications must perform their tasks successfully over this massive volume of data. One of the significant factors in the success of any application is therefore understanding and extracting meaningful information with digital analytics tools, which positively impacts the application's performance and helps it cope with the challenges it may encounter. Cloud computing, in turn, is an environment comprising a collection of high-performance services from various vendors. These services can access and process massive amounts of data far faster than a traditional computer. One of these services is cloud analytics, which appli
The article emphasizes that a 3D stochastic positive linear system with delays is asymptotically stable depending on the sum of the system matrices and, at the same time, independently of the values and number of the delays. Moreover, the asymptotic stability test of this system with delays can be reduced to checking its corresponding 2D stochastic positive linear system without delays. Several theorems are applied to prove that asymptotic stability of 3D stochastic positive linear systems with delays is equivalent to that of 2D stochastic positive linear systems without delays. The efficiency of the given methods is illustrated on numerical examples. HIGHLIGHTS Various theorems were applied to prove the asymptoti
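The delay-independence property described above can be illustrated with a small numerical sketch. For discrete-time positive linear systems, asymptotic stability with arbitrary delays is commonly checked via the spectral radius of the sum of the system matrices; the matrices below are hypothetical examples, not taken from the article.

```python
import numpy as np

# Hypothetical nonnegative system matrices (nonnegativity is required
# for positivity of the system).
A0 = np.array([[0.1, 0.2],
               [0.0, 0.3]])
A1 = np.array([[0.2, 0.1],
               [0.1, 0.2]])

# For positive discrete-time systems with delays, asymptotic stability
# reduces to Schur stability of the delay-free system built from the
# sum of the matrices: spectral radius of (A0 + A1) strictly below 1.
A = A0 + A1
rho = max(abs(np.linalg.eigvals(A)))
print(rho < 1.0)  # stable for any delay values and any number of delays
```

Because the test depends only on the sum `A0 + A1`, the delay values never enter the computation, which mirrors the delay-independence claim in the abstract.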
Information is an essential and valuable asset in all systems. The more information you have about your problem, the better you can adapt to the world around you. Moreover, information distinguishes companies and provides the leverage that makes one company more effective than another. Protecting this information with better security controls, while granting authorized parties a high level of access, therefore becomes an urgent need. As a result, many algorithms and encryption techniques have been developed to provide a high level of protection for system information. This paper presents an enhancement to the Blowfish algorithm, one of the cryptography techniques, and then proposes an enhancement for increasing efficiency
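For context, Blowfish is built on a Feistel network. The sketch below shows only that generic Feistel structure; the round function `F` here is a hypothetical stand-in, whereas real Blowfish derives `F` from key-dependent S-boxes and a P-array, and the paper's specific enhancement is not reproduced.

```python
def feistel_encrypt(left, right, round_keys, F):
    """Generic Feistel encryption: each round swaps halves and XORs
    the round function's output into one half."""
    for k in round_keys:
        left, right = right, left ^ F(right, k)
    return left, right

def feistel_decrypt(left, right, round_keys, F):
    """Inverse: apply round keys in reverse order, undoing each swap/XOR."""
    for k in reversed(round_keys):
        right, left = left, right ^ F(left, k)
    return left, right

# Toy demonstration with a hypothetical round function (NOT Blowfish's F).
F = lambda x, k: ((x * 2654435761) ^ k) & 0xFFFFFFFF
keys = [0x1234, 0x5678, 0x9ABC]
ct = feistel_encrypt(0xDEAD, 0xBEEF, keys, F)
pt = feistel_decrypt(ct[0], ct[1], keys, F)
print(pt == (0xDEAD, 0xBEEF))  # True: decryption inverts encryption
```

A useful property of this structure, which Blowfish inherits, is that decryption reuses the same machinery with reversed round keys, so `F` itself never needs to be invertible.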
The proliferation of editing programs based on artificial intelligence techniques has contributed to the emergence of deepfake technology. Deepfakes fabricate and falsify facts by making a person appear to do actions or say words that he never did or said, so developing a deepfake detection algorithm that discriminates real from fake media is very important. Convolutional neural networks (CNNs) are among the most complex classifiers, but choosing the nature of the data fed to these networks is extremely important. For this reason, we capture the fine texture details of the input data frames using 16 Gabor filters in different directions and then feed them to a binary CNN classifier instead of using the red-green-blue
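A bank of 16 oriented Gabor filters like the one described can be sketched as follows; the kernel size and the sigma/wavelength parameters here are illustrative assumptions, since the abstract does not specify them.

```python
import numpy as np

def gabor_kernel(ksize=9, sigma=2.0, theta=0.0, lambd=4.0, gamma=0.5, psi=0.0):
    """Real-valued Gabor kernel: a Gaussian envelope modulating a cosine
    carrier oriented at angle theta (parameters are illustrative)."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    # Rotate coordinates so the carrier runs along direction theta.
    x_t = x * np.cos(theta) + y * np.sin(theta)
    y_t = -x * np.sin(theta) + y * np.cos(theta)
    gauss = np.exp(-(x_t**2 + (gamma * y_t)**2) / (2 * sigma**2))
    return gauss * np.cos(2 * np.pi * x_t / lambd + psi)

# A bank of 16 orientations spanning 0 to pi, as in the abstract.
bank = [gabor_kernel(theta=np.pi * i / 16) for i in range(16)]
```

Convolving a frame with each kernel in the bank yields 16 orientation-selective texture maps, which could then be stacked as the CNN's input channels in place of the raw RGB planes.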
Data generated from modern applications and the internet in healthcare is extensive and rapidly expanding. Therefore, one of the significant success factors for any application is understanding and extracting meaningful information using digital analytics tools. These tools positively impact the application's performance and handle the challenges that may be faced in creating highly consistent, logical, and information-rich summaries. This paper has three main objectives. First, it presents several analytics methodologies that help analyze datasets and extract useful information from them, as preprocessing steps in any classification model, to determine the dataset's characteristics. It also provides a comparative st
Cloud computing is a recently developed concept that aims to provide computing resources in the most effective and economical manner. Its fundamental idea is to share computing resources among a group of users. Cloud computing security is a collection of control-based techniques and strategies that intends to comply with regulatory rules and to protect cloud-related information, data, applications, and infrastructure. Data integrity, in turn, is a guarantee that digital data are not corrupted and that only authorized people can access or modify them (i.e., it maintains data consistency, accuracy, and confidence). This review presents an overview of cloud computing concepts, its importance in many
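A common building block for the data-integrity guarantee described above is a cryptographic digest: the owner stores a hash of the object, and any later copy is re-hashed and compared. A minimal sketch using SHA-256 (the object contents here are placeholders):

```python
import hashlib

def sha256_digest(data: bytes) -> str:
    """Hex digest used to verify that data has not been altered."""
    return hashlib.sha256(data).hexdigest()

original = b"cloud object contents"
stored_digest = sha256_digest(original)  # kept by the data owner

# Later, re-hash the retrieved copy and compare digests.
print(sha256_digest(b"cloud object contents") == stored_digest)  # True
print(sha256_digest(b"tampered contents") == stored_digest)      # False
```

In practice, cloud integrity-auditing schemes layer signatures or Merkle trees on top of such digests so that verification does not require downloading the whole object, but the hash comparison is the core check.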
This work aimed to design and test a computer program for eyeQ improvement, photographic memory enhancement, and speed reading, to bring a reading speed of 150-250 words per minute (WPM) closer to the mind's processing and eye-snapshot capacity of 5000 WPM. The package was designed in Visual Basic 6. The efficiency of the designed program was tested on 10 persons of different education levels and ages. The results show an increase in their reading speed of approximately 25% in the first month of training, a noticeable enhancement in memory, and an increased ability to read for longer periods without feeling nervous or bored; a nonlinear, continuous increase in reading speed is assured after the first mo
The increase in population and in the number of kindergartens, especially in urban areas, has made it difficult to manage waste properly. Thus, this paper proposes a system dedicated to kindergartens for managing waste disposal, called smart garbage based on the Internet of Things (SGI). To ensure a healthy environment and an intelligent, integrated waste-management system in the kindergarten, supported by the Internet of Things (IoT), we present its design in detail. The SGI system includes a display system, an automatic lid system, and a communication system. The system supplies capabilities to monitor the status of waste continuously, and an IoT website can show the pe
Human skin detection, usually performed before other image processing, is the method of discovering skin-colored pixels and regions that may belong to human faces or limbs in videos or photos. Many computer vision approaches have been developed for skin detection. A skin detector typically transforms a given pixel into a suitable color space and then uses a skin classifier to label the pixel as skin or non-skin. A skin classifier defines the decision boundary of the skin-color class in that color space, based on skin-colored pixels. The purpose of this research is to build a skin detection system that distinguishes between skin and non-skin pixels in colored still pictures. This is performed by introducing a metric that measu
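The color-space-plus-classifier pipeline described above can be sketched with a classic rule-based example: convert RGB to YCbCr and threshold the chrominance channels. The Cb/Cr ranges below are widely cited illustrative values, not the metric proposed in this paper.

```python
import numpy as np

def skin_mask(rgb):
    """Boolean mask of skin-colored pixels via a simple YCbCr rule
    (Cb in [77, 127], Cr in [133, 173]); thresholds are illustrative."""
    r, g, b = [rgb[..., i].astype(float) for i in range(3)]
    # Standard JPEG RGB -> YCbCr chrominance conversion.
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return (cb >= 77) & (cb <= 127) & (cr >= 133) & (cr <= 173)

pixels = np.array([[[224, 172, 138],        # a light skin tone
                    [0, 128, 255]]],        # blue: clearly non-skin
                  dtype=np.uint8)
m = skin_mask(pixels)
print(m)
```

The decision boundary here is an axis-aligned box in the Cb-Cr plane; learned classifiers replace that box with a boundary fitted to labeled skin pixels.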
Image segmentation using bi-level thresholds works well for straightforward scenarios; however, complex images that contain multiple objects or colors present considerable computational difficulties. Multi-level thresholding is crucial in these situations, but it also introduces a challenging optimization problem. This paper presents an improved Reptile Search Algorithm (RSA) that includes a Gbest operator to enhance its performance. The proposed method determines optimal threshold values for both grayscale and color images, using entropy-based objective functions derived from the Otsu and Kapur techniques. Experiments were carried out on 16 benchmark images, which inclu
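The Otsu-style objective that such metaheuristics maximize can be sketched directly: given a histogram and a candidate set of thresholds, score the thresholds by the between-class variance. This is a generic sketch of the objective, not the paper's RSA optimizer.

```python
import numpy as np

def between_class_variance(hist, thresholds):
    """Otsu objective for multi-level thresholding: weighted variance of
    class means around the global mean (higher is better)."""
    p = hist / hist.sum()
    levels = np.arange(len(hist))
    mu_total = (p * levels).sum()
    bounds = [0, *sorted(thresholds), len(hist)]
    var = 0.0
    for lo, hi in zip(bounds[:-1], bounds[1:]):
        w = p[lo:hi].sum()
        if w > 0:
            mu = (p[lo:hi] * levels[lo:hi]).sum() / w
            var += w * (mu - mu_total) ** 2
    return var

# Toy bimodal histogram over 8 gray levels: exhaustive search for one
# threshold (feasible here; RSA replaces this search for many thresholds).
hist = np.array([5, 5, 0, 0, 0, 0, 5, 5], dtype=float)
best = max(range(1, 8), key=lambda t: between_class_variance(hist, [t]))
print(best)  # any cut between the two modes maximizes the objective
```

With k thresholds the search space grows combinatorially, which is exactly why population-based optimizers such as the improved RSA are brought in.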
The main reason for the emergence of the term deepfake (deep learning and fake) is the evolution of artificial intelligence techniques, especially deep learning. Deep learning algorithms, which solve problems automatically when given large sets of data, are used to swap faces in digital media and create fake media with a realistic appearance. To increase the accuracy of distinguishing a real video from a fake one, a new model has been developed based on deep learning and noise residuals. Using Steganalysis Rich Model (SRM) filters, we gather a low-level noise map that is used as input to a light convolutional neural network (CNN) to classify a real face from a fake one. The results of our work show that the training accuracy of the CNN model
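The noise-residual idea can be sketched with one representative SRM-style high-pass kernel; the paper's exact filter set is not specified here, so the second-order kernel below is an illustrative choice from the SRM family.

```python
import numpy as np

# A standard second-order SRM-style high-pass kernel: it cancels smooth
# image content and keeps the fine noise residual.
KERNEL = np.array([[-1,  2, -1],
                   [ 2, -4,  2],
                   [-1,  2, -1]], dtype=float) / 4.0

def noise_residual(gray):
    """Valid-mode 2D filtering producing the low-level noise map that
    would be fed to the CNN."""
    h, w = gray.shape
    out = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            out[i, j] = (gray[i:i + 3, j:j + 3] * KERNEL).sum()
    return out

# Smooth content (a linear ramp) is suppressed to zero; an isolated
# spike, i.e. noise, survives.
ramp = np.tile(np.arange(6.0), (6, 1))
print(np.allclose(noise_residual(ramp), 0.0))  # True
```

Because face-swapping pipelines disturb the camera's natural noise pattern, these residuals tend to differ between real and synthesized regions, which is what the CNN learns to separate.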
Blockchain technology relies on cryptographic techniques that provide various advantages, such as trustworthiness, collaboration, organization, identification, integrity, and transparency. Meanwhile, data analytics refers to the process of using techniques to analyze big data and to understand the relationships between data points in order to draw meaningful conclusions. The field of data analytics in Blockchain is relatively new, and few studies have examined the challenges involved in Blockchain data analytics. This article presents a systematic analysis of how data analytics affects Blockchain performance, with the aim of investigating the current state of Blockchain-based data analytics techniques in research fields and
Deepfakes have become possible using artificial intelligence techniques that replace one person's face with another person's face (primarily a public figure's), making the latter appear to do or say things he never did. Contributing a solution for video credibility has therefore become a critical goal, which we address in this paper. Our work exploits the visible artifacts (blur inconsistencies) generated by the manipulation process. We analyze focus quality and its ability to detect these artifacts. The focus measure operators in this paper include the image Laplacian and image gradient groups, which are very fast to compute and do not need a large dataset for training. The results showed that i) the Laplacian
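A common Laplacian-group focus measure is the variance of the Laplacian response: blurred regions yield a low score. The sketch below illustrates that idea with the standard 4-neighbor Laplacian kernel; it is not the paper's exact operator set.

```python
import numpy as np

LAPLACIAN = np.array([[0,  1, 0],
                      [1, -4, 1],
                      [0,  1, 0]], dtype=float)

def focus_measure(gray):
    """Variance of the Laplacian response over a grayscale patch.
    Low values suggest blur, which face-blending tends to introduce
    around manipulated regions."""
    h, w = gray.shape
    resp = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            resp[i, j] = (gray[i:i + 3, j:j + 3] * LAPLACIAN).sum()
    return resp.var()

# A flat (fully blurred) patch scores 0; a high-contrast checkerboard
# scores much higher.
flat = np.full((6, 6), 5.0)
checker = (np.indices((6, 6)).sum(axis=0) % 2).astype(float)
print(focus_measure(flat), focus_measure(checker))
```

Comparing such scores between the face region and its surroundings is one way to surface the blur inconsistencies the paper exploits, with no training data required.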
Color image compression is a good way to encode digital images by decreasing the number of bits needed to store the image. The main objectives are to reduce storage space, reduce transmission costs, and maintain good quality. In the current work, a simple, effective methodology is proposed for compressing color art digital images and obtaining a low bit rate: the matrix resulting from the scalar quantization process (which reduces the number of bits from 24 to 8) is compressed using displacement coding, and the remainder is then compressed using the Lempel-Ziv-Welch (LZW) algorithm. The proposed methodology maintains the quality of the reconstructed image. Macroscopic and
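The LZW stage mentioned above can be sketched with the textbook encoder: the dictionary starts with all single bytes and grows with each new phrase, so repetitive quantized data compresses well. This is the generic algorithm, not the paper's tuned variant.

```python
def lzw_compress(data: bytes) -> list:
    """Textbook LZW: emit the dictionary code of the longest known prefix,
    then extend the dictionary with that prefix plus one byte."""
    table = {bytes([i]): i for i in range(256)}  # all single bytes
    next_code = 256
    current = b""
    out = []
    for byte in data:
        candidate = current + bytes([byte])
        if candidate in table:
            current = candidate          # keep growing the match
        else:
            out.append(table[current])   # emit code for longest match
            table[candidate] = next_code # learn the new phrase
            next_code += 1
            current = bytes([byte])
    if current:
        out.append(table[current])
    return out

print(lzw_compress(b"ABABABA"))  # [65, 66, 256, 258]
```

Repeated patterns collapse into single codes (`256` stands for `AB`, `258` for `ABA` above), which is why LZW pairs well with a quantization stage that makes pixel runs more repetitive.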
Deaf and mute people often have difficulty communicating with society. They use sign language to communicate with each other and with hearing people, but hearing people find it difficult to understand the sign language and gestures made by deaf people. Therefore, many techniques have been employed to tackle this problem by converting sign language to text or voice and vice versa. In recent years, research has progressed steadily on using computers to recognize and translate sign language. This paper reviews significant projects in the field, beginning with the important steps of sign language translation. These projects can b