In today's digital realm, images constitute a massive share of social media content yet suffer from two issues, storage size and transmission cost, for which compression is the ideal solution. Pixel-based techniques are modern spatially optimized modeling techniques built on deterministic and probabilistic components, namely the mean, the index, and the residual. This paper introduces an adaptive pixel-based coding technique for the probabilistic part of a lossy scheme by incorporating the MMSA of the C321 base, while the deterministic part is coded losslessly. The test results achieved higher size-reduction performance than traditional pixel-based techniques and standard JPEG by about 40% and 50%, respectively, with pleasing quality exceeding 45 dB.
This article deals with an approximate algorithm for the two-dimensional multi-space fractional bioheat equation (M-SFBHE). The collocation method is extended to present a numerical technique for solving the M-SFBHE based on shifted Jacobi-Gauss-Lobatto polynomials (SJ-GL-Ps) in matrix form. The Caputo formula is utilized to approximate the fractional derivative, and to demonstrate the method's usefulness and accuracy, the proposed methodology is applied to two examples. The numerical results reveal that the approach is very effective and gives high accuracy and good convergence.
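For reference, the Caputo fractional derivative that the abstract refers to is commonly defined as follows (a standard textbook definition, not a formula taken from the paper itself):

$$
{}^{C}D^{\alpha}_{t} f(t) \;=\; \frac{1}{\Gamma(m-\alpha)} \int_{0}^{t} (t-\tau)^{\,m-\alpha-1}\, f^{(m)}(\tau)\, d\tau, \qquad m-1 < \alpha \le m,\; m \in \mathbb{N},
$$

which collocation schemes of this kind typically approximate by expanding the unknown solution in the chosen polynomial basis and differentiating the expansion term by term.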
The existence of the Internet, networking, and cloud computing supports a wide range of new technologies. Blockchain is one of these technologies, and it has increased the interest of researchers concerned with providing a safe environment for the circulation of important information via the Internet. Maintaining the solidity and integrity of a blockchain's transactions is an important issue that must always be borne in mind. Transactions in a blockchain are based on the use of public and private keys in asymmetric cryptography. This work proposes using users' DNA as a supporting technology for storing and recovering their keys in case those keys are lost, as an effective bio-cryptographic recovery method. The RSA private key is …
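The abstract is cut off before the key-recovery details, so the following is only a minimal, hypothetical sketch of the generic idea of storing key material as a DNA sequence (two bits per nucleotide) and recovering it later; the mapping and function names are illustrative assumptions, not the paper's actual scheme.

```python
# Hypothetical sketch: encode key bytes as a nucleotide string and decode them back.
# This illustrates generic DNA-based key storage only, not the paper's method.
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def key_to_dna(key_bytes: bytes) -> str:
    """Encode raw key bytes as a DNA string, two bits per nucleotide."""
    bits = "".join(f"{byte:08b}" for byte in key_bytes)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def dna_to_key(dna: str) -> bytes:
    """Recover the original key bytes from the DNA string."""
    bits = "".join(BASE_TO_BITS[base] for base in dna)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

if __name__ == "__main__":
    secret = b"example private key bytes"   # placeholder, not a real RSA key
    encoded = key_to_dna(secret)
    assert dna_to_key(encoded) == secret
    print(encoded[:32], "...")
```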
This article proposes a new strategy based on a hybrid method that combines the gravitational search algorithm (GSA) with the bat algorithm (BAT) to solve single-objective optimization problems. It first runs GSA and then runs BAT as a second step. The proposed approach relies on a parameter between 0 and 1 to address the problem of becoming trapped in local optima: because the lack of a local-search mechanism increases search intensity while diversity remains high, the search easily falls into a local optimum. The improvement preserves the speed of the original BAT, increases the speed of reaching the best solution, and updates all solutions in the population before the proposed algorithm terminates. The diversification f…
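As a rough illustration of the two-stage structure described above, here is a minimal sketch in which a simplified GSA explores first and its final population then seeds a simplified BAT for refinement; the objective function, parameter names (p, n_iter), and update rules are illustrative assumptions rather than the paper's exact algorithm.

```python
# Minimal two-stage GSA -> BAT hybrid sketch on a toy sphere objective (assumed).
import numpy as np

def sphere(x):
    return float(np.sum(x ** 2))

def gsa_stage(pop, fit, n_iter=50, g0=100.0):
    """Simplified gravitational search: agents attract each other with
    fitness-derived masses (stage 1, global exploration)."""
    n, d = pop.shape
    vel = np.zeros_like(pop)
    for t in range(n_iter):
        g = g0 * np.exp(-20.0 * t / n_iter)              # decaying gravitational constant
        worst, best = fit.max(), fit.min()
        mass = (worst - fit + 1e-12) / (worst - best + 1e-12)
        mass /= mass.sum()
        acc = np.zeros_like(pop)
        for i in range(n):
            diff = pop - pop[i]
            dist = np.linalg.norm(diff, axis=1, keepdims=True) + 1e-12
            acc[i] = np.sum(np.random.rand(n, 1) * g * mass[:, None] * diff / dist, axis=0)
        vel = np.random.rand(n, d) * vel + acc
        pop = pop + vel
        fit = np.array([sphere(x) for x in pop])
    return pop, fit

def bat_stage(pop, fit, n_iter=50, p=0.5):
    """Simplified bat algorithm seeded with the GSA population; p in [0, 1]
    controls how often a local walk around the best bat is taken (stage 2)."""
    n, d = pop.shape
    vel = np.zeros_like(pop)
    best = pop[fit.argmin()].copy()
    for _ in range(n_iter):
        freq = np.random.rand(n, 1)                       # random pulse frequencies
        vel = vel + (pop - best) * freq
        cand = pop + vel
        local = np.random.rand(n) < p                     # local search around the best
        cand[local] = best + 0.01 * np.random.randn(local.sum(), d)
        cand_fit = np.array([sphere(x) for x in cand])
        improved = cand_fit < fit
        pop[improved], fit[improved] = cand[improved], cand_fit[improved]
        best = pop[fit.argmin()].copy()
    return best, fit.min()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pop = rng.uniform(-5, 5, size=(30, 10))
    fit = np.array([sphere(x) for x in pop])
    pop, fit = gsa_stage(pop, fit)                        # stage 1: GSA exploration
    best, best_fit = bat_stage(pop, fit)                  # stage 2: BAT refinement
    print("best fitness:", best_fit)
```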
Photonic crystal fiber interferometers are widely used for sensing applications. In this work, a solid-core photonic crystal fiber (PCF) based Mach-Zehnder modal interferometer for refractive-index sensing is presented. The sensor is built by splicing a short length of PCF on both sides to conventional single-mode fiber (SMF-28). To apply the modal-interferometer principle, a collapsing technique based on fusion splicing is used to excite higher-order modes (LP01 and LP11). A laser diode (1550 nm) is used as the pump light source, and a highly sensitive optical spectrum analyzer (OSA) is used to monitor and record the transmitted spectrum. The experimental work shows that the interference spectrum of the photonic crystal fiber interferometer …
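For context, the transmission of such a two-mode (LP01/LP11) Mach-Zehnder modal interferometer is usually modeled by the standard two-beam interference relation (a textbook expression, not taken from the paper):

$$
I(\lambda) \;=\; I_{1} + I_{2} + 2\sqrt{I_{1} I_{2}}\,\cos\!\left(\frac{2\pi\,\Delta n_{\mathrm{eff}}\, L}{\lambda}\right),
$$

where $\Delta n_{\mathrm{eff}}$ is the effective-index difference between the interfering modes and $L$ is the PCF length, so a change in the surrounding refractive index shifts the interference fringes recorded on the OSA.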
So far, advanced persistent threats (APTs) have been a constant concern for information security. Many approaches have been used to detect APT attacks, such as change control, sandboxing, and network traffic analysis; nevertheless, 100% success could not be achieved. Current studies have illustrated that APTs adopt many complex techniques to evade all types of detection. This paper describes and analyzes the APT problem by examining the most common techniques, tools, and pathways used by attackers. In addition, it highlights the weaknesses and strengths of the existing security solutions that have been applied since the threat was identified in 2006 until 2019. Furthermore, this research proposes a new framework that can be u…
The conventional procedures of clustering algorithms are incapable of managing and analyzing the rapid growth of data generated from different sources. Parallel clustering is one of the robust solutions to this problem. The Apache Hadoop architecture is one of the ecosystems that provide the capability to store and process data in a distributed and parallel fashion. In this paper, a parallel model is designed to run the k-means clustering algorithm in the Apache Hadoop ecosystem by connecting three nodes: one server (name) node and two client (data) nodes. The aim is to speed up the time of managing the massive sc…
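To make the parallelization idea concrete, here is a minimal single-process sketch of the MapReduce pattern behind parallel k-means: the map step assigns each point to its nearest centroid and the reduce step averages the points per centroid; on Hadoop the map step would be split across the data nodes. The dataset, cluster count, and helper names are illustrative assumptions, not the paper's configuration.

```python
# Single-process sketch of MapReduce-style k-means (map = assign, reduce = re-center).
import numpy as np

def map_phase(points, centroids):
    """Assign each point to the index of its nearest centroid (the map key)."""
    dists = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
    return dists.argmin(axis=1)

def reduce_phase(points, labels, k):
    """For each cluster key, average its points to produce the new centroid."""
    return np.array([points[labels == j].mean(axis=0) if np.any(labels == j)
                     else points[np.random.randint(len(points))]
                     for j in range(k)])

def kmeans(points, k=3, n_iter=20):
    centroids = points[np.random.choice(len(points), k, replace=False)]
    for _ in range(n_iter):
        labels = map_phase(points, centroids)     # parallelizable across data nodes
        centroids = reduce_phase(points, labels, k)
    return centroids, labels

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    data = np.vstack([rng.normal(c, 0.3, size=(100, 2)) for c in (-2, 0, 2)])
    centroids, labels = kmeans(data)
    print(centroids)
```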