In data transmission, a change in a single bit of the received data may lead to misunderstanding or even to disaster. Every bit in the transmitted information matters, especially in fields such as the receiver's address. Detecting every single-bit change is therefore a key issue in the data transmission field.
The ordinary single-parity detection method can detect an odd number of errors efficiently but fails with an even number of errors. Other detection methods, such as two-dimensional parity and checksum, showed better results but still failed to cope with an increasing number of errors.
Two novel methods were suggested to detect binary bit-change errors when transmitting data over a noisy medium: the 2D-Checksum method and the Modified 2D-Checksum method. In the 2D-Checksum method, a summing process was performed on 7×7 patterns in the row direction and then in the column direction to produce 8×8 patterns, while in the modified method an additional diagonal parity vector was added to the pattern to make it 8×9. By combining the benefits of single parity (detecting an odd number of error bits) with the benefits of checksum (reducing the effect of 4-bit errors) in a 2D arrangement, the detection process was improved. When samples of data were contaminated with up to 33% noise (0 changed to 1 and vice versa), the detection rate of the first method improved by approximately 50% compared with the traditional two-dimensional parity method, and the second novel method gave the best detection results.
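As an illustration of the block layout described above, the following is a minimal Python sketch of extending a 7×7 bit pattern to 8×8 with row and column checks, and to 8×9 with a diagonal parity vector; the exact check operation (modulo-2 parity is used below) and the diagonal layout are assumptions, not taken from the paper.

import numpy as np

def extend_block_2d_checksum(block):
    """Append one check bit per row and per column of a 7x7 bit pattern,
    giving an 8x8 pattern.  Modulo-2 sums (parity) are used here; the
    paper's exact summing operation is assumed."""
    assert block.shape == (7, 7)
    row_checks = block.sum(axis=1) % 2                  # check bit for each row
    with_rows = np.column_stack([block, row_checks])    # 7x8
    col_checks = with_rows.sum(axis=0) % 2              # check bit for each column
    return np.vstack([with_rows, col_checks])           # 8x8

def add_diagonal_parity(block8):
    """Append a diagonal parity vector (one bit per wrapped diagonal of the
    8x8 pattern), giving the 8x9 pattern of the modified method."""
    n = block8.shape[0]
    diags = np.array([[block8[i, (i + k) % n] for i in range(n)]
                      for k in range(n)])
    return np.column_stack([block8, diags.sum(axis=1) % 2])   # 8x9

rng = np.random.default_rng(0)
data = rng.integers(0, 2, size=(7, 7))
protected = add_diagonal_parity(extend_block_2d_checksum(data))
print(protected.shape)   # (8, 9)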
Cryptography algorithms play a critical role in information technology against the various attacks witnessed in the digital era. Many studies and algorithms have been devoted to addressing the security requirements of information systems. Traditional cryptography algorithms are characterized by the high complexity of their computational operations. Lightweight algorithms, on the other hand, are the way to solve most of the security issues that arise when applying traditional cryptography in constrained devices. However, symmetric ciphers are widely applied for ensuring the security of data communication in constrained devices. In this study, we propose a hybrid algorithm based on two cryptography algorithms, PRESENT and Salsa20. Also, a 2D logistic map of a chaotic system is a ...
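The abstract is cut off before it explains how the chaotic map is used, so the following is only a minimal sketch of one coupled 2D logistic map form found in the image-encryption literature; the map equations, the parameter r, the seeds and the byte-extraction step are illustrative assumptions, not taken from the paper.

def logistic_2d_keystream(x0=0.4, y0=0.6, r=1.18, n=8):
    """Iterate a coupled 2D logistic map (assumed form) and derive n
    keystream bytes from the trajectory; chaos-based designs typically
    mix such bytes with the cipher's key or state."""
    x, y, out = x0, y0, []
    for _ in range(n):
        x = r * (3 * y + 1) * x * (1 - x)
        y = r * (3 * x + 1) * y * (1 - y)
        out.append(int((x + y) * 1e6) % 256)   # crude byte extraction
    return bytes(out)

print(logistic_2d_keystream().hex())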
Recently, a large number of extensive studies have amassed that describe the removal of dyes from water and wastewater using natural adsorbents and modified materials. Methyl orange dye is found in wastewater streams from various industries, including textiles, plastics, printing and paper, among other sources. This article reviews methyl orange adsorption onto natural and modified materials. Despite the many techniques available, adsorption stands out for efficient water and wastewater treatment because of its ease of operation, flexibility and large-scale removal of colorants. It also has significant potential for regeneration, recovery and recycling of adsorbents in comparison with other water treatment methods. The adsorbents described herein were ...
This paper deals with the mathematical method for deriving the Exponential Rayleigh distribution, based on mixing the cumulative distribution function of the Exponential distribution with the cumulative distribution function of the Rayleigh distribution through an application (the maximum), and different statistical properties of the resulting distribution are derived. It also presents the structure of a new distribution based on a modified weighted version of Azzalini's (1985) approach, named the Modified Weighted Exponential Rayleigh distribution, such that this new distribution is a generalization of the former; some special models of the distribution are provided and, again, different statistical properties are derived.
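As a minimal sketch of the construction described above, assuming the "maximum" application means X = max(X1, X2) with independent X1 ~ Exponential(λ) and X2 ~ Rayleigh(σ) (a parameterization not stated in this excerpt), the cumulative distribution function of the new distribution is simply the product of the two CDFs:

F_{ER}(x) = F_{E}(x)\,F_{R}(x) = \left(1 - e^{-\lambda x}\right)\left(1 - e^{-x^{2}/(2\sigma^{2})}\right), \qquad x > 0.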
In the current study, the synthesis and characterization of silver nanoparticles (AgNPs), before and after functionalization with the antibiotic ampicillin, and their application as anti-pathogenic agents against bacteria were investigated. AgNPs were synthesized by a green method from AgNO3 solution with glucose subjected to microwave radiation. Characterization of the nanoparticles was conducted using UV-Vis spectroscopy, scanning electron microscopy (SEM), zeta potential determination and Fourier transform infrared (FTIR) spectroscopy. From the SEM analysis, the typical silver nanoparticle size was found to be 30 nm, and zeta potential measurements gave information about particle stability. Analysis of the FTIR patterns and UV-Vis spectroscopy con...
In this research, a comparison has been made between the robust M-estimators for the cubic smoothing splines technique, used to avoid the problem of non-normality or contamination in the errors, and the traditional estimation method for cubic smoothing splines, using two comparison criteria (MADE and WASE) for different sample sizes and disparity levels, in order to estimate the time-varying coefficient functions for balanced longitudinal data. Such data are characterized by observations obtained from (n) independent subjects, each of which is measured repeatedly at a group of specific time points (m), since the repeated measurements within subjects are almost connected an...
This study aims at defining the concept of the fragile state, a term that came into existence in 2014 for states with internal problems and external interventions that had previously been referred to as failed states. The indicators for this designation and the criteria adopted comprise 12 indicators that address all aspects of the state's duties vis-a-vis its citizens. The study examined the reasons that led to Iraq remaining among the fragile states, and the five years within the time limits of the study were selected because of the factors that led to the decline of Iraq and its falling back among the most fragile countries. The study dealt with the fragile state's challenges to the media reality as a result of the change of it...
In this paper, we investigate the robustness of the Modified Divergence Information Criterion (MDIC), proposed by Mantalos, Mattheou and Karagrigoriou (2008), by determining the probability of the criterion picking up the true lag of an autoregressive process when the error term of the process is normally and non-normally distributed. We obtained results for different sample sizes by using simulation.
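A minimal sketch of such a simulation study is given below, assuming the usual Monte Carlo design (simulate an AR(p) series with a known true lag, fit candidate orders, count how often the criterion selects the true order); AIC is used only as a stand-in penalty, since the exact MDIC formula of Mantalos et al. (2008) is not reproduced in this excerpt.

import numpy as np

def simulate_ar(coeffs, n, rng, errors="normal"):
    """Simulate an AR(p) series; errors are standard normal or heavy-tailed
    (Student-t with 3 df) to mimic the normal / non-normal settings."""
    p = len(coeffs)
    e = rng.standard_normal(n + p) if errors == "normal" else rng.standard_t(3, n + p)
    x = np.zeros(n + p)
    for t in range(p, n + p):
        x[t] = np.dot(coeffs, x[t - p:t][::-1]) + e[t]
    return x[p:]

def ar_criterion(x, p):
    """Fit AR(p) by least squares and return an information criterion.
    AIC is used here as a stand-in; the MDIC replaces the likelihood
    term with a divergence-based measure."""
    n = len(x)
    Y = x[p:]
    X = np.column_stack([x[p - k:n - k] for k in range(1, p + 1)])
    beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
    resid = Y - X @ beta
    return len(Y) * np.log(resid.var()) + 2 * p

def prob_true_lag(true_coeffs, n, reps=500, max_lag=6, errors="normal", seed=0):
    """Estimate the probability that the criterion picks the true AR order."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(reps):
        x = simulate_ar(true_coeffs, n, rng, errors)
        best = min(range(1, max_lag + 1), key=lambda p: ar_criterion(x, p))
        hits += (best == len(true_coeffs))
    return hits / reps

print(prob_true_lag([0.6, -0.3], n=200, errors="normal"))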
Asphalt cement is produced as a by-product of the oil industry; the asphalt must undergo further processing to control the percentages of its different ingredients so that it becomes suitable for the paving process. The objective of this work is to prepare different types of modified asphalt cement using locally available additives, to subject the prepared modified asphalt cement to the testing procedures usually adopted for asphalt cement, and to compare the test results with the specification requirements for modified asphalt cement so as to fulfill the paving process requirements. An attempt was made to prepare modified asphalt cement for pavement construction in the laboratory by digesting each of the two penetration-grade asphalt c...
In this paper, an algorithm is presented for the reconstruction of completely lost blocks using a Modified Hybrid Transform. The algorithms examined in this paper do not require a DC estimation method or interpolation; the reconstruction is achieved using matrix manipulation based on the Modified Hybrid Transform. A smart matrix (Detection Matrix) is also adopted to detect the missing blocks so that they can be rebuilt. We further assess the performance of the Modified Hybrid Transform in the lost-block reconstruction application, and discuss the effect of using the multiwavelet and 3D Radon transforms in lost-block reconstruction.
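The Modified Hybrid Transform itself is not specified in this excerpt, so the sketch below illustrates only the detection step, under the assumption that a completely lost block appears as an 8×8 block filled with a single marker value; the resulting binary Detection Matrix flags which blocks the reconstruction stage would need to rebuild.

import numpy as np

def detection_matrix(image, block=8, lost_value=0):
    """Return a binary matrix with one entry per block: 1 if the block
    appears to be lost (every pixel equals lost_value), else 0.  Only the
    detection step is sketched; reconstruction would use these flags to
    select the blocks to rebuild."""
    h, w = image.shape
    rows, cols = h // block, w // block
    det = np.zeros((rows, cols), dtype=np.uint8)
    for r in range(rows):
        for c in range(cols):
            tile = image[r * block:(r + 1) * block, c * block:(c + 1) * block]
            det[r, c] = int(np.all(tile == lost_value))
    return det

img = np.full((64, 64), 128, dtype=np.uint8)
img[8:16, 24:32] = 0                 # simulate one completely lost block
print(detection_matrix(img))         # 1 at position (1, 3), 0 elsewhere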