In data transmission, a change in a single bit of the received data may lead to misunderstanding or even a disaster. Every bit in the transmitted information has high priority, especially information such as the address of the receiver. Detecting errors caused by every single bit change is therefore a key issue in the data transmission field.
The ordinary single-parity detection method can detect an odd number of errors efficiently but fails with an even number of errors. Other detection methods, such as two-dimensional parity and checksum, showed better results but still failed to cope with increasing numbers of errors.
Two novel methods were suggested to detect binary bit-change errors when transmitting data over a noisy medium: the 2D-Checksum method and the Modified 2D-Checksum method. In the 2D-Checksum method, a summing process was applied to 7×7 patterns in the row direction and then in the column direction, producing 8×8 patterns, while in the modified method an additional diagonal parity vector was added, extending the pattern to 8×9. By combining the benefits of single parity (detecting an odd number of error bits) with the benefits of checksum (reducing the effect of 4-bit errors) in a 2D arrangement, the detection process was improved. When any sample of data was contaminated with up to 33% noise (flipping 0 to 1 and vice versa), the detection rate of the first method improved by approximately 50% compared to the traditional two-dimensional parity method, and the second novel method gave the best detection results.
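Since the abstract does not spell out the exact arithmetic (for example, whether the row/column sums are taken modulo 2 or kept as multi-bit checksums, or how the diagonal vector is formed), the following Python sketch assumes bit blocks with modulo-2 sums and a wrap-around diagonal parity for the modified variant; it is an illustration of the construction, not the paper's exact method.

```python
import numpy as np

def checksum_2d(block7x7, diagonal=False):
    """Append a checksum column (row sums) and a checksum row (column sums)
    to a 7x7 bit block, giving 8x8; the modified variant also appends a
    wrap-around diagonal parity vector, giving 8x9.  Modulo-2 sums are an
    assumption -- the abstract does not state the exact summing rule."""
    b = np.asarray(block7x7, dtype=int) % 2
    row_sums = b.sum(axis=1, keepdims=True) % 2            # 7x1 checksum column
    with_col = np.hstack([b, row_sums])                    # 7x8
    col_sums = with_col.sum(axis=0, keepdims=True) % 2     # 1x8 checksum row
    block8x8 = np.vstack([with_col, col_sums])             # 8x8
    if not diagonal:
        return block8x8
    diag = np.array([[np.trace(np.roll(block8x8, -k, axis=1)) % 2]
                     for k in range(8)])                   # 8x1 diagonal parity
    return np.hstack([block8x8, diag])                     # 8x9

def detect(received):
    """Recompute the redundancy from the 7x7 data part; any mismatch means
    at least one bit changed in transit."""
    received = np.asarray(received)
    rebuilt = checksum_2d(received[:7, :7], diagonal=received.shape[1] == 9)
    return not np.array_equal(rebuilt, received)
```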
In this research, we study the inverse Gompertz distribution (IG) and estimate its survival function. The survival function was estimated using three methods (maximum likelihood, least squares, and percentile estimators), and the best estimation method was chosen. It was found that the best method for estimating the survival function is the least-squares method, because it has the lowest IMSE for all sample sizes.
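For reference, one common parameterization of the inverse Gompertz distribution (the abstract does not state which parameterization or symbols are used, so the form below is an assumption) has distribution and survival functions

\[
F(x;\theta,\beta)=\exp\!\left(-\frac{\theta}{\beta}\left(e^{\beta/x}-1\right)\right),
\qquad
S(x;\theta,\beta)=1-F(x;\theta,\beta),
\qquad x>0,\ \theta,\beta>0,
\]

and the comparison criterion is the integrated mean squared error over the simulation replications,

\[
\mathrm{IMSE}(\hat S)=\frac{1}{R}\sum_{r=1}^{R}\frac{1}{n_x}\sum_{j=1}^{n_x}\bigl(\hat S_r(x_j)-S(x_j)\bigr)^2 .
\]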
Generally, statistical methods are used in various fields of science, especially in research, where statistical analysis is carried out using several techniques according to the nature of the study and its objectives. One of these techniques is building statistical models, which is done through regression models. This technique is considered one of the most important statistical methods for studying the relationship between a dependent variable, also called the response variable, and the other variables, called covariates. This research describes the estimation of the partial linear regression model, as well as the estimation of values that are missing at random (MAR). Regarding the
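The abstract does not write the model out, but the standard partially linear specification it refers to has the form (notation assumed here)

\[
y_i = x_i^{\top}\beta + g(t_i) + \varepsilon_i , \qquad i=1,\dots,n,
\]

where \(\beta\) is the parametric (linear) part, \(g(\cdot)\) is an unknown smooth function estimated nonparametrically, and some of the responses \(y_i\) may be missing at random (MAR).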
Merging biometrics with cryptography has become more familiar, and a great scientific field was born for researchers. Biometrics adds a distinctive property to security systems, because biometric features are unique and individual for every person. In this study, a new method is presented for ciphering data based on fingerprint features. This is done by addressing the plaintext message, based on the positions of minutiae extracted from a fingerprint, into a generated random text file, regardless of the size of the data. The proposed method can be explained in three scenarios. In the first scenario, the message was placed inside the random text directly at the positions of the minutiae; in the second scenario, the message was encrypted with a chosen word before ciphering
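The abstract does not give the exact embedding rule, so the Python sketch below only illustrates the first scenario under the assumption that each extracted minutia contributes an (x, y) pixel position that is mapped to an offset inside the generated random text; the offset formula and cover-text size are hypothetical.

```python
import secrets
import string

def embed_message(message, minutiae, text_length=10_000):
    """First-scenario sketch: place each plaintext character at an offset
    derived from one extracted minutia (x, y) position.  The offset rule
    (x * 100 + y) is an assumption -- the paper does not specify it."""
    cover = list(''.join(secrets.choice(string.ascii_letters)
                         for _ in range(text_length)))
    positions = []
    for ch, (x, y) in zip(message, minutiae):
        pos = (x * 100 + y) % text_length
        cover[pos] = ch
        positions.append(pos)
    return ''.join(cover), positions

def extract_message(cover, positions):
    """Recover the message by reading the same positions in order."""
    return ''.join(cover[p] for p in positions)
```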
Binary relations or interactions among bio-entities, such as proteins, make up the essential part of any living biological system. Protein-protein interactions are usually structured in a graph data structure called a protein-protein interaction network (PPIN). Analysing PPINs into complexes tries to lay out the significant knowledge needed to answer many unresolved questions, including how cells are organized and how proteins work. However, complex detection problems fall into the category of non-deterministic polynomial-time hard (NP-hard) problems due to their computational complexity. To accommodate such combinatorial explosions, evolutionary algorithms (EAs) are proven effective alternatives to heuristics in solving
The aim of this paper is to derive a posteriori error estimates for semilinear parabolic interface problems. More specifically, an optimal-order a posteriori error analysis in the - norm for semidiscrete semilinear parabolic interface problems is derived using the elliptic reconstruction technique introduced by Makridakis and Nochetto (2003). A key idea of this technique is the use of error estimators derived for elliptic interface problems to obtain parabolic estimators that are of optimal order in space and time.
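As a schematic reminder (our notation, not taken from the paper), the elliptic reconstruction \(Ru_h(t)\) of the semidiscrete solution \(u_h(t)\) is defined so that \(u_h(t)\) is exactly the finite element approximation of \(Ru_h(t)\); this splits the error into a part controlled by elliptic a posteriori estimators and a part handled by parabolic energy arguments:

\[
a\bigl(Ru_h(t),v\bigr)=\bigl(A_h u_h(t),v\bigr)\quad\forall v\in H^1_0(\Omega),
\qquad
u-u_h=\underbrace{\bigl(u-Ru_h\bigr)}_{\text{parabolic part}}+\underbrace{\bigl(Ru_h-u_h\bigr)}_{\text{elliptic part}} ,
\]

where \(A_h\) denotes the discrete elliptic operator associated with the interface problem.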
model is derived, and the methodology is given in detail. The model is constructed based on some measurement criteria, the Akaike and Bayesian information criteria. For the new time series model, a new algorithm has been generated. The forecasting process, one and two steps ahead, is discussed in detail. Some exploratory data analysis is given at the beginning. The best model is selected based on some criteria and is compared with some naïve models. The modified model is applied to a monthly chemical sales dataset (January 1992 to December 2019); the dataset used in this work was downloaded from the United States census website (www.census.gov). Ultimately, the forecasted sales
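The abstract does not specify the form of the new model, so the Python sketch below only illustrates criterion-based selection (AIC/BIC) and one- and two-step-ahead forecasting on a monthly series, using ARIMA from statsmodels as a stand-in; the file name and candidate grid are hypothetical.

```python
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

def select_by_aic_bic(series, max_p=3, max_q=3, d=1):
    """Fit a small grid of candidate ARIMA models and rank them by AIC/BIC.
    ARIMA is only a stand-in here -- the paper's modified model is not given."""
    results = []
    for p in range(max_p + 1):
        for q in range(max_q + 1):
            try:
                fit = ARIMA(series, order=(p, d, q)).fit()
                results.append({"order": (p, d, q), "AIC": fit.aic, "BIC": fit.bic})
            except Exception:
                continue   # skip candidate orders that fail to converge
    return pd.DataFrame(results).sort_values("AIC")

# Usage sketch (hypothetical file name): monthly chemical sales, Jan 1992 - Dec 2019
# sales = pd.read_csv("chemical_sales.csv", index_col=0, parse_dates=True).squeeze()
# ranking = select_by_aic_bic(sales)
# best = ARIMA(sales, order=ranking.iloc[0]["order"]).fit()
# print(best.forecast(steps=2))   # one- and two-step-ahead forecasts
```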
A simulation study of using 2D tomography to reconstruct a 3D object is presented. The 2D Radon transform is used to create a 2D projection of each slice of the 3D object at different heights. The 2D back-projection and the Fourier slice theorem methods are used to reconstruct each 2D projection slice of the 3D object. The results showed the ability of the Fourier slice theorem method to reconstruct the general shape of the body with its internal structure, unlike the 2D Radon back-projection method, which was able to reconstruct only the general shape of the body because of the blurring artefact. Besides that, the Fourier slice theorem could not remove all the blurring artefact; therefore, this research suggested the threshold technique to eliminate the
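A minimal Python sketch of the per-slice pipeline, using scikit-image's radon/iradon: unfiltered back-projection reproduces the blurred reconstruction, filtered back-projection plays the role of the Fourier-slice-theorem reconstruction, and a simple threshold mimics the suggested artefact-removal step (the 10% cut-off is an assumption, not a value from the paper).

```python
import numpy as np
from skimage.transform import radon, iradon

def reconstruct_slice(slice_2d, n_angles=180, threshold=0.1):
    """Project one 2D slice with the Radon transform, then reconstruct it
    (a) by simple unfiltered back-projection and (b) by filtered
    back-projection, the practical realisation of the Fourier slice theorem.
    Finally apply a threshold to suppress residual blurring artefacts."""
    theta = np.linspace(0.0, 180.0, n_angles, endpoint=False)
    sinogram = radon(slice_2d, theta=theta)

    back_projection = iradon(sinogram, theta=theta, filter_name=None)    # blurred
    fourier_slice = iradon(sinogram, theta=theta, filter_name="ramp")    # sharper

    cleaned = np.where(fourier_slice > threshold * fourier_slice.max(),
                       fourier_slice, 0.0)
    return back_projection, fourier_slice, cleaned

# A 3D volume is handled slice by slice at different heights:
# volume_rec = np.stack([reconstruct_slice(s)[2] for s in volume], axis=0)
```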
In this paper, we applied the concept of error analysis using the linearization method and new condition numbers constituting optimal bounds in appraisals of the possible errors. The applications considered are evaluations of finite continued fractions, computations of determinants of tridiagonal systems, determinants of second order, and a "fast" complex multiplication. As in Horner's scheme, we present a rounding error analysis of product and summation algorithms. The error estimates are tested by numerical examples. The program used for the calculations is MATLAB 7, from the website Mathworks.com.
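As one concrete illustration of rounding-error analysis of this kind (a generic running error bound in the standard floating-point model, not the paper's specific condition numbers), Horner's scheme can carry its own error bound alongside the evaluation:

```python
import numpy as np

def horner_with_error_bound(coeffs, x):
    """Evaluate p(x) = c[0]*x^n + ... + c[n] by Horner's scheme and carry a
    running bound on the accumulated rounding error, assuming the standard
    model in which each operation has relative error at most u."""
    u = np.finfo(float).eps / 2          # unit roundoff
    p = coeffs[0]
    mu = abs(p) / 2                      # running error accumulator
    for c in coeffs[1:]:
        p = p * x + c                    # one multiply + one add per step
        mu = mu * abs(x) + abs(p)        # accumulate the per-step contributions
    bound = u * (2 * mu - abs(p))        # final running error bound
    return p, bound

# Usage sketch: p(x) = x^3 - 2x + 5 evaluated at x = 1.3
value, err_bound = horner_with_error_bound([1.0, 0.0, -2.0, 5.0], 1.3)
print(value, "rounding error is at most about", err_bound)
```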
Wireless networking is constantly improving and changing, though the basic principle is the same: instead of using standard cables to transmit information from one point to another (or more), it uses radio signals. This paper presents a case study considering real-time remote control using wireless UDP/IP-based networks. The aim of this work is to produce a real-time remote control system based upon a simulation model, which can operate via general communication networks and which embodies modern wireless technology. The first part includes a brief
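For illustration only (the paper's actual control messages, addresses, and ports are not given here, so all names below are hypothetical), a minimal Python sketch of command transport over UDP/IP shows why UDP suits real-time remote control: datagrams are sent without connection setup or retransmission delays.

```python
import socket

CONTROL_PORT = 9999          # hypothetical port, not taken from the paper

def send_command(command, host="192.168.0.10", port=CONTROL_PORT):
    """Controller side: fire one control command as a UDP datagram.
    UDP is connectionless, so the call returns immediately -- suitable for
    real-time control loops where a stale command is worthless anyway."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(command.encode("ascii"), (host, port))

def control_loop(port=CONTROL_PORT):
    """Plant side: receive and act on commands as they arrive."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind(("", port))
        while True:
            data, addr = sock.recvfrom(1024)
            print(f"command {data.decode('ascii')!r} from {addr}")
```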