In data transmission, a change in a single bit of the received data may lead to misunderstanding or even disaster. Every bit of the transmitted information is important, especially bits carrying information such as the receiver's address. The ability to detect every single-bit change is therefore a key issue in the data transmission field.
The ordinary single-parity detection method can detect an odd number of errors efficiently but fails when the number of errors is even. Other detection methods, such as two-dimensional parity and checksum, showed better results but still failed to cope with an increasing number of errors.
Two novel methods were suggested to detect binary bit-change errors when transmitting data over a noisy medium: the 2D-Checksum method and the Modified 2D-Checksum method. In the 2D-Checksum method, a summing process was applied to a 7×7 pattern first in the row direction and then in the column direction, producing an 8×8 pattern. In the modified method, an additional diagonal parity vector was appended, giving an 8×9 pattern. By combining the benefit of single parity (detecting an odd number of error bits) with the benefit of the checksum (reducing the effect of 4-bit errors) in a 2D arrangement, the detection process was improved. When any data sample was contaminated with up to 33% noise (bits flipped from 0 to 1 and vice versa), detection with the first method improved by approximately 50% compared with the ordinary two-dimensional parity method, and the second novel method gave the best detection results.
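The exact checksum arithmetic is not given in the abstract above, so the following sketch only illustrates the general construction it describes: a 7×7 bit block is extended with one check column (built from the rows) and one check row (built from the columns) to form an 8×8 pattern. Modulo-2 sums (parity) are an assumption here, not the authors' specification, and the additional diagonal vector of the modified 8×9 method is omitted.

```python
import numpy as np

def two_d_checksum(block: np.ndarray) -> np.ndarray:
    """Extend a 7x7 bit block to an 8x8 pattern (illustrative sketch only).

    Each row gets an extra check bit (row sum mod 2) and each column gets an
    extra check bit (column sum mod 2); the corner bit checks the checks.
    """
    assert block.shape == (7, 7)
    with_row_checks = np.hstack([block, block.sum(axis=1, keepdims=True) % 2])
    with_col_checks = np.vstack([with_row_checks,
                                 with_row_checks.sum(axis=0, keepdims=True) % 2])
    return with_col_checks  # 8x8 pattern

def detect_error(received: np.ndarray) -> bool:
    """Return True if the received 8x8 pattern is internally inconsistent."""
    data = received[:7, :7]
    return not np.array_equal(two_d_checksum(data), received)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data = rng.integers(0, 2, size=(7, 7))
    sent = two_d_checksum(data)
    corrupted = sent.copy()
    corrupted[2, 3] ^= 1            # flip one bit in transit
    print(detect_error(sent))       # False: consistent pattern
    print(detect_error(corrupted))  # True: error detected
```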
Estimation of the unknown parameters of a 2-D sinusoidal signal model can be considered an important and difficult problem. Because it is difficult to estimate all the parameters of this type of model at the same time, we propose a sequential non-linear least squares method and a sequential robust M method, developed by applying the sequential approach suggested by Prasad et al. to estimate the unknown frequencies and amplitudes of the 2-D sinusoidal components, but relying on the Downhill Simplex algorithm to solve the non-linear equations and obtain the estimates of the non-linear parameters, which represent the frequencies, and then using the least squares formula to estimate the amplitudes.
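As an assumed illustration of the sequential idea described above (not the authors' exact procedure), the sketch below fits a single 2-D sinusoidal component of the standard form y(m,n) = A·cos(λ₁m + λ₂n) + B·sin(λ₁m + λ₂n) + e(m,n): the non-linear frequency parameters are found with SciPy's Nelder-Mead (Downhill Simplex) routine, and the amplitudes are then recovered with the linear least squares formula. The data, starting values and single-component model are all hypothetical; the robust M-step and the component-by-component sequential loop are omitted.

```python
import numpy as np
from scipy.optimize import minimize

# Assumed single-component 2-D sinusoidal model (standard form):
#   y(m, n) = A*cos(l1*m + l2*n) + B*sin(l1*m + l2*n) + noise
M, N = 30, 30
m, n = np.meshgrid(np.arange(M), np.arange(N), indexing="ij")
true_A, true_B, true_l1, true_l2 = 2.0, 1.0, 0.6, 1.1
rng = np.random.default_rng(1)
y = (true_A * np.cos(true_l1 * m + true_l2 * n)
     + true_B * np.sin(true_l1 * m + true_l2 * n)
     + 0.5 * rng.standard_normal((M, N)))

def amplitudes_for(freqs):
    """Given the frequencies, the amplitudes enter linearly: solve by least squares."""
    l1, l2 = freqs
    X = np.column_stack([np.cos(l1 * m + l2 * n).ravel(),
                         np.sin(l1 * m + l2 * n).ravel()])
    coef, *_ = np.linalg.lstsq(X, y.ravel(), rcond=None)
    return coef, X

def rss(freqs):
    """Residual sum of squares after profiling out the amplitudes."""
    coef, X = amplitudes_for(freqs)
    resid = y.ravel() - X @ coef
    return resid @ resid

# Downhill Simplex (Nelder-Mead) search for the non-linear parameters (frequencies),
# started from a rough nearby guess; a periodogram peak could supply this in practice.
res = minimize(rss, x0=[0.5, 1.0], method="Nelder-Mead")
(A_hat, B_hat), _ = amplitudes_for(res.x)
print("frequencies:", res.x, "amplitudes:", A_hat, B_hat)
```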
One of the most significant environmental issues facing the planet today is air pollution. Due to industrial development and population density, air pollution has recently worsened. Like many developing nations, Iraq suffers from air pollution, particularly in its urban areas with heavy industry. Our research was carried out in Baghdad's Al-Nahrawan neighbourhood. Recently, ground surveys and remote sensing have been used to monitor air pollution. To extract different gaseous and particulate data, the Earth Data source, Google Earth Engine (GEE), and Geographic Information Systems (GIS) software were all employed. The findings demonstrated a significant positive correlation between the data collected by ground-based
A new, simple and sensitive method was used for the evaluation of propranolol with phosphotungstic acid to prove the efficiency, reliability and repeatability of the long-distance chasing photometer (NAG-ADF-300-2) using continuous flow injection analysis. The method is based on the reaction between propranolol and phosphotungstic acid in an aqueous medium to obtain a yellow precipitate. The optimum parameters were studied to increase the sensitivity of the developed method. The linear range of the calibration graph was 0.007-13 mmol/L for cell A and 5-15 mmol/L for cell B, with LODs of 207.4792 ng/160 µL and 1.2449 µg/160 µL for cell A and cell B respectively, a correlation coefficient (r) of 0.9988 for cell A and 0.9996 for cell B, and an RSD% lower than 1% (n=8) for the
Design of experiments (DOE) was performed with Minitab software to study three factors used in the precipitation process of a sodium aluminate solution prepared by digestion of α-Al2O3, in order to determine the optimum conditions for producing boehmite, which is used in the production of ɤ-Al2O3 during the drying and calcination processes. The factors are: the temperature of the sodium aluminate solution, the concentration of the HCl acid added for precipitation, and the pH of the solution at which precipitation was ended. The design of experiments led to 18 experiments.
The results show that the optimum conditions for the precipitation of the sodium aluminate solution which
In this work, an effective procedure combining a Box-Behnken-based ANN (Artificial Neural Network) and a GA (Genetic Algorithm) has been utilized to find the optimum wt.% of the doping elements (Ce, Y, and Ge) in the doping-aluminizing-chromizing of Incoloy 800H. The ANN and the Box-Behnken design method were implemented to minimize the hot corrosion rate kp (10⁻¹² g²·cm⁻⁴·s⁻¹) of Incoloy 800H at 900 °C. The ANN was used to estimate the predicted values of the hot corrosion rate kp (10⁻¹² g²·cm⁻⁴·s⁻¹). The optimal wt.% combination of the doping elements giving the minimum hot corrosion rate was calculated using the genetic algorithm.
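As a loose, assumed illustration of the ANN-plus-GA workflow described above (the dopant data, network size and GA settings below are all invented, and the Box-Behnken design generation itself is not shown), the sketch trains a small neural-network surrogate for kp and then lets a simple genetic algorithm search for the dopant wt.% combination that minimizes the surrogate's prediction.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Hypothetical training data standing in for the Box-Behnken experimental runs:
# columns are wt.% of Ce, Y, Ge; target is the hot corrosion rate kp.
# All values are invented for illustration only.
rng = np.random.default_rng(42)
X_doe = rng.uniform(0.0, 1.0, size=(15, 3))
kp = (5.0 - 2.0 * X_doe[:, 0] - 1.0 * X_doe[:, 1]
      + 3.0 * (X_doe[:, 2] - 0.4) ** 2
      + 0.1 * rng.standard_normal(15))

# ANN surrogate model predicting kp from the dopant composition.
ann = MLPRegressor(hidden_layer_sizes=(8, 8), max_iter=5000, random_state=0)
ann.fit(X_doe, kp)

def genetic_minimize(model, bounds, pop_size=40, generations=60, mut_rate=0.2):
    """Simple GA searching the composition that minimizes the model's prediction."""
    lo, hi = bounds
    pop = rng.uniform(lo, hi, size=(pop_size, 3))
    for _ in range(generations):
        fitness = model.predict(pop)              # lower kp is better
        parents = pop[np.argsort(fitness)[: pop_size // 2]]   # keep the best half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            child = np.where(rng.random(3) < 0.5, a, b)        # uniform crossover
            mutate = rng.random(3) < mut_rate
            child = np.where(mutate, rng.uniform(lo, hi, 3), child)  # mutation
            children.append(child)
        pop = np.vstack([parents, np.array(children)])
    return pop[np.argmin(model.predict(pop))]

best_wt = genetic_minimize(ann, bounds=(0.0, 1.0))
print("wt.% (Ce, Y, Ge) minimizing predicted kp:", best_wt)
print("predicted kp:", ann.predict(best_wt.reshape(1, -1))[0])
```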
A new simultaneous spectrophotometric-kinetic method was developed to determine phenylephrine (PHEN) and tetracycline (TETR) via the H-point standard addition method (HPSAM). The proposed procedure relies on measuring the difference in the rate of the charge-transfer (CT) reaction between each of PHEN and TETR as electron donors and p-bromanil (p-Br) as an electron acceptor. Different experimental factors that affect the extent of complex formation were investigated by monitoring the absorbance at 446 nm. A time pair of 50-100 s was selected from the different pairs examined, since it gives the highest accuracy for the HPSAM plot. Linear calibration graphs in the concentration ranges of 10.0-40.0 and 10.0-50.0
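As an assumed illustration of the HPSAM calculation underlying the abstract above (with invented numbers, not the paper's data), the sketch below fits the two absorbance-versus-added-concentration lines measured at the selected time pair and takes their intersection, the H-point, whose abscissa gives minus the analyte concentration.

```python
import numpy as np

# Hypothetical HPSAM data (illustration only): absorbance of the mixture measured at
# the two selected times (50 s and 100 s) for several added standard concentrations.
c_added = np.array([0.0, 10.0, 20.0, 30.0, 40.0])        # added standard concentration
abs_t1 = np.array([0.210, 0.305, 0.402, 0.498, 0.596])   # absorbance at t1 = 50 s
abs_t2 = np.array([0.285, 0.452, 0.621, 0.788, 0.957])   # absorbance at t2 = 100 s

# Straight-line fits A = m*C_added + b for each measurement time.
m1, b1 = np.polyfit(c_added, abs_t1, 1)
m2, b2 = np.polyfit(c_added, abs_t2, 1)

# The two lines intersect at the H-point (C_H, A_H): the abscissa gives -C_H,
# i.e. the analyte concentration in the sample, and the ordinate A_H is the
# signal due only to the other (interfering) species.
c_H = (b2 - b1) / (m1 - m2)
a_H = m1 * c_H + b1
print("analyte concentration (from -C_H):", -c_H)
print("signal of the second species (A_H):", a_H)
```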
In this research we analyse some indicators, and their classifications, related to the teaching process and the scientific level of graduate studies at the university, using analysis of variance for ranked data with repeated measurements instead of the ordinary analysis of variance. We reach many conclusions about the important classifications of each indicator that affect the teaching process.
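The abstract above does not name the specific rank-based procedure, so the sketch below uses the Friedman test, the standard rank analogue of a repeated-measures analysis of variance, on invented data as an assumed illustration.

```python
import numpy as np
from scipy.stats import friedmanchisquare

# Hypothetical illustration: rows are subjects (e.g. departments), columns are the
# repeated measurements (e.g. the same indicator rated in successive years).
rng = np.random.default_rng(3)
year1 = rng.normal(70, 5, size=12)
year2 = year1 + rng.normal(2, 3, size=12)   # slight improvement
year3 = year1 + rng.normal(4, 3, size=12)

# Friedman test: ranks the values within each subject, so outliers and
# non-normality matter less than in the ordinary repeated-measures ANOVA.
stat, p_value = friedmanchisquare(year1, year2, year3)
print(f"Friedman chi-square = {stat:.3f}, p = {p_value:.4f}")
# A small p-value suggests the indicator differs across the repeated measurements.
```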
The aim of this research is to use a robust technique based on trimming, since maximum likelihood (ML) analysis often fails when outliers are present in the studied phenomenon, where the MLE loses its advantages because of the bad influence of the outliers. To address this problem, new statistical methods that are not affected by the outliers have been developed; these methods possess robustness, or resistance. The maximum trimmed likelihood (MTL) is therefore a good alternative for achieving more acceptable and comparable results, and weights can be used to increase the efficiency of the resulting estimators and to strengthen the estimate using the maximum weighted trimmed likelihood (MWTL). In order to perform t
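As an assumed illustration of the trimming idea described above (not the paper's model or data), the sketch below computes a maximum trimmed likelihood fit of a normal location-scale model: for each candidate parameter value only the h largest per-observation log-likelihood contributions are summed, so outliers cannot dominate the fit. The MWTL variant would attach weights to the kept contributions instead of giving each of them weight 1.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Invented contaminated sample: 45 clean observations plus 5 gross outliers.
rng = np.random.default_rng(7)
clean = rng.normal(10.0, 2.0, size=45)
outliers = rng.normal(40.0, 1.0, size=5)
x = np.concatenate([clean, outliers])

def neg_trimmed_loglik(theta, data, h):
    """Negative trimmed log-likelihood for a normal(mu, sigma) model."""
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)                      # keep sigma positive
    contrib = norm.logpdf(data, loc=mu, scale=sigma)
    kept = np.sort(contrib)[-h:]                   # trim the worst-fitting points
    return -kept.sum()

h = int(0.9 * len(x))                              # trimming fraction assumed to be 10%
res = minimize(neg_trimmed_loglik, x0=[np.median(x), np.log(x.std())],
               args=(x, h), method="Nelder-Mead")
mu_mtl, sigma_mtl = res.x[0], np.exp(res.x[1])
print("plain ML  :", x.mean(), x.std())            # pulled toward the outliers
print("MTL sketch:", mu_mtl, sigma_mtl)            # close to the clean-data parameters
```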