In data transmission, a change in a single bit of the received data may lead to misunderstanding or even disaster. Every bit of the transmitted information matters, especially fields such as the receiver's address. Detecting every single-bit change is therefore a key issue in the field of data transmission.
The ordinary single-parity detection method detects an odd number of bit errors efficiently but fails with an even number of errors. Other detection methods, such as two-dimensional parity and checksum, show better results but still fail to cope with an increasing number of errors.
Two novel methods are proposed to detect bit-flip errors when transmitting data over a noisy medium: the 2D-Checksum method and the Modified 2D-Checksum method. In the 2D-Checksum method, a summing process is applied to 7×7 patterns first in the row direction and then in the column direction, yielding 8×8 patterns. In the modified method, an additional diagonal parity vector is appended, extending the pattern to 8×9. By combining the benefits of single parity (detecting odd numbers of error bits) with those of the checksum (reducing the effect of 4-bit errors) in a two-dimensional arrangement, the detection process is improved. When samples of data were contaminated with up to 33% noise (bits flipped from 0 to 1 and vice versa), the first method improved detection by approximately 50% compared with the traditional two-dimensional parity method, and the second method gave the best detection results.
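The exact summing rules of the proposed 2D-Checksum variants are not reproduced in this abstract, so the sketch below only illustrates the classic two-dimensional (row/column) parity baseline the methods are compared against: a 7×7 bit block is extended to 8×8 with row and column parity bits, and any single bit flip then violates at least one parity check.

```python
# Minimal sketch of classic 2D (row/column) parity, the baseline method
# referenced in the abstract. The paper's own 2D-Checksum summing rules
# are assumptions not shown here; only the 2D idea is illustrated.

def add_2d_parity(block):
    """Extend a 7x7 bit block to 8x8 by appending row and column parity bits."""
    rows = [row + [sum(row) % 2] for row in block]        # even parity per row
    col_parity = [sum(col) % 2 for col in zip(*rows)]     # even parity per column
    return rows + [col_parity]

def check_2d_parity(block8):
    """Return True if every row and every column has even parity."""
    rows_ok = all(sum(row) % 2 == 0 for row in block8)
    cols_ok = all(sum(col) % 2 == 0 for col in zip(*block8))
    return rows_ok and cols_ok

data = [[1, 0, 1, 1, 0, 0, 1] for _ in range(7)]
coded = add_2d_parity(data)
assert check_2d_parity(coded)      # intact block passes both checks
coded[2][3] ^= 1                   # simulate channel noise: flip one bit
assert not check_2d_parity(coded)  # the single-bit error is detected
```

Note that plain 2D parity misses some four-bit error patterns arranged in a rectangle, which is precisely the weakness the abstract says the checksum component is meant to reduce.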
This article is an endeavour to highlight the relationship between social media and language evolution. It reviews current theoretical efforts on communication and language change. A descriptive design, theoretically grounded in technological determinism, is used. The assumption behind this review is that social media plays a significant role in language evolution. Moreover, the different platforms of social media are characterized as the easiest and fastest means of communication. It concludes that current theoretical efforts have paid much attention to the relationship between social media and language evolution. Such efforts have highlighted the fact that social media platforms are awash with acronyms, cybe…
Translating acronyms of media and international (world) organizations helps the researcher draw the following conclusions: 1. Acronyms of world news agencies can be translated into Arabic in three ways: by indicating the lexical meaning, by indicating the English abbreviated form letter by letter, and by indicating the Arabic abbreviated form as a word. 2. Acronyms of world satellite TV channels can be translated into Arabic in two ways: by indicating the lexical meaning and by indicating the English abbreviated form letter by letter. 3. Acronyms of world newspapers can be translated into Arabic in two ways: by indicating both the lexical meaning and the Arabic transliteration of the English form. 4. Acronyms of U.N. & world organizati…
It must be emphasized that media is among the human studies fusing older and more recent sciences together, and that its disclosures are the physics of the new communication. Michio Kaku, a theoretical physicist, confirms this fact in his book “Visions” when he says: “As a research physicist, I believe that physicists have been particularly successful at predicting the broad outlines of the future. Professionally, I work in one of the most fundamental areas of physics, the quest to complete Einstein's dream of a ‘theory of everything.’ As a result, I am constantly reminded of the ways in which quantum physics touches many of the key discoveries that shaped the twentieth century.” He then arrives at the fact that the physical disclo…
This study investigates the impact of spatial resolution enhancement on supervised classification accuracy using Landsat 9 satellite imagery, achieved through pan-sharpening techniques leveraging Sentinel-2 data. Various methods were employed to synthesize a panchromatic (PAN) band from Sentinel-2 data, including dimension reduction algorithms and weighted averages based on correlation coefficients and standard deviation. Three pan-sharpening algorithms (Gram-Schmidt, Principal Components Analysis, Nearest Neighbour Diffusion) were employed, and their efficacy was assessed using seven fidelity criteria. Classification tasks were performed utilizing Support Vector Machine and Maximum Likelihood algorithms. Results reveal that specifi…
A simple, economic, rapid, reliable, and stability-indicating high-performance liquid chromatography (HPLC) method has been developed and validated for the simultaneous determination of paracetamol (PCM) and caffeine (CF) in solid dosage form. The chromatographic separations were achieved with a Waters Symmetry® C18 column (5 μm, 4.6 × 150 mm), using a mixture of methanol and water (40:60, v/v) as a mobile phase, under isocratic elution mode with a flow rate of 0.8 mL/min, and ultraviolet (UV) detection was set at 264 nm. The oven temperature for the column was set and maintained at 35 °C. The method was validated according to International Conference on Harmonization (ICH) guidelines, and it demonstrated excellent linearity, wi…
In this research, several estimators of the hazard function are introduced. These estimators use one of the nonparametric methods, namely kernel functions for censored data, with varying bandwidth and boundary kernels. Two types of bandwidth are used: local bandwidth and global bandwidth. Moreover, four types of boundary kernel are used, namely Rectangle, Epanechnikov, Biquadratic, and Triquadratic, and the proposed function is employed with all kernel functions. Two different simulation techniques are also used in two experiments to compare these estimators. In most cases, the results show that the local bandwidth is the best for all types of boundary kernel func…
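The abstract does not reproduce the censored-data hazard estimators themselves, so the sketch below only illustrates the four named kernels in their standard interior forms, together with a fixed (global-bandwidth) kernel density estimate; the paper's boundary corrections and local-bandwidth scheme are beyond what is stated here.

```python
# Standard kernel functions named in the abstract (interior forms only;
# the paper's boundary-corrected and censored-data versions are not
# specified in this listing and are not attempted here).

def rectangle(u):     return 0.5 if abs(u) <= 1 else 0.0
def epanechnikov(u):  return 0.75 * (1 - u * u) if abs(u) <= 1 else 0.0
def biquadratic(u):   return (15 / 16) * (1 - u * u) ** 2 if abs(u) <= 1 else 0.0
def triquadratic(u):  return (35 / 32) * (1 - u * u) ** 3 if abs(u) <= 1 else 0.0

def kde(x, sample, h, kernel=epanechnikov):
    """Kernel density estimate at x with a single global bandwidth h."""
    return sum(kernel((x - xi) / h) for xi in sample) / (len(sample) * h)

sample = [1.1, 1.9, 2.0, 2.4, 3.3]
print(round(kde(2.0, sample, h=1.0), 3))
```

A local-bandwidth variant, as compared in the study, would let h depend on the evaluation point or on each observation instead of being one global constant.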
Measuring the efficiency of postgraduate and undergraduate programs is one of the essential elements of the educational process. In this study, the colleges of Baghdad University and data for the academic year 2011-2012 were chosen to measure the relative efficiencies of postgraduate and undergraduate programs in terms of their inputs and outputs. A relevant method for analyzing these data is Data Envelopment Analysis (DEA). The effect of academic staff on the numbers of enrolled and alumni students in the postgraduate and undergraduate programs is the main focus of the study.
Region-based association analysis has been proposed to capture the collective behavior of sets of variants by testing the association of each set, instead of individual variants, with the disease. Such an analysis typically involves a list of unphased multiple-locus genotypes with potentially sparse frequencies in cases and controls. To tackle the problem of the sparse distribution, a two-stage approach was proposed in the literature: in the first stage, haplotypes are computationally inferred from genotypes, followed by a haplotype coclassification; in the second stage, the association analysis is performed on the inferred haplotype groups. If a haplotype is unevenly distributed between the case and control samples, this haplotype is labeled…
In this research, the Williamson-Hall and size-strain plot methods were employed to analyze X-ray lines for evaluating the crystallite size and lattice strain of cadmium oxide nanoparticles. The crystallite size values are 15.2 nm and 93.1 nm, and the lattice strains are 4.2×10⁻⁴ and 21×10⁻⁴, respectively. Other methods were also employed to evaluate the crystallite size, namely the Scherrer and modified Scherrer methods, whose results are 14.8 nm and 13.9 nm, respectively. Each method of analysis yields a different result because the alteration in the crystallite size and lattice strain calculated according to the Williamson-Hall and size-strain plot methods shows that the non-uniform strain in nan…
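The standard forms of the two main analyses named above (not reproduced in this listing) help explain why the methods disagree: the Scherrer equation attributes all line broadening to size, whereas the Williamson-Hall relation separates size and strain contributions by a linear fit. Here β is the peak width (FWHM, in radians), θ the Bragg angle, λ the X-ray wavelength, K a shape factor (≈0.9), D the crystallite size, and ε the lattice strain:

```latex
% Scherrer equation: broadening attributed to crystallite size only
D = \frac{K\lambda}{\beta\cos\theta}

% Williamson--Hall relation: plot \beta\cos\theta vs. 4\sin\theta;
% the intercept gives K\lambda/D and the slope gives the strain \varepsilon
\beta\cos\theta = \frac{K\lambda}{D} + 4\varepsilon\sin\theta
```

When strain is non-negligible, the Scherrer estimate absorbs the strain broadening into an apparently smaller size, consistent with the gap between the Scherrer and Williamson-Hall values reported above.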
Unconfined Compressive Strength (UCS) is considered the most important rock strength parameter affecting rock failure criteria. Various studies have developed rock strength correlations for specific lithologies to estimate high-accuracy values without a core. Previous analyses did not account for a formation's numerous lithologies and interbedded layers. The main aim of the present study is to select a suitable correlation to predict the UCS over the whole drilled depth of a formation without separating the lithology. The second aim is to identify an adequate input parameter among a set of wireline logs for determining the UCS, using data from three wells across ten formations (Tanuma, Khasib, Mishrif, Rumaila, Ahmady, Maudud, Nahr Um…