In data transmission, a change in a single bit of the received data may lead to misunderstanding or even disaster. Each bit of the transmitted information has high priority, especially information such as the address of the receiver. Detecting every single-bit change is therefore a key issue in the data transmission field.
The ordinary single-parity detection method can detect an odd number of bit errors efficiently but fails when the number of errors is even. Other detection methods, such as two-dimensional parity and checksum, give better results but still fail to cope with an increasing number of errors.
Two novel methods were proposed to detect bit-change errors when transmitting data over a noisy medium: the 2D-Checksum method and the Modified 2D-Checksum method. In the 2D-Checksum method, a summing process is applied to 7×7 patterns in the row direction and then in the column direction, producing 8×8 patterns. In the modified method, an additional diagonal parity vector is appended, extending the pattern to 8×9. By combining the benefit of single parity (detecting an odd number of error bits) with the benefit of the checksum (reducing the effect of 4-bit errors) in a two-dimensional arrangement, the detection process was improved. When samples of data were contaminated with up to 33% noise (bits flipped from 0 to 1 and vice versa), the first method improved detection by approximately 50% compared with the ordinary two-dimensional parity method, and the second novel method gave the best detection results.
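A minimal sketch of the construction described above follows; the element width, the modulus of the checksums, and the diagonal grouping are assumptions for illustration, since the abstract does not specify them.

```python
import numpy as np

def checksum_2d(block, mod=256):
    """Append a row-direction and a column-direction checksum to a 7x7
    block, yielding an 8x8 pattern (sketch; element width and modulus
    are assumed, not stated in the abstract)."""
    block = np.asarray(block)
    row_sums = block.sum(axis=1) % mod           # one checksum per row
    with_rows = np.column_stack([block, row_sums])
    col_sums = with_rows.sum(axis=0) % mod       # one checksum per column
    return np.vstack([with_rows, col_sums])      # 8x8 result

def modified_checksum_2d(block, mod=256):
    """Additionally append a diagonal parity vector, extending the
    pattern to 8x9 (the wrapped-diagonal grouping here is illustrative)."""
    pat = checksum_2d(block, mod)                # 8x8
    n = pat.shape[0]
    # element (i, j) contributes its parity bit to diagonal (i + j) % n
    diag_parity = np.zeros(n, dtype=pat.dtype)
    for i in range(n):
        for j in range(n):
            diag_parity[(i + j) % n] ^= pat[i, j] & 1
    return np.column_stack([pat, diag_parity])   # 8x9 result
```

A receiver would recompute the same sums and parities over the received pattern and flag any mismatch as a detected error.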
Document source identification in printer forensics involves determining the origin of a printed document based on characteristics such as the printer model, serial number, defects, or unique printing artifacts. This process is crucial in forensic investigations, particularly in cases involving counterfeit documents or unauthorized printing. However, consistent pattern identification across various printer types remains challenging, especially when efforts are made to alter printer-generated artifacts. Machine learning models are often used in these tasks, but selecting discriminative features while minimizing noise is essential. Traditional KNN classifiers require a careful selection of distance metrics to capture relevant printing
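As a hedged illustration of the distance-metric choice mentioned above, the sketch below fits a KNN classifier on made-up printer-artifact feature vectors using scikit-learn; the features, labels, and the Manhattan metric (p=1) are illustrative assumptions, not the paper's setup.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical artifact features per document (e.g., banding frequency,
# toner-density statistics); the data and labels here are placeholders.
X = np.random.rand(60, 4)                # 60 documents, 4 features
y = np.random.randint(0, 3, size=60)     # 3 candidate printer models

# Scaling matters for distance-based classifiers; the Manhattan metric
# (p=1) is one plausible choice, not necessarily the paper's.
clf = make_pipeline(StandardScaler(),
                    KNeighborsClassifier(n_neighbors=5, p=1))
clf.fit(X, y)
print(clf.predict(X[:3]))
```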
The removal of turbidity from produced water by the chemical coagulation/flocculation method using locally available coagulants was investigated. Aluminum sulfate (alum) was selected as the primary coagulant, while calcium hydroxide (lime) was used as a coagulant aid. The performance of these coagulants was studied through jar tests by comparing turbidity removal at different coagulant/coagulant-aid ratios, coagulant doses, water pH values, and sedimentation times. In addition, an attempt was made to examine the relationship between turbidity (NTU) and total suspended solids (mg/L) in the same samples of produced water. The best conditions for turbidity removal were obtained with a 75% alum + 25% lime coagulant at a coagulant dose of 80 m
The current study focuses on utilizing artificial intelligence (AI) techniques to identify the optimal locations and types of production wells for achieving the production company's primary objective, which is to increase oil production from the Sa'di carbonate reservoir of the Halfaya oil field in southeast Iraq. Various production-well designs, including vertical, horizontal, multi-horizontal, and fishbone lateral wells, were considered for all reservoir production layers. An artificial neural network tool was used to identify the optimal locations for obtaining the highest production from the reservoir layers and the optimal well type. Fo
The objective of this work is to study the influence of end-milling cutting process parameters, tool material, and tool geometry on multi-response outputs for 4032 Al alloy. This is done by proposing an approach that combines the Taguchi method with grey relational analysis. Three cutting parameters were selected (spindle speed, feed rate, and depth of cut), with three levels for each parameter. Three tools with different materials and geometries were also used to design the experimental tests and runs based on the Taguchi L9 orthogonal array. The multi-response optimization of the end-milling process is then solved using grey relational analysis. The results of the analysis of variance (ANOVA) showed that the major influencing parameters on the multi-objective response w
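The grey relational grade that ranks the L9 runs can be computed with the standard procedure: normalize each response, form the deviation sequence, convert it to grey relational coefficients, and average them per run. A generic sketch follows; the two responses and their optimization directions are invented for illustration.

```python
import numpy as np

def grey_relational_grade(responses, larger_better, zeta=0.5):
    """Standard grey relational analysis for multi-response optimization
    (a generic sketch; zeta = 0.5 is the usual distinguishing coefficient)."""
    X = np.asarray(responses, dtype=float)       # shape: runs x responses
    lo, hi = X.min(axis=0), X.max(axis=0)
    # normalize to [0, 1]: larger-better or smaller-better per response
    norm = np.where(larger_better, (X - lo) / (hi - lo), (hi - X) / (hi - lo))
    delta = 1.0 - norm                           # deviation sequence
    coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
    return coeff.mean(axis=1)                    # one grade per run

# Example: 9 hypothetical L9 runs with 2 responses
# (surface roughness: smaller-better; material removal rate: larger-better)
rng = np.random.default_rng(0)
grades = grey_relational_grade(rng.random((9, 2)), larger_better=[False, True])
print("best run:", grades.argmax() + 1)
```

The run with the highest grade is taken as the best compromise across all responses, which is how the multi-response problem reduces to a single ranking.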
The co-occurrence of metabolic syndrome with type 2 diabetes mellitus (T2DM) potentiates the morbidity and mortality associated with each condition. The fasting triglyceride-glucose index (TyG index) has been recommended as a useful marker for predicting metabolic syndrome. Our study aimed to introduce gender-specific cut-off values of the triglyceride-glucose index for diagnosing metabolic syndrome associated with type 2 diabetes mellitus. The data were collected from Baghdad hospitals between May and December 2019. The number of eligible participants was 424. The National Cholesterol Education Program Adult Treatment Panel III criteria were used to define metabolic syndrome. Measurement of fasting blood glucose, lipid profile
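The abstract does not restate the index definition; the formula commonly used in the literature is TyG = ln(fasting triglycerides [mg/dL] × fasting glucose [mg/dL] / 2), as in this small sketch:

```python
from math import log

def tyg_index(triglycerides_mg_dl: float, glucose_mg_dl: float) -> float:
    """TyG index as commonly defined in the literature:
    ln(fasting triglycerides [mg/dL] * fasting glucose [mg/dL] / 2).
    The abstract itself does not restate this formula."""
    return log(triglycerides_mg_dl * glucose_mg_dl / 2)

# Example: TG = 150 mg/dL, FPG = 100 mg/dL -> TyG ~ 8.92
print(round(tyg_index(150, 100), 2))
```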
This work was conducted to study the ability of locally prepared zeolite NaY to reduce sulfur compounds in Iraqi natural gas using a continuous-mode adsorption unit. Zeolite Y was hydrothermally synthesized using abundant kaolin clay as the aluminum precursor. Characterization was performed using chemical analysis, XRD, and BET surface area measurements. The adsorption experiments showed that zeolite Y is an active adsorbent for the removal of H2S from natural gas and other gas streams. Temperature was found to be inversely related to removal efficiency, while increasing the bed height increased the removal efficiency at a constant natural gas flow rate. The adsorption capacity was evaluated, and the maximum uptake was 5.345 mg H2S/g zeolite.
The current study investigated the stability and extraction efficiency of an emulsion liquid membrane (ELM) for the removal of the pesticide Abamectin from aqueous solution. Stability was investigated in terms of the emulsion droplet size distribution and the emulsion breakage percentage. The proposed ELM consisted of a 1:1 mixture of corn oil and kerosene as the diluent, Span 80 (sorbitan monooleate) as the surfactant, and hydrochloric acid (HCl) as the stripping agent, without utilizing a carrier agent. Parameters such as homogenizer speed, surfactant concentration, emulsification time, and internal-to-organic volume ratio (I/O) were evaluated. Results show that the lowest droplet size of 0.9 µm and the most stable emulsion, with a breakage percentage of 1.12%, were
Data encryption translates data into another form or symbol so that only people with access to the secret key or password can read it. Encrypted data are generally referred to as ciphertext, while unencrypted data are known as plaintext. Entropy can be used as a measure of the number of bits needed to code the data of an image; as the pixel values within an image are distributed over more gray levels, the entropy increases. The aim of this research is to compare the CAST-128 method with a proposed adaptive key against the RSA encryption method for video frames, to determine which method is more accurate, i.e., gives the highest entropy. The first method is achieved by applying the "CAST-128" and
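The entropy measure described above is the Shannon entropy of the gray-level histogram; a minimal sketch for an 8-bit frame follows (the frame data here are random placeholders).

```python
import numpy as np

def image_entropy(gray_image):
    """Shannon entropy of an 8-bit grayscale image in bits per pixel:
    H = -sum(p_i * log2(p_i)) over the 256 gray levels, which is the
    standard definition implied by the abstract."""
    hist = np.bincount(np.asarray(gray_image, dtype=np.uint8).ravel(),
                       minlength=256)
    p = hist / hist.sum()
    p = p[p > 0]                        # ignore empty gray levels
    return float(-(p * np.log2(p)).sum())

# A uniform random frame approaches the maximum of 8 bits/pixel, which
# is why higher entropy indicates better-scrambled ciphertext frames.
frame = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)
print(image_entropy(frame))
```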