Ensuring reliable data transmission in a Network on Chip (NoC) is one of the most challenging tasks, especially in noisy environments. Crosstalk, interference, and radiation have increased with manufacturers' tendency to shrink chip area, raise operating frequencies, and lower supply voltages. Consequently, many Error Control Codes (ECC) have been proposed, with different error detection and correction capacities and various degrees of complexity. The Code with Crosstalk Avoidance and Error Correction (CCAEC) for network-on-chip interconnects uses simple parity check bits as its main technique to achieve high error correction capacity. According to its authors, this coding scheme corrects up to 12 random errors, a high correction capacity compared with many other coding schemes, but at the cost of a large codeword size. In this work, the CCAEC code is compared with another well-known scheme, the Horizontal-Vertical-Diagonal (HVD) error detecting and correcting code, through a reliability analysis that derives a new, accurate mathematical model of the probability of residual error (Pres) for both code schemes and confirms it by simulation results for both. The results show that the HVD code corrects all single, double, and triple errors and fails on only 3.3% of quadruple-error patterns, whereas the CCAEC code corrects any single error but fails in 1.5%, 7.2%, and 16.4% of double-, triple-, and quadruple-error cases, respectively. As a result, HVD offers better reliability than CCAEC at lower overhead, making it a promising coding scheme for handling reliability issues in NoC.
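The row/column parity idea at the heart of HVD-style codes can be sketched as follows. This is an illustrative reconstruction, not the paper's encoder: the full HVD scheme adds two diagonal parity sets for extra coverage, and all names here are hypothetical.

```python
# Data bits are arranged in a matrix and parity is computed along rows and
# columns; a single flipped bit is located at the intersection of the one
# failing row parity and the one failing column parity.

def parity_bits(bits):
    """Return (row_parities, column_parities) for a bit matrix."""
    rows = [sum(r) % 2 for r in bits]
    cols = [sum(col) % 2 for col in zip(*bits)]
    return rows, cols

def correct_single_error(bits, rows, cols):
    """Locate and flip a single erroneous bit via the failing parities."""
    new_rows, new_cols = parity_bits(bits)
    bad_r = [i for i, (a, b) in enumerate(zip(rows, new_rows)) if a != b]
    bad_c = [j for j, (a, b) in enumerate(zip(cols, new_cols)) if a != b]
    if len(bad_r) == 1 and len(bad_c) == 1:   # exactly one bit in error
        bits[bad_r[0]][bad_c[0]] ^= 1
    return bits
```

With the diagonal parities added, patterns of multiple errors that confuse the row/column intersection can also be resolved, which is where HVD's multi-error capacity comes from.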
This research applies several statistical methods, namely the kernel method, watershed, histogram, and cubic spline, to improve the contrast of digital images. The results, evaluated by the RMSE and NCC metrics, show that the cubic spline method is the most accurate compared with the other statistical methods.
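The abstract does not say which histogram-based technique was used; the standard one for contrast improvement is histogram equalization, sketched here in pure Python for an 8-bit grayscale image given as a flat list of pixel values (all names are illustrative).

```python
# Histogram equalization: map each gray level through the normalized
# cumulative histogram so the output uses the full 0..255 range.

def equalize(pixels, levels=256):
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    cdf, run = [], 0
    for h in hist:
        run += h
        cdf.append(run)
    cdf_min = next(c for c in cdf if c > 0)   # first nonzero CDF value
    n = len(pixels)
    lut = [round((c - cdf_min) / max(n - cdf_min, 1) * (levels - 1))
           for c in cdf]
    return [lut[p] for p in pixels]
```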
In recent years, there has been expanding development in the vehicular sector and in the number of vehicles moving on roads across all parts of the country. Vehicle number plate identification based on image processing is a dynamic research area; the technique is used for security purposes such as tracking stolen cars and controlling access to restricted areas. The License Plate Recognition System (LPRS) uses a digital camera to capture the vehicle plate number, which serves as input to the proposed recognition system. The developed system consists of three phases: vehicle license plate localization, character segmentation, and character recognition. License Plate (LP) detection is presented using Canny
Water is an essential aspect of life and important in evolution. Recently, the topic of potable water quality has received much attention. This study aims to determine drinking water quality in Al-Najaf City by collecting samples throughout the city, comparing the results with the Iraqi guidelines (IQS 417) and World Health Organization (WHO) guidelines, and calculating the Water Quality Index (WQI). Samples were tested in the laboratory between December 2021 and June 2022. The results showed that multiple parameters exceeded the allowable limits during both testing periods; during the winter months, TDS and turbidity exceeded the upper limits at multiple locations. Total hardness values also
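The abstract does not state which WQI formulation was applied; a common choice is the weighted arithmetic index, sketched below. The parameter weights, standards, and ideal values are illustrative placeholders, not values from the study.

```python
# Weighted arithmetic WQI: each parameter gets a quality rating
#   q_i = 100 * (C_i - I_i) / (S_i - I_i)
# (C_i measured, S_i standard limit, I_i ideal value), and the index is the
# weight-averaged rating: WQI = sum(w_i * q_i) / sum(w_i).

def wqi(measured, standard, ideal, weight):
    q = [100.0 * (c - i) / (s - i)
         for c, s, i in zip(measured, standard, ideal)]
    return sum(w * qi for w, qi in zip(weight, q)) / sum(weight)
```

For example, a single parameter measured at half its permissible limit (with ideal value 0) yields an index of 50.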
In some cases, researchers need to know the causal effect of a treatment in order to judge its effect on the sample and decide whether to continue the treatment or stop it because it is of no use. The local weighted least squares method was used to estimate the parameters of the fuzzy regression discontinuity model, and the local polynomial method was used to estimate the bandwidth. Data were generated with sample sizes (75, 100, 125, 150) and 1000 replications. An experiment was also conducted at the Innovation Institute for remedial lessons in 2021, collecting data on 72 students participating in the institute. Those who received the treatment had an increase in their score after
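The core of a regression discontinuity estimate with locally weighted least squares can be sketched as below for the simpler sharp design: fit a kernel-weighted line on each side of the cutoff and take the difference of the intercepts there. The paper's fuzzy, bandwidth-selected variant is more involved; the triangular kernel and all names here are illustrative assumptions.

```python
def wls_line(x, y, w):
    """Closed-form weighted least squares fit y ≈ a + b*x."""
    sw = sum(w)
    sx = sum(wi * xi for wi, xi in zip(w, x))
    sy = sum(wi * yi for wi, yi in zip(w, y))
    sxx = sum(wi * xi * xi for wi, xi in zip(w, x))
    sxy = sum(wi * xi * yi for wi, xi, yi in zip(w, x, y))
    b = (sw * sxy - sx * sy) / (sw * sxx - sx * sx)
    return (sy - b * sx) / sw, b          # (intercept, slope)

def rd_effect(x, y, cutoff=0.0, h=1.0):
    """Jump in E[y|x] at the cutoff via triangular-kernel local lines."""
    def boundary_fit(keep):
        xs = [xi - cutoff for xi in x if keep(xi)]
        ys = [yi for xi, yi in zip(x, y) if keep(xi)]
        ws = [max(0.0, 1.0 - abs(xi) / h) for xi in xs]   # triangular kernel
        return wls_line(xs, ys, ws)[0]    # intercept = fitted value at cutoff
    return (boundary_fit(lambda xi: xi >= cutoff)
            - boundary_fit(lambda xi: xi < cutoff))
```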
The experiment was conducted in the glass house of a nursery during the 2013 growing season. It was laid out in a Completely Randomized Blocks Design (CRBD). The seeds of two varieties of eggplant were studied: 1. Lot (Number) Melaneana, an American variety, and 2. Aydinsiyah, a Turkish variety. Three water-stress periods (1, 8, 16 days) and three concentrations of proline acid (0, 50, 100 ppm) were used, with three replicates for each treatment, giving 54 experimental units. The seeds were planted on 30/8/2013 in the glass house of the nursery; a month later, the plantlets were transplanted into pots with well-fertilized soil in the glass house. Some growth features were
The present work studies the effect of using an automatic thresholding technique to convert the feature edges of images into binary images in order to separate an object from its background. The feature edges of the sampled images are obtained from first-order edge detection operators (Roberts, Prewitt, and Sobel) and second-order edge detection operators (Laplacian operators). The optimum automatic threshold is calculated using the fast Otsu method. The study is applied to a personal image (Roben) and a satellite image to examine the compatibility of this procedure with two different kinds of images, and the obtained results are discussed.
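Otsu's method picks the threshold that maximizes the between-class variance of the gray-level histogram. A compact pure-Python reimplementation of the single-pass ("fast") form is sketched below as an illustration of the technique named in the abstract, not the authors' code.

```python
def otsu_threshold(hist):
    """Return the threshold t (levels <= t are background) maximizing
    the between-class variance of a 256-bin grayscale histogram."""
    total = sum(hist)
    sum_all = sum(i * h for i, h in enumerate(hist))
    sum_b = 0.0   # running sum of i*hist[i] over the background class
    w_b = 0       # background pixel count
    best_t, best_var = 0, -1.0
    for t in range(256):
        w_b += hist[t]
        if w_b == 0:
            continue
        w_f = total - w_b
        if w_f == 0:
            break
        sum_b += t * hist[t]
        m_b = sum_b / w_b                  # background mean
        m_f = (sum_all - sum_b) / w_f      # foreground mean
        var_between = w_b * w_f * (m_b - m_f) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t
```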
The present study discusses the significant role of historical memory in all aspects of Spanish social life. When a novelist takes on the role and puts on the mask of one of the novel's protagonists or hidden characters, his memory of events becomes the key to accessing the close-knit fabric of society, shedding light on deteriorating social conceptions in a backwards social reality that rejects all new progressive ideas and modernity. By concentrating on society's flawed aspects and employing everything in his stored memory, the author uses sarcasm to criticize and change such old, deteriorating conceptions of reality.
The need to exchange large amounts of real-time data is constantly increasing in wireless communication. While traditional radio transceivers are not cost-effective and their components must be tightly integrated, software-defined radio (SDR) has opened up a new class of wireless technologies with high security. This study designs and builds an SDR transceiver using one type of modulation, 16-QAM, and adds a security subsystem based on one type of chaos map, the logistic map, because it is a very simple nonlinear dynamical equation. The map generates a random key that is XORed with the originally transmitted data to protect it during transmission.
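The chaos-based scrambling the abstract describes can be sketched as follows: the logistic map x(n+1) = r*x(n)*(1 - x(n)) generates a keystream whose bytes are XORed with the payload, and XORing again with the same key recovers the data. The parameter r, the seed, and the quantization to bytes below are illustrative choices, not values from the study.

```python
def logistic_keystream(x0, r, n):
    """Generate n key bytes by iterating the logistic map from seed x0."""
    key, x = [], x0
    for _ in range(n):
        x = r * x * (1.0 - x)            # chaotic for r near 4
        key.append(int(x * 256) % 256)   # quantize the state to one byte
    return bytes(key)

def xor_bytes(data, key):
    """XOR payload with keystream; applying it twice restores the data."""
    return bytes(d ^ k for d, k in zip(data, key))
```

Both ends of the link derive the same keystream from the shared seed and parameter, so no key material travels with the data.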
The intellectual property of digital documents has been protected by many digital watermarking methods. Digital documents have many advantages over print documents: they are less expensive and easier to store, transport, and search. But they have their own limitations too: a simple image editor can be used to modify a document and produce a forgery, so digital documents can be tampered with easily. To exploit the full benefits of digital documents, these limitations must be overcome by embedding some text or logo sequence that identifies the owner of the document.
In this research, the LSB technique has been used
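Least-significant-bit (LSB) embedding, the technique the abstract names, hides each payload bit in the lowest bit of one cover byte (for example, a pixel channel), changing each carrier byte by at most 1. A minimal sketch, with illustrative names not taken from the paper:

```python
def embed_lsb(cover, payload):
    """Hide payload bits (MSB first) in the LSBs of the cover bytes."""
    bits = [(byte >> k) & 1 for byte in payload for k in range(7, -1, -1)]
    assert len(bits) <= len(cover), "cover too small for payload"
    out = bytearray(cover)
    for i, b in enumerate(bits):
        out[i] = (out[i] & 0xFE) | b     # replace the lowest bit only
    return bytes(out)

def extract_lsb(stego, n_bytes):
    """Read back n_bytes hidden by embed_lsb."""
    out = bytearray()
    for i in range(n_bytes):
        byte = 0
        for k in range(8):
            byte = (byte << 1) | (stego[8 * i + k] & 1)
        out.append(byte)
    return bytes(out)
```

Because only the lowest bit of each byte changes, the watermark is visually imperceptible in typical images, though it is also fragile to re-encoding.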
The estimation of linear regression parameters is usually based on the Least Squares method, which rests on several basic assumptions, so the accuracy of the parameter estimates depends on the validity of these hypotheses. Among these assumptions are homogeneity of variance and normally distributed errors, and they become unattainable, and the usual model unrealistic, when the studied problem involves complex data drawn from more than one model. The most successful technique in this setting is the robust estimation method, minimizing a maximum likelihood-type estimator (MM-estimator), which has proved its efficiency for this purpose.
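As an illustration of robust fitting of this kind, here is a minimal pure-Python sketch of the iteratively reweighted least squares (IRLS) M-step with Huber weights. A full MM-estimator additionally starts from a high-breakdown S-estimate of scale, which is omitted here; the tuning constant and all names are illustrative, not the paper's implementation.

```python
from statistics import median

def wls(x, y, w):
    """Closed-form weighted least squares for y ≈ a + b*x."""
    sw = sum(w)
    sx = sum(wi * xi for wi, xi in zip(w, x))
    sy = sum(wi * yi for wi, yi in zip(w, y))
    sxx = sum(wi * xi * xi for wi, xi in zip(w, x))
    sxy = sum(wi * xi * yi for wi, xi, yi in zip(w, x, y))
    b = (sw * sxy - sx * sy) / (sw * sxx - sx * sx)
    return (sy - b * sx) / sw, b          # (intercept, slope)

def huber_fit(x, y, c=1.345, iters=50):
    """IRLS with Huber weights: downweight points with large residuals."""
    w = [1.0] * len(x)
    for _ in range(iters):
        a, b = wls(x, y, w)
        r = [yi - (a + b * xi) for xi, yi in zip(x, y)]
        scale = max(median(abs(ri) for ri in r) / 0.6745, 1e-12)  # MAD scale
        w = [1.0 if abs(ri) <= c * scale else c * scale / abs(ri) for ri in r]
    return wls(x, y, w)
```

Unlike ordinary least squares, whose fit a single gross outlier can drag arbitrarily far, the Huber weights cap each point's influence, so the fitted line stays close to the bulk of the data.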