Ensuring reliable data transmission in a Network-on-Chip (NoC) is one of the most challenging tasks, especially in noisy environments: crosstalk, interference, and radiation effects have grown as manufacturers shrink feature sizes, raise operating frequencies, and lower supply voltages. Consequently, many Error Control Codes (ECC) have been proposed, with different error detection and correction capacities and various degrees of complexity. The Code with Crosstalk Avoidance and Error Correction (CCAEC) for network-on-chip interconnects uses simple parity check bits as its main technique to achieve a high error correction capacity; per this work, it corrects up to 12 random errors, a high correction capability compared with many other coding schemes, though at the cost of a large codeword size. In this work, the CCAEC code is compared with another well-known scheme, the Horizontal-Vertical-Diagonal (HVD) error detecting and correcting code, through a reliability analysis: a new, accurate mathematical model for the probability of residual error, Pres, is derived for both schemes and confirmed by simulation results. The results showed that the HVD code corrects all single, double, and triple errors and fails on only 3.3% of quadruple-error patterns, whereas the CCAEC code corrects any single error but fails in 1.5%, 7.2%, and 16.4% of double-, triple-, and quadruple-error cases, respectively. As a result, HVD offers better reliability than CCAEC along with lower overhead, making it a promising coding scheme for handling the reliability issues of NoC.
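As a minimal sketch of the idea behind HVD parity coding (not the exact codec analyzed in the abstract), the example below arranges data bits in a matrix, computes horizontal, vertical, and two diagonal parity sets, and corrects a single flipped bit at the intersection of the failing row and column parities. Function names and matrix dimensions are illustrative assumptions.

```python
import numpy as np

def hvd_parities(M):
    """Horizontal, vertical, and (anti-)diagonal parity bits of a 0/1 matrix."""
    r, c = M.shape
    h = M.sum(axis=1) % 2                 # horizontal: one parity bit per row
    v = M.sum(axis=0) % 2                 # vertical: one parity bit per column
    d = np.zeros(r + c - 1, dtype=int)    # anti-diagonals: i + j = const
    a = np.zeros(r + c - 1, dtype=int)    # diagonals: i - j = const (shifted)
    for i in range(r):
        for j in range(c):
            d[i + j] ^= int(M[i, j])
            a[i - j + c - 1] ^= int(M[i, j])
    return h, v, d, a

def correct_single_error(M, h, v):
    """Flip the single erroneous bit located by mismatched row/column parities."""
    h2, v2, _, _ = hvd_parities(M)
    rows = np.flatnonzero(h2 != h)
    cols = np.flatnonzero(v2 != v)
    if len(rows) == 1 and len(cols) == 1:  # exactly one row and one column disagree
        M[rows[0], cols[0]] ^= 1
    return M
```

The diagonal parities are not needed for single-error correction; in HVD-style codes they add detection and correction strength for multi-bit error patterns.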
The present study discusses the significant role of historical memory in all aspects of life in Spanish society. When a novelist assumes the role, and puts on the mask, of one of the novel’s protagonists or hidden characters, his memory of events becomes the key to accessing the close-knit fabric of society, shedding light on deteriorating social conceptions in a backward social reality that rejects all new progressive ideas and modernity. By concentrating on society’s flawed aspects and drawing on everything in his stored memory, the author uses sarcasm to criticize and change such old, deteriorating conceptions of reality.
The estimation of linear regression parameters is usually based on the Least Squares method, which rests on several basic assumptions; the accuracy of the estimated parameters therefore depends on the validity of these assumptions, among them homogeneity of variance and normally distributed errors. These assumptions are not achievable when studying a problem that may involve complex data drawn from more than one model, and the model then becomes unrealistic. The most successful alternative is robust estimation, in particular the minimizing maximum likelihood estimator (MM-estimator), which has proved efficient for this purpose. To …
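As a rough illustration of the robust-regression idea, the sketch below implements M-estimation by iteratively reweighted least squares with Huber weights — a simpler relative of the MM-estimator mentioned above, not the abstract's method. The helper names, tuning constant, and iteration count are assumptions.

```python
import numpy as np

def huber_weights(r, k=1.345):
    """Downweight observations whose standardized residual exceeds k."""
    a = np.abs(r)
    w = np.ones_like(r)
    mask = a > k
    w[mask] = k / a[mask]
    return w

def irls_huber(X, y, iters=50):
    """Robust linear fit: start from OLS, reweight by Huber weights, repeat."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]   # ordinary least squares start
    for _ in range(iters):
        r = y - X @ beta
        # robust scale estimate via the median absolute deviation (MAD)
        s = max(np.median(np.abs(r - np.median(r))) / 0.6745, 1e-12)
        w = huber_weights(r / s)
        sw = np.sqrt(w)
        beta = np.linalg.lstsq(sw[:, None] * X, sw * y, rcond=None)[0]
    return beta
```

Unlike plain OLS, the fit is barely moved by a block of gross outliers, which is the property the abstract's assumption discussion motivates.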
The problem of estimating the frequency of a single sinusoid observed in colored noise is addressed. Our estimator is based on the operation of the sinusoidal digital phase-locked loop (SDPLL), which carries the frequency information in its phase error after the noisy sinusoid has been acquired. We show by computer simulations that this frequency estimator beats the Cramer-Rao bound (CRB) on the frequency error variance for moderate and high SNRs when the colored noise has a general low-pass-filtered (LPF) characteristic, thereby outperforming, in terms of frequency error variance, several existing techniques, some of which are, in addition, computationally demanding. Moreover, the present approach generalizes existing work tha…
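The mechanism can be sketched with a generic second-order digital PLL (not the abstract's SDPLL design): a multiplier phase detector drives proportional and integral updates of the loop phase, and the integrator's final value is read out as the frequency estimate. The loop gains and signal parameters below are illustrative assumptions.

```python
import numpy as np

def pll_freq_estimate(x, fs, f0, k1=0.05, k2=0.002):
    """Estimate the frequency (Hz) of a sinusoid x by locking a digital PLL
    initialized near f0; k1/k2 are proportional/integral loop gains."""
    phase = 0.0
    freq = 2.0 * np.pi * f0 / fs          # loop frequency in rad/sample
    for sample in x:
        err = sample * (-np.sin(phase))   # multiplier phase detector
        freq += k2 * err                  # integral path: tracks frequency offset
        phase += freq + k1 * err          # proportional path: tracks phase
    return freq * fs / (2.0 * np.pi)      # back to Hz
```

After lock, the integral path holds the input frequency (up to double-frequency ripple), which is the sense in which the loop "carries the frequency information in its phase error."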
The total and individual multipole moments of the magnetic electron scattering form factors in 41Ca have been investigated using the widely successful nuclear shell model, with the 1f7/2 subshell taken as the model space and the McCullen, Bayman, and Zamick (F7MBZ) interaction used to generate the model-space wave functions. Harmonic-oscillator wave functions have been used as the single-particle wave functions in the 1f7/2 shell. 40Ca was taken as a closed core, and core-polarization effects were included as a first-order correction to the basic model-space calculation, with excitation energies of 2ħω. The …
Wireless Body Area Sensor Networks (WBASNs) are gaining significant attention due to their applications in smart health, offering cost-effective, efficient, ubiquitous, and unobtrusive telemedicine. WBASNs face challenges including interference, Quality of Service, transmit power, and resource constraints. Recognizing these challenges, this paper presents an energy- and Quality-of-Service-aware routing algorithm. The proposed algorithm is based on each node's Collaboratively Evaluated Value (CEV) to select the most suitable cluster head (CH). The Collaborative Value (CV) is derived from three factors: the node's residual energy, the distance between the node and the personal device, and the sensor density in each cluster. The CEV algorithm operates i…
In our research, we dealt with one of the most important issues in linguistic studies of the Holy Qur’an: words that are close in meaning, which some believe to be synonyms, but which are not considered synonyms in the Arabic language because of the subtle differences between them. Synonyms in Arabic are very few, indeed rare, and in the Holy Qur’an they are entirely absent. The study examines how these words, close in meaning, were rendered in Almir Kuliev's Russian translation of the Holy Qur’an.
An experimental study was conducted to measure the surface roughness produced by the magnetic abrasive finishing (MAF) technique on a brass plate, a material that is very difficult to polish by conventional machining, where the cost is high and the surface is much more susceptible to damage compared with other materials. Four operating parameters were studied: the gap between the workpiece and the electromagnetic inductor, the current that generates the flux, the rotational spindle speed, and the abrasive powder size, with a constant linear feed between the machine head and the workpiece. An Adaptive Neuro-Fuzzy Inference System (ANFIS) was implemented for the evaluation of a series …
The utilization of artificial intelligence techniques has garnered significant interest in recent research due to their pivotal role in enhancing the quality of educational offerings. This study investigated the impact of employing artificial intelligence techniques on improving the quality of educational services, as perceived by students enrolled in the College of Pharmacy at the University of Baghdad. The study sample comprised 379 male and female students. A descriptive-analytical approach was used, with a questionnaire as the primary tool for data collection. The findings indicated that the application of artificial intelligence methods was highly effective, and the educational services provided to students were of exceptional quality.
Machining residual stresses correlate closely with the cutting parameters and the tool geometry. This research work investigates the effect of cutting speed, feed rate, and depth of cut on the surface residual stress of AISI 1045 steel after a face milling operation. After each milling test, the residual stress on the workpiece surface was measured using the X-ray diffraction technique. Design of Experiments (DOE) software was employed, using the response surface methodology (RSM) with a central composite rotatable design, to build a mathematical model relating the input variables to the response. The results showed that both …
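The RSM step described above amounts to fitting a second-order polynomial in the three factors by least squares. A minimal sketch with NumPy follows; the factor names, ranges, and the synthetic response used below are assumptions, not the paper's data.

```python
import numpy as np

def quadratic_features(X):
    """Second-order RSM design matrix for three factors:
    intercept, linear, two-factor interaction, and squared terms."""
    v, f, d = X[:, 0], X[:, 1], X[:, 2]   # e.g. speed, feed, depth of cut
    return np.column_stack([np.ones(len(X)),
                            v, f, d,
                            v * f, v * d, f * d,
                            v**2, f**2, d**2])

def fit_rsm(X, y):
    """Least-squares fit of the quadratic response surface."""
    coeffs, *_ = np.linalg.lstsq(quadratic_features(X), y, rcond=None)
    return coeffs
```

In a real RSM study the rows of X would come from the central composite design points and y from the measured residual stresses; significance of each coefficient would then be checked by ANOVA.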
For several applications, it is very important to have an edge detection technique that matches human visual contour perception and is less sensitive to noise. The edge detection algorithm described in this paper is based on the results obtained by Maximum a Posteriori (MAP) and Maximum Entropy (ME) deblurring algorithms. The technique makes a trade-off between sharpening and smoothing the noisy image. One advantage of the described algorithm is that it is less sensitive to noise than the Marr and Geuen techniques, which are considered among the best edge detection algorithms in terms of matching human visual contour perception.