Ensuring reliable data transmission in a Network on Chip (NoC) is one of the most challenging tasks, especially in noisy environments, since crosstalk, interference, and radiation have increased with manufacturers' tendency to shrink chip area, raise operating frequencies, and lower supply voltages. Many Error Control Codes (ECC) have therefore been proposed, with different error detection and correction capacities and various degrees of complexity. The Code with Crosstalk Avoidance and Error Correction (CCAEC) for network-on-chip interconnects uses simple parity check bits as its main technique to achieve high error correction capacity. According to this work, this coding scheme corrects up to 12 random errors, a high correction capacity compared with many other coding schemes, but at the cost of a large codeword size. In this work, the CCAEC code is compared with another well-known scheme, the Horizontal-Vertical-Diagonal (HVD) error detecting and correcting code, through a reliability analysis: a new, accurate mathematical model for the probability of residual error Pres is derived for both coding schemes and confirmed by simulation results for both. The results show that the HVD code corrects all single, double, and triple errors and fails on only 3.3% of quadruple-error patterns, whereas the CCAEC code corrects any single error but fails in 1.5%, 7.2%, and 16.4% of double-, triple-, and quadruple-error cases, respectively. As a result, the HVD code offers better reliability than CCAEC along with lower overhead, making it a promising coding scheme for handling reliability issues in NoC.
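To make the parity-based error correction idea concrete, the following is a minimal sketch of the horizontal-vertical portion of an HVD-style code: data bits sit in a matrix with one parity bit per row and per column, and a single flipped bit is located at the intersection of the failing row and column checks. This is illustrative only; the full HVD code described in the abstract adds two diagonal parity sets, and the data matrix here is made up.

```python
# Simplified H/V parity sketch (the full HVD code also uses diagonals).

def encode(matrix):
    """Return (row_parities, col_parities) for a 0/1 data matrix."""
    rows = [sum(r) % 2 for r in matrix]
    cols = [sum(c) % 2 for c in zip(*matrix)]
    return rows, cols

def correct_single_error(matrix, rows, cols):
    """Locate and flip a single erroneous data bit, if any."""
    bad_r = [i for i, r in enumerate(matrix) if sum(r) % 2 != rows[i]]
    bad_c = [j for j, c in enumerate(zip(*matrix)) if sum(c) % 2 != cols[j]]
    if len(bad_r) == 1 and len(bad_c) == 1:
        i, j = bad_r[0], bad_c[0]
        matrix[i][j] ^= 1          # flip the bad bit back
    return matrix

data = [[1, 0, 1, 1],
        [0, 1, 1, 0],
        [1, 1, 0, 0]]
rp, cp = encode(data)

received = [row[:] for row in data]
received[1][2] ^= 1                # inject one bit error "in transit"
correct_single_error(received, rp, cp)
assert received == data            # single error located and corrected
```

The diagonal parities in the real HVD scheme are what allow it to resolve the ambiguous patterns (multiple failing rows and columns) that defeat this simplified version.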
Adverse drug reactions (ADR) are important information for verifying a patient's view of a particular drug. Regular user comments and reviews were considered during data collection to extract ADR mentions, where a user reported a side effect after taking a specific medication. In the literature, most researchers have focused on machine learning techniques to detect ADRs; these methods train a classification model on annotated medical review data. Yet many challenging issues still face ADR extraction, especially detection accuracy. The main aim of this study is to propose LSA with ANN classifiers for ADR detection. The findings show the effectiveness of utilizing LSA with ANN in extracting ADR
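As a rough sketch of the LSA step this abstract proposes, a term-document count matrix built from user reviews can be factored with truncated SVD, giving each review a low-dimensional latent vector that would then feed the ANN classifier (omitted here). The vocabulary and the four toy reviews below are made up for illustration and are not from the study's data.

```python
import numpy as np

# Toy term-document matrix from made-up drug reviews.
docs = ["headache after taking drug",
        "drug caused severe headache",
        "no side effect good drug",
        "drug worked well no effect"]
vocab = sorted({w for d in docs for w in d.split()})
X = np.array([[d.split().count(w) for w in vocab] for d in docs], float)

# LSA = truncated SVD: keep the k strongest latent dimensions.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
doc_vecs = U[:, :k] * s[:k]    # k-dim latent representation per review

# doc_vecs (one row per review) would be the input features for an
# ANN classifier labeling each review as ADR / non-ADR.
```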
In general, the importance of cluster analysis is that elements can be evaluated by clustering multiple homogeneous data; the main objective of this analysis is to group the elements into homogeneous divisions depending on many variables. This method of analysis is used to reduce data, generate and test hypotheses, and predict and match models. The research aims to evaluate fuzzy cluster analysis, a special case of cluster analysis, and to compare the two methods, classical and fuzzy cluster analysis. The research topic covers government and private hospitals; the sample comprised 288 patients being treated in 10 hospitals. As t
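The difference between the two methods the research compares can be sketched briefly: classical (crisp) clustering assigns each element to exactly one group, while fuzzy cluster analysis gives each element a degree of membership in every group. Below is a minimal fuzzy c-means sketch on made-up one-dimensional data; the fuzzifier m = 2 and the data values are illustrative assumptions, not from the hospital sample.

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, iters=50, seed=0):
    """Minimal fuzzy c-means: returns (centers, membership matrix U)."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)        # memberships sum to 1
    for _ in range(iters):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None], axis=2) + 1e-12
        U = 1.0 / (d ** (2 / (m - 1)))       # closer center => higher weight
        U /= U.sum(axis=1, keepdims=True)
    return centers, U

# Two well-separated made-up groups of measurements.
X = np.array([[1.0], [1.2], [0.8], [8.0], [8.3], [7.9]])
centers, U = fuzzy_c_means(X)
# Each row of U gives the element's graded membership in both clusters,
# which is the information a crisp k-means assignment discards.
```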
Summary: This article discusses phraseological units containing the names of wild animals in the Russian and Arabic languages, from the perspective of comparative semantic and cultural analysis. A comparative analysis of the meanings of phraseological units in Arabic and Russian, detecting coincidences and differences between the compared languages, is an important method in linguoculturology, since phraseological units reflect culture in language.
The research aims to define the impact of job analysis and evaluation in supporting employee performance. Job analysis and evaluation is one of the core functions of human resource management in an organization; it has a significant impact on the characteristics and performance of its people and, in turn, on the organization's success. The research problem lies in upper management's neglect of the role of job analysis and evaluation in supporting employee performance. Polls were adopted as tools for obtaining data, and the Depart
This paper aims to identify the approaches used by Iraqi banks to assess credit applications, as well as which approach is most used. It also attempts to link these approaches with the reduction of credit default and with banks' efficiency, particularly for the Gulf Commercial Bank. The paper found that the Gulf Bank relies widely on the judgment approach to assess credit applications in order to select the best of them with a low risk of default. It also found that the judgment approach was very important for the Gulf Bank and contributed to reducing the ratio of credit default as a percentage of total credit. However, it is important to note that the adoption of statistical approaches for
Linear discriminant analysis and logistic regression are the most widely used multivariate statistical methods for analyzing data with categorical outcome variables. Both are appropriate for developing linear classification models. Linear discriminant analysis assumes that the explanatory variables follow a multivariate normal distribution, while logistic regression makes no assumptions about their distribution. Hence, logistic regression is assumed to be the more flexible and more robust method when these assumptions are violated.
In this paper, we focus on a comparison between three forms of classification for data belongs
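The contrast drawn above can be sketched on synthetic data: LDA has a closed form built from class means and a pooled covariance (relying on the normality assumption), while logistic regression fits its weights iteratively with no distributional assumption. The two-Gaussian data, learning rate, and iteration count below are illustrative assumptions, not the paper's setup.

```python
import numpy as np

# Synthetic two-class Gaussian data (where LDA's assumption holds).
rng = np.random.default_rng(1)
X0 = rng.normal([0, 0], 1.0, (100, 2))
X1 = rng.normal([2, 2], 1.0, (100, 2))
X = np.vstack([X0, X1])
y = np.array([0] * 100 + [1] * 100)

# --- LDA (equal priors): project on Sw^-1 (mu1 - mu0) ---
mu0, mu1 = X0.mean(0), X1.mean(0)
Sw = np.cov(X0.T) + np.cov(X1.T)          # pooled within-class scatter
w_lda = np.linalg.solve(Sw, mu1 - mu0)
thresh = w_lda @ (mu0 + mu1) / 2          # midpoint between class means
pred_lda = (X @ w_lda > thresh).astype(int)

# --- logistic regression via plain gradient descent ---
Xb = np.hstack([X, np.ones((len(X), 1))]) # add bias column
w = np.zeros(3)
for _ in range(2000):
    p = 1 / (1 + np.exp(-Xb @ w))         # sigmoid probabilities
    w -= 0.1 * Xb.T @ (p - y) / len(y)    # gradient of log-loss
pred_lr = (Xb @ w > 0).astype(int)

acc_lda = (pred_lda == y).mean()
acc_lr = (pred_lr == y).mean()
# On data that satisfies LDA's normality assumption, both linear
# classifiers reach similar accuracy; the difference appears when
# the assumption is violated.
```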
Background: Measuring implant stability is an important issue in predicting treatment success. Dental implant stability is usually measured through resonance frequency analysis (RFA). Osstell® RFA devices can be used with transducers (Smartpeg™) that correspond to the implants used, as well as with transducers designed for application with Penguin® RFA devices (Multipeg™). Aims: This study aims to assess the reliability of a MultiPeg™ transducer with an Osstell® device in measuring dental implant stability. Materials and Methods: Sixteen healthy participants who required dental implant treatment were enrolled in this study. Implant stability was measured using an Osstell® device with two transducers, namely Smartpeg™ and MultiPeg™
This paper includes a comparison between denoising techniques using a statistical approach, principal component analysis with local pixel grouping (PCA-LPG); this procedure is iterated a second time to further improve the denoising performance. Other enhancement filters are used as well: an adaptive Wiener low-pass filter applied to a grayscale image degraded by constant-power additive noise, based on statistics estimated from a local neighborhood of each pixel; a median filter of the noisy input image, in which each output pixel contains the median value of the M-by-N neighborhood around the corresponding pixel in the input image; a Gaussian low-pass filter; and an order-statistic filter.
Experimental results show that LPG-
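One of the baseline filters listed above, the median filter, can be sketched in a few lines: each output pixel is the median of its M-by-N neighborhood (3x3 here), which suppresses impulse noise while preserving edges. The grayscale grid is made up, and borders are handled by clamping indices, one of several possible boundary policies.

```python
from statistics import median

def median_filter_3x3(img):
    """3x3 median filter over a 2-D grayscale list-of-lists image."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            # Clamp neighborhood indices at the image borders.
            neigh = [img[min(max(i + di, 0), h - 1)]
                        [min(max(j + dj, 0), w - 1)]
                     for di in (-1, 0, 1) for dj in (-1, 0, 1)]
            out[i][j] = median(neigh)
    return out

img = [[10, 10, 10, 10],
       [10, 255, 10, 10],   # one "salt" impulse-noise pixel
       [10, 10, 10, 10]]
clean = median_filter_3x3(img)
assert clean[1][1] == 10     # the impulse is removed by the median
```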
This paper deals with constructing a mixed probability distribution from an exponential distribution with scale parameter (β) and a Gamma distribution with (2, β), with mixing proportions ( . First, the probability density function (p.d.f.), the cumulative distribution function (c.d.f.), and the reliability function are obtained. The parameters of the mixed distribution, ( , β), are estimated by three different methods: maximum likelihood, the method of moments, and a proposed method, the Differential Least Squares Method (DLSM). The comparison is carried out via a simulation procedure, and all results are presented in tables.
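The mixture described above can be written out directly. With mixing proportion p (a placeholder symbol, since the abstract's proportions are garbled), the pdf is f(x) = p·(1/β)e^(−x/β) + (1−p)·(x/β²)e^(−x/β), and the reliability function is R(x) = 1 − F(x). The sketch below codes these functions and checks the moment identity E[X] = pβ + (1−p)·2β = β(2 − p) that a moment estimator would invert; the values of p and β are made up.

```python
import math

def pdf(x, p, beta):
    """Mixture of Exponential(beta) and Gamma(2, beta) densities."""
    return (p * math.exp(-x / beta) / beta
            + (1 - p) * x * math.exp(-x / beta) / beta**2)

def cdf(x, p, beta):
    return (p * (1 - math.exp(-x / beta))
            + (1 - p) * (1 - math.exp(-x / beta) * (1 + x / beta)))

def reliability(x, p, beta):
    return 1 - cdf(x, p, beta)

p, beta = 0.4, 2.0
dx = 0.001
xs = [i * dx for i in range(1, 60000)]          # grid far into the tail
total = sum(pdf(x, p, beta) * dx for x in xs)   # should be ~1
mean = sum(x * pdf(x, p, beta) * dx for x in xs)
# mean should be ~beta * (2 - p) = 3.2, the basis of moment estimation.
```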