Reliability Analysis of Multibit Error Correcting Coding and Comparison to Hamming Product Code for On-Chip Interconnect

Error control schemes have become a necessity in network-on-chip (NoC) design to improve reliability, as on-chip interconnect errors increase with the continuous shrinking of geometry. Accordingly, many researchers are trying to devise multi-bit error correcting coding schemes that offer high error correction capability with the simplest possible design, to minimize area and power consumption. A recent work, Multi-bit Error Correcting Coding with Reduced Link Bandwidth (MECCRLB), showed a huge reduction in area and power consumption compared to a well-known scheme, namely Hamming product code (HPC) with Type-II HARQ. Moreover, its authors showed that the proposed scheme can correct 11 random errors, which is a high number of errors for any scheme used in NoC. The high correction capability with a moderate number of check bits, along with the reduction in power and area, calls for further investigation of the accuracy of the reliability model. In this paper, reliability analysis is performed by modeling the residual error probability P_residual, which represents the probability of decoder error or failure. A new model to estimate P_residual of MECCRLB is derived, validated against simulation, and compared to HPC to assess the capability of MECCRLB. The results show that HPC outperforms MECCRLB from the reliability perspective: the former corrects all single and double errors and fails in 5.18% of triple-error cases, whereas the latter corrects all single errors but fails in 32.5% of double errors and 38.97% of triple errors.
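The residual error probability of a t-error-correcting code can be sketched under a simple binomial channel assumption (independent bit flips with crossover probability p over an n-bit block); this is a generic illustration of the quantity P_residual models, not the paper's exact derivation for MECCRLB or HPC:

```python
from math import comb

def p_residual(n: int, t: int, p: float) -> float:
    """Probability that more than t of n transmitted bits flip, i.e. the
    error pattern exceeds the capability of a t-error-correcting code."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(t + 1, n + 1))
```

Under this model a stronger code (larger t) or a cleaner link (smaller p) drives P_residual down; the paper's contribution is an accurate model of this probability for the specific MECCRLB decoder.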

 

Publication Date
Tue Mar 21 2023
Journal Name
International Journal Of Professional Business Review
Analysis of the Impact of Six Sigma and Risk Management on Iraq's Energy Sector Metrology

Purpose: The study aims to show how measurement management can be enhanced by incorporating a risk-based approach and the Six Sigma method into a more thorough assessment of metrological performance. Theoretical framework: Recent literature has recorded good results in analyzing the impact of Six Sigma and risk management on the energy sector (Barrera García et al., 2022; D'Emilia et al., 2015). However, this research validates and emphasizes a more comprehensive assessment of metrological performance by integrating a risk-management-based approach with Six Sigma analysis. Design/methodology/approach: This study was conducted in Iraqi petroleum refining companies. System quality is measured in terms of sigmas, and t…

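Measuring "system quality in terms of sigmas" conventionally means converting defects per million opportunities (DPMO) into a sigma level with the customary 1.5-sigma long-term shift; the sketch below is that generic Six Sigma conversion, not the study's own metrology model:

```python
from statistics import NormalDist

def sigma_level(dpmo: float) -> float:
    """Convert defects per million opportunities to a Six Sigma level,
    applying the conventional 1.5-sigma long-term shift."""
    yield_fraction = 1.0 - dpmo / 1_000_000
    return NormalDist().inv_cdf(yield_fraction) + 1.5
```

By this convention the canonical 3.4 DPMO corresponds to a 6-sigma process and roughly 66,807 DPMO to a 3-sigma process.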
Publication Date
Mon Jun 01 2015
Journal Name
International Journal Of Advanced Research In Computer Science And Software Engineering
Performance Comparison of Transport Layer Protocols

The transport layer is responsible for delivering data to the appropriate application process on the host computers. The two most popular transport layer protocols are the Transmission Control Protocol (TCP) and the User Datagram Protocol (UDP). TCP is considered one of the most important protocols on the Internet, while UDP is a minimal message-oriented transport layer protocol. In this paper we compare the performance of TCP and UDP on a wired network. Network Simulator 2 (NS2) is used for the performance comparison, since it is preferred by the networking research community. Constant bit rate (CBR) traffic is used for both TCP and UDP.
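The metrics usually extracted from an NS2 trace for such a TCP-vs-UDP comparison, throughput and packet delivery ratio, reduce to simple arithmetic over sent/received packet counts; the helper names below are illustrative, not part of the NS2 API:

```python
def throughput_bps(pkts_received: int, pkt_bytes: int, duration_s: float) -> float:
    """Application-level throughput in bits per second."""
    return pkts_received * pkt_bytes * 8 / duration_s

def delivery_ratio(pkts_sent: int, pkts_received: int) -> float:
    """Fraction of CBR packets that reached the sink."""
    return pkts_received / pkts_sent
```

With CBR over UDP, packets lost in the wired queue lower both metrics directly; TCP instead retransmits, trading delay for delivery ratio.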

Publication Date
Tue Mar 03 2009
Journal Name
Journal Of Economics And Administrative Sciences
Comparison of recursive estimation methods for self-correlated data
...Show More Authors

In this study, we briefly review the ARIMA(p, d, q), EWMA, and DLM (dynamic linear modelling) procedures in order to accommodate the autocorrelation (AC) structure of the data. We consider recursive estimation and prediction algorithms based on Bayes and Kalman filtering (KF) techniques for correlated observations. We investigate the effect on the MSE of these procedures and compare them using generated data.
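The recursive KF estimation the abstract refers to can be sketched for the simplest DLM, the local-level model y_t = mu_t + v_t, mu_t = mu_{t-1} + w_t; the noise variances below are illustrative defaults, not the study's fitted values:

```python
def kalman_level(ys, q=0.1, r=1.0, m0=0.0, c0=1.0):
    """Recursive Kalman filter for the local-level DLM:
    y_t = mu_t + v_t (observation variance r),
    mu_t = mu_{t-1} + w_t (state variance q)."""
    m, c = m0, c0
    estimates = []
    for y in ys:
        c_pred = c + q                  # predict: state uncertainty grows by q
        gain = c_pred / (c_pred + r)    # Kalman gain
        m = m + gain * (y - m)          # update the level with the innovation
        c = (1.0 - gain) * c_pred       # posterior variance
        estimates.append(m)
    return estimates
```

Each observation is absorbed in O(1) work, which is what makes the recursive Bayes/KF formulation attractive for correlated time-series data compared with refitting a full model at every step.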

Publication Date
Wed Jun 30 2021
Journal Name
Journal Of Economics And Administrative Sciences
Comparison of Hurst exponent estimation methods

In recent years many researchers have developed methods to estimate the self-similarity and long-memory parameter, best known as the Hurst parameter. In this paper, we compare nine different methods. Most of them use the slope of deviations to estimate the Hurst parameter, such as Rescaled Range (R/S), Aggregate Variance (AV), and Absolute Moments (AM), while some depend on a filtration technique, such as Discrete Variations (DV), Variance Versus Level using wavelets (VVL), and Second-Order Discrete Derivative using Wavelets (SODDW). The comparison was set up as a simulation study to find the most efficient method through MASE. The simulation results showed that the performance of the meth…


Publication Date
Thu Jun 01 2017
Journal Name
Journal Of Economics And Administrative Sciences
Customers emotional blackmail and reduce it the new product- study of the opinions of a sample of customers who deal with peak economy for household items in najaf al Ashraf

The challenges facing today's customer are many, owing to the multiplicity of products and the speed with which new products are launched. This research therefore seeks to reveal the classification standards of the new product through the relationship among good products, low-interest products, useful products, and desired products, and the customer's emotional blackmail through fear, obligation, and guilt. The research problem was identified in several questions focused on the nature of the relationship between the research variables, and a hypothetical outline was drawn accordingly, expressed in one main hypothesis from which four sub-hypotheses branched; but in order to ensure the validity of the ass…

Publication Date
Sun Oct 01 2017
Journal Name
Journal Of Economics And Administrative Sciences
''The use of factor analysis to identify the leading factors to high blood pressure.''A field study in Baghdad hospitals

Abstract:

In view of the fact that high blood pressure is one of the serious human diseases that a person can develop without feeling it, and that it is caused by many factors, it became necessary to research this subject and to reduce these many factors to specific causes by studying them using factor analysis.

The researcher thus arrived at five factors that explain only 71% of the total variation in the phenomenon under study, where overweight, heavy alcohol consumption, smoking, and lack of exercise are the most influential causes of this disease.
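The "71% of total variation" figure corresponds to the standard factor-analysis ratio of retained eigenvalues to total variance; a minimal sketch with illustrative eigenvalues (not the study's data):

```python
def variance_explained(eigenvalues, k):
    """Proportion of total variance explained by the k largest factors
    of a correlation/covariance matrix, given its eigenvalues."""
    ordered = sorted(eigenvalues, reverse=True)
    return sum(ordered[:k]) / sum(ordered)
```

Under the common Kaiser rule, only factors with eigenvalue above 1 are retained, and the ratio above reports how much of the phenomenon those retained factors account for.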

Publication Date
Tue Dec 01 2020
Journal Name
Gulf Economist
The Bayesian Estimation in Competing Risks Analysis for Discrete Survival Data under Dynamic Methodology with Application to Dialysis Patients in Basra/ Iraq

Survival analysis is a type of data analysis that describes the time period until the occurrence of an event of interest, such as death or another event of importance in determining what will happen to the studied phenomenon. There may be more than one endpoint for the event, in which case the setting is called competing risks. The purpose of this research is to apply the dynamic approach to the analysis of discrete survival time in order to estimate the effect of covariates over time, as well as to model the nonlinear relationship between the covariates and the discrete hazard function through the use of the multinomial logistic model and the multivariate Cox model. For the purpose of conducting the estimation process for both the discrete…

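In discrete-time competing risks, the multinomial logistic model the abstract mentions turns each interval into a softmax over "no event" plus one category per risk; a minimal sketch (the linear predictors are assumed inputs, not the study's fitted model):

```python
import math

def discrete_hazards(scores):
    """Cause-specific discrete hazards via multinomial logit (softmax).
    scores[r] is the linear predictor for competing risk r in this
    interval; the reference category 'survive the interval' has score 0."""
    z = [0.0] + list(scores)           # category 0 = no event this interval
    m = max(z)                         # shift for numerical stability
    e = [math.exp(v - m) for v in z]
    tot = sum(e)
    return [v / tot for v in e[1:]]    # hazard of each competing event
```

The hazards sum to strictly less than 1, the remainder being the probability of surviving the interval, which is exactly the structure a dynamic model exploits to let covariate effects vary over time.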
Publication Date
Fri Jan 01 2021
Journal Name
International Journal Of Agricultural And Statistical Sciences
DYNAMIC MODELING FOR DISCRETE SURVIVAL DATA BY USING ARTIFICIAL NEURAL NETWORKS AND ITERATIVELY WEIGHTED KALMAN FILTER SMOOTHING WITH COMPARISON

Survival analysis is widely applied to data describing the lifetime of an item until the occurrence of an event of interest, such as death or another event under study. The purpose of this paper is to use the dynamic approach within a deep-learning neural network method, building a dynamic neural network suited to the nature of discrete survival data and time-varying effects. This neural network is trained with the Levenberg-Marquardt (L-M) algorithm, and the method is called the Proposed Dynamic Artificial Neural Network (PDANN). A comparison was then made with another method that depends entirely on the Bayes methodology, called the Maximum A Posteriori (MAP) method. This method was carried out using numerical algorithms re…

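The Levenberg-Marquardt step underlying such training is a damped Gauss-Newton update; a didactic one-parameter sketch for the model y = a·x (with fixed damping, not the paper's network training code):

```python
def lm_fit_slope(xs, ys, a0=0.0, lam=1e-3, iters=25):
    """Fit y = a*x by Levenberg-Marquardt with a fixed damping factor lam.
    The model Jacobian of y_hat = a*x with respect to a is simply x."""
    a = a0
    for _ in range(iters):
        residuals = [y - a * x for x, y in zip(xs, ys)]
        jtj = sum(x * x for x in xs)                      # J^T J
        jtr = sum(x * r for x, r in zip(xs, residuals))   # J^T r
        a += jtr / (jtj + lam)                            # damped Gauss-Newton step
    return a
```

Large lam makes the step behave like small gradient descent; small lam approaches the fast Gauss-Newton step, which is why L-M is a popular trainer for small and medium networks.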
Publication Date
Fri Sep 15 2017
Journal Name
Journal Of Baghdad College Of Dentistry
Comparison of Shear Bond Strength of Sapphire Bracket Bonded to Zirconium Surface after Using Different Surface Conditioning Methods (In Vitro Study)

Background: The present study was carried out to compare the shear bond strength of sapphire brackets bonded to a zirconium surface after using different methods of surface conditioning, and to assess the adhesive remnant index. Materials and methods: The sample comprised 40 zirconium specimens divided into four groups: the first group was the control; the second group was conditioned by sandblasting with 50 μm aluminum oxide particles; the third and fourth groups were treated with an Nd:YAG laser (1064 nm), at 0.888 W for 5 seconds for the first laser group and 0.444 W for 10 seconds for the second laser group. All samples were coated with Z-Prime Plus primer. A central incisor sapphire bracket was bonded to all samples with light-cure adhesive res…

... Show More