Reliability Analysis of Multibit Error Correcting Coding and Comparison to Hamming Product Code for On-Chip Interconnect

Error control schemes have become a necessity in networks-on-chip (NoC) to improve reliability, as on-chip interconnect errors increase with the continuous shrinking of process geometry. Accordingly, many researchers are trying to present multi-bit error correction coding schemes that provide high error correction capability with the simplest possible design to minimize area and power consumption. A recent work, Multi-bit Error Correcting Coding with Reduced Link Bandwidth (MECCRLB), showed a huge reduction in area and power consumption compared to a well-known scheme, namely Hamming product code (HPC) with Type-II HARQ. Moreover, its authors showed that the proposed scheme can correct 11 random errors, which is a high number of errors for any scheme used in NoC. This high correction capability with a moderate number of check bits, together with the reduction in power and area, calls for further investigation into the accuracy of the reliability model. In this paper, a reliability analysis is performed by modeling the residual error probability P_residual, which represents the probability of decoder error or failure. A new model to estimate P_residual of MECCRLB is derived, validated against simulation, and compared to HPC to assess the capability of MECCRLB. The results show that HPC outperforms MECCRLB from a reliability perspective: the former corrects all single and double errors and fails in 5.18% of triple-error cases, whereas the latter corrects all single errors but fails in 32.5% of double-error and 38.97% of triple-error cases.
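
To make the quantity under study concrete, here is a minimal sketch of the textbook residual-error-probability formula for an idealized code that corrects up to t random bit errors; it is not the model derived in the paper for MECCRLB or HPC, and the codeword length, check-bit count, and bit error rate in the example are hypothetical.

```python
# Illustrative sketch (not the paper's derived MECCRLB model): residual error
# probability of an idealized code that corrects up to t random bit errors in
# an n-bit codeword, assuming independent bit errors with probability eps.
from math import comb

def residual_error_probability(n: int, t: int, eps: float) -> float:
    """P_residual = 1 - sum_{i=0}^{t} C(n, i) * eps^i * (1 - eps)^(n - i)."""
    p_correctable = sum(comb(n, i) * eps**i * (1 - eps)**(n - i) for i in range(t + 1))
    return 1.0 - p_correctable

# Example: hypothetical 32-bit flit with 6 check bits (n = 38), single- vs
# triple-error-correcting decoders, at a bit error rate of 1e-3.
for t in (1, 3):
    print(t, residual_error_probability(38, t, 1e-3))
```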

 

Publication Date: Sat Mar 01 2008
Journal Name: Al-khwarizmi Engineering Journal
Minimizing error in robot arm based on design optimization for high stiffness to weight ratio

In this work, the effect of choosing a tri-circular tube section is addressed to minimize the end effector's error. A comparison is made between the tri-tube section and the traditional square cross-section for a robot arm; the study shows that, for the same weight, the tri-tube section may reduce the error by about 33% compared with the square section.

A program was built using MathCAD software to calculate the minimum weight of a square-section robot arm that can withstand a given payload while giving minimum deflection. The second part of the program performs an optimization of the cross-section dimensions and gives the dimensions of the tri-circular tube cross-section that has the same weight as…

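As a rough illustration of the stiffness-to-weight reasoning above, here is a minimal sketch comparing the tip deflection of a cantilever arm for a solid square section and a thin-walled circular tube of equal cross-sectional area (equal weight per unit length). A single tube stands in for the paper's tri-circular section, the formula is standard Euler-Bernoulli beam theory rather than the paper's MathCAD program, and all dimensions and loads are hypothetical.

```python
# Illustrative sketch (not the paper's MathCAD program): cantilever tip
# deflection delta = F*L^3 / (3*E*I), comparing a solid square section with a
# thin-walled circular tube of the same cross-sectional area. All numbers are
# hypothetical placeholders.
import math

F, L, E = 100.0, 1.0, 70e9          # load [N], arm length [m], aluminium modulus [Pa]
a = 0.02                            # square side [m]
area = a * a                        # cross-sectional area to match [m^2]

I_square = a**4 / 12.0

r_outer = 0.03                      # chosen outer radius of the tube [m]
r_inner = math.sqrt(r_outer**2 - area / math.pi)   # inner radius giving the same area
I_tube = math.pi * (r_outer**4 - r_inner**4) / 4.0

for name, I in (("square", I_square), ("tube", I_tube)):
    print(name, F * L**3 / (3.0 * E * I))   # smaller deflection = stiffer arm
```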
Publication Date: Tue Sep 08 2020
Journal Name: Baghdad Science Journal
A comparison among Different Methods for Estimating Regression Parameters with Autocorrelation Problem under Exponentially Distributed Error

Multiple linear regression is concerned with studying and analyzing the relationship between a dependent variable and a set of explanatory variables, and from this relationship the values of the dependent variable are predicted. In this paper, a multiple linear regression model with three covariates is studied in the presence of autocorrelated errors when the random error follows an exponential distribution. Three methods are compared (generalized least squares, M-robust, and the Laplace robust method). Simulation studies are employed and the mean squared error criterion is calculated for sample sizes (15, 30, 60, 100). Further, the best method is applied to real experimental data representing the varieties of…

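The following minimal sketch shows the kind of comparison the abstract describes for two estimators only, OLS versus a Cochrane-Orcutt style feasible GLS, on data with AR(1) errors driven by centred exponential innovations. It omits the robust estimators, and the sample size, coefficients, and autocorrelation are hypothetical rather than the paper's settings.

```python
# Illustrative sketch (not the paper's full simulation study): OLS vs feasible
# GLS for a three-covariate model with AR(1) errors and centred exponential
# innovations. All settings below are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n, rho, beta = 60, 0.7, np.array([1.0, 2.0, -1.0, 0.5])

X = np.column_stack([np.ones(n)] + [rng.normal(size=n) for _ in range(3)])
e = rng.exponential(1.0, size=n) - 1.0          # centred exponential innovations
u = np.zeros(n)
for t in range(1, n):
    u[t] = rho * u[t - 1] + e[t]
y = X @ beta + u

# Ordinary least squares
b_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# Feasible GLS: estimate rho from OLS residuals, then quasi-difference
resid = y - X @ b_ols
rho_hat = resid[1:] @ resid[:-1] / (resid[:-1] @ resid[:-1])
X_star = X[1:] - rho_hat * X[:-1]
y_star = y[1:] - rho_hat * y[:-1]
b_gls, *_ = np.linalg.lstsq(X_star, y_star, rcond=None)

print("OLS:", np.round(b_ols, 3), "MSE:", round(float(np.mean((b_ols - beta) ** 2)), 4))
print("GLS:", np.round(b_gls, 3), "MSE:", round(float(np.mean((b_gls - beta) ** 2)), 4))
```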
Publication Date: Sat Jan 01 2022
Journal Name: International Journal Of Agricultural And Statistical Sciences
ON ERROR DISTRIBUTION WITH SINGLE INDEX MODEL

In this paper, the error distribution function of the single index model is estimated by the empirical distribution function and by the kernel distribution function. The refined minimum average variance estimation (RMAVE) method is used to estimate the single index model. Simulation experiments are used to compare the two estimators of the error distribution function for different sample sizes; the results show that the kernel distribution function estimator is better than the empirical distribution function.
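
To illustrate the two estimators being compared (not the RMAVE fitting step itself), here is a minimal sketch of an empirical distribution function and a Gaussian-kernel smoothed distribution function evaluated on a residual sample; the residuals and the bandwidth rule used are hypothetical.

```python
# Illustrative sketch (not the RMAVE estimator): empirical CDF vs a Gaussian-
# kernel smoothed CDF, F_hat(x) = mean(Phi((x - e_i) / h)), on model residuals.
import numpy as np
from math import erf, sqrt

def ecdf(sample, x):
    return float(np.mean(sample <= x))

def kernel_cdf(sample, x, h):
    return float(np.mean([0.5 * (1.0 + erf((x - e) / (h * sqrt(2.0)))) for e in sample]))

rng = np.random.default_rng(1)
residuals = rng.normal(size=50)                          # hypothetical residual sample
h = 1.06 * residuals.std() * len(residuals) ** (-0.2)    # rule-of-thumb bandwidth

for x in (-1.0, 0.0, 1.0):
    print(x, round(ecdf(residuals, x), 3), round(kernel_cdf(residuals, x, h), 3))
```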

Publication Date: Thu Apr 18 2024
Journal Name: Geomatics And Environmental Engineering
Error Analysis of Stonex X300 Laser Scanner Close-range Measurements

This research reports an error analysis of close-range measurements from a Stonex X300 laser scanner in order to address range-uncertainty behavior, based on indoor experiments under fixed environmental conditions. The analysis includes procedures for estimating the precision and accuracy of the observational errors in the Stonex X300 observations, conducted at intervals of 5 m within a range of 5 to 30 m. The 3D point cloud data of the individual scans are analyzed with a roughness analysis prior to the implementation of a Levenberg–Marquardt iterative closest point (LM-ICP) registration. This leads to identifying the level of roughness encountered due to the range-finder's limitations in close…

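A minimal sketch of the per-distance summary such an analysis produces, computing bias (accuracy), spread (precision), and RMSE of repeated range readings against a known reference distance; the readings below are hypothetical, not the paper's Stonex X300 data.

```python
# Illustrative sketch (not the paper's full procedure): accuracy, precision,
# and RMSE of repeated range measurements at one known reference distance.
import numpy as np

reference = 10.000                                                   # known distance [m]
ranges = np.array([10.004, 9.997, 10.006, 10.002, 9.999, 10.005])   # hypothetical readings [m]

errors = ranges - reference
accuracy = errors.mean()          # systematic offset (bias)
precision = errors.std(ddof=1)    # spread of repeated measurements
rmse = np.sqrt(np.mean(errors ** 2))

print(f"bias = {accuracy:.4f} m, precision = {precision:.4f} m, RMSE = {rmse:.4f} m")
```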
Publication Date: Sun Oct 01 2023
Journal Name: Baghdad Science Journal
Numerical Investigation, Error Analysis and Application of Joint Quadrature Scheme in Physical Sciences

In this work, a joint quadrature for the numerical solution of double integrals is presented. The method is based on combining two rules of the same precision level to form a rule of higher precision. Numerical results of the present method with a lower precision level are presented and compared with those of the existing high-precision Gauss-Legendre five-point rule in two variables, which requires the same number of function evaluations. The efficiency of the proposed method is justified with numerical examples. From an application point of view, the determination of the center of gravity is given special consideration for the present scheme. Convergence analysis is demonstrated to validate the current method.
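
For context, here is a minimal sketch of the comparison baseline named in the abstract, the Gauss-Legendre five-point rule applied in both variables of a double integral; it is not the paper's joint quadrature rule, and the integrand is a simple example with a known exact value.

```python
# Illustrative sketch (not the paper's joint rule): five-point Gauss-Legendre
# quadrature applied in both variables of a double integral.
import numpy as np

def gauss_legendre_2d(f, ax, bx, ay, by, npts=5):
    """Approximate the double integral of f over [ax, bx] x [ay, by]."""
    x, w = np.polynomial.legendre.leggauss(npts)
    # Map nodes from [-1, 1] to the integration limits
    xm, xr = 0.5 * (bx + ax), 0.5 * (bx - ax)
    ym, yr = 0.5 * (by + ay), 0.5 * (by - ay)
    total = 0.0
    for xi, wi in zip(x, w):
        for yj, wj in zip(x, w):
            total += wi * wj * f(xm + xr * xi, ym + yr * yj)
    return xr * yr * total

# Example: integral of exp(x + y) over the unit square, exact value (e - 1)^2
approx = gauss_legendre_2d(lambda x, y: np.exp(x + y), 0.0, 1.0, 0.0, 1.0)
print(approx, (np.e - 1.0) ** 2)
```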

Publication Date: Mon Sep 01 2008
Journal Name: Al-khwarizmi Engineering Journal
Correcting Working Postures in Water Pump Assembly Tasks using the OVAKO Work Analysis System (OWAS)

The Ovako Working Posture Analysis System (OWAS) is a widely used method for studying awkward working postures in workplaces. This study used OWAS to analyze the working postures for manual material handling of laminations at the stacking workstation of a water pump assembly line in the Electrical Industrial Company (EICO) / Baghdad. A computer program, WinOWAS, was used for the study. In the real-life workstation it was found that more than 26% of the observed working postures were classified as either AC2 (slightly harmful) or AC3 (distinctly harmful). Postures that needed to be corrected soon (AC3) and the corresponding tasks were identified. The most stressful tasks observed were the grasping, handling, and positioning of the laminations by workers. The construct…

Publication Date: Sun Jul 12 2020
Journal Name: International Journal Of Research In Social Sciences And Humanities
THE USE OF PRODUCT LIFE CYCLE ASSESSMENT TECHNOLOGY TO ACHIEVE PRODUCT SUSTAINABILITY

This research aims to demonstrate the knowledge pillars of the product life cycle assessment technique and how to measure cost according to this technique, and to clarify its role in reducing costs, improving product quality, and optimizing the use of available resources. A set of results has been reached, the most important of which is that separating environmental costs through the use of the product life cycle assessment technique helps management in handling the increase of these costs, reducing the rates of environmental pollution, and preserving resources, which contributes to achieving the sustainability of the product. Based on the results obtained, a set of recommendations was presented, the most important of which…

Publication Date: Sun Jun 12 2011
Journal Name: Baghdad Science Journal
An algorithm for binary codebook design based on the average bitmap replacement error (ABPRE)

In this paper, an algorithm for binary codebook design is used within a vector quantization (VQ) technique to improve the acceptability of the absolute moment block truncation coding (AMBTC) method. VQ is used to compress the bitmap produced by the first stage (AMBTC). The binary codebook is generated for many images by randomly choosing code vectors from a set of binary image vectors, and this codebook is then used to compress all the bitmaps of these images. The selection of an image's bitmap to be compressed with this codebook is based on the criterion of the average bitmap replacement error (ABPRE). This approach is suitable for reducing bit rates…

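A minimal sketch of the two ingredients the abstract combines: AMBTC encoding of a single block, and replacement of its bitmap by the closest codebook entry scored by bitmap replacement error (Hamming distance). The block values and the tiny codebook are hypothetical, and this is not the paper's codebook-design algorithm.

```python
# Illustrative sketch (not the paper's algorithm): AMBTC encoding of one 4x4
# block, then bitmap replacement against a small binary codebook.
import numpy as np

block = np.array([[12, 14, 200, 210],
                  [10, 16, 190, 205],
                  [11, 13, 195, 198],
                  [ 9, 15, 188, 202]], dtype=float)

mean = block.mean()
bitmap = (block >= mean).astype(np.uint8)        # AMBTC bitmap
high = block[bitmap == 1].mean()                 # mean of "high" pixels
low = block[bitmap == 0].mean()                  # mean of "low" pixels

codebook = [np.zeros((4, 4), np.uint8),
            np.hstack([np.zeros((4, 2), np.uint8), np.ones((4, 2), np.uint8)]),
            np.vstack([np.zeros((2, 4), np.uint8), np.ones((2, 4), np.uint8)])]

# Bitmap replacement error: fraction of bits that change when the true bitmap
# is replaced by a codebook entry; pick the entry with the smallest error.
errors = [float(np.mean(bitmap != cw)) for cw in codebook]
best = int(np.argmin(errors))
print("low/high means:", round(low, 1), round(high, 1))
print("replacement errors:", [round(e, 3) for e in errors], "-> codeword", best)
```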
Publication Date: Sat Mar 31 2018
Journal Name: Iraqi Journal Of Chemical And Petroleum Engineering
A Comparison between the Product-Refill and the Equalization Oxygen Pressure Swing Adsorption Processes

This work presents a design for a pressure swing adsorption (PSA) process to separate oxygen from air at approximately 95% purity, suitable for different numbers of columns and arrangements. The product-refill PSA process was found to perform 33% better (in weight of zeolite required, or productivity) than the pressure-equalization process. The design is based on the adsorption equilibrium of a binary mixture of O2 and N2 for two of the most commonly used adsorbents, 5A and 13X, and on extension from a single-column approach. Zeolite 13X was found to perform 6% better than zeolite 5A. The most effective variables were determined to be the adsorption step time and the operational pressure. Increasing the adsorption step…

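As a hedged illustration of the kind of binary adsorption equilibrium the design rests on, here is a minimal sketch of the extended Langmuir mixture loading, q_i = qm_i * b_i * p_i / (1 + sum_j b_j * p_j). The isotherm parameters and feed conditions are hypothetical placeholders, not measured data for zeolite 5A or 13X, and this is not the paper's design calculation.

```python
# Illustrative sketch (hypothetical parameters, not the paper's data): extended
# Langmuir loadings for an O2/N2 mixture on a nitrogen-selective zeolite.
def extended_langmuir(p, qm, b):
    """p, qm, b: dicts keyed by component; returns loadings [mol/kg]."""
    denom = 1.0 + sum(b[k] * p[k] for k in p)
    return {k: qm[k] * b[k] * p[k] / denom for k in p}

p  = {"O2": 0.21 * 3.0, "N2": 0.79 * 3.0}   # partial pressures for a 3 bar air feed
qm = {"O2": 3.0, "N2": 3.5}                  # hypothetical saturation loadings [mol/kg]
b  = {"O2": 0.10, "N2": 0.35}                # hypothetical affinity constants [1/bar]

q = extended_langmuir(p, qm, b)
print({k: round(v, 3) for k, v in q.items()},
      "N2/O2 selectivity:", round((q["N2"] / q["O2"]) / (p["N2"] / p["O2"]), 2))
```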
Publication Date: Tue Dec 05 2023
Journal Name: Baghdad Science Journal
AlexNet Convolutional Neural Network Architecture with Cosine and Hamming Similarity/Distance Measures for Fingerprint Biometric Matching

In information security, fingerprint verification is one of the most common recent approaches for verifying human identity through a distinctive pattern. The verification process works by comparing a pair of fingerprint templates and identifying the similarity/matching between them. Several research studies have utilized different techniques for the matching process, such as fuzzy vault and image filtering approaches. Yet these approaches still suffer from imprecise articulation of the patterns of interest in the biometrics. Deep learning architectures such as the Convolutional Neural Network (CNN) have emerged and been extensively used for image processing and object detection tasks, showing outstanding performance compared…

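A minimal sketch of the two measures named in the title, cosine similarity and Hamming distance, applied to a pair of feature vectors such as a CNN might produce; the vectors and the binarization threshold are hypothetical, and this is not the paper's AlexNet matching pipeline.

```python
# Illustrative sketch (not the paper's AlexNet pipeline): cosine similarity and
# Hamming distance between two feature vectors.
import numpy as np

def cosine_similarity(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def hamming_distance(a, b, threshold=0.0):
    """Binarize the features at a threshold, then count differing bits."""
    return int(np.sum((a > threshold) != (b > threshold)))

rng = np.random.default_rng(7)
template = rng.normal(size=128)                 # enrolled fingerprint features
query = template + 0.1 * rng.normal(size=128)   # probe from the same finger

print("cosine:", round(cosine_similarity(template, query), 4),
      "hamming:", hamming_distance(template, query))
```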