Ensuring reliable data transmission in a Network on Chip (NoC) is one of the most challenging tasks, especially in noisy environments, since crosstalk, interference, and radiation effects have grown with manufacturers' tendency to shrink feature sizes, raise frequencies, and lower supply voltages. Many Error Control Codes (ECC) have therefore been proposed, with different error detection and correction capacities and various degrees of complexity. The Code with Crosstalk Avoidance and Error Correction (CCAEC) for network-on-chip interconnects uses simple parity check bits as its main technique to achieve high error correction capacity. According to this work, this coding scheme corrects up to 12 random errors, a high correction capacity compared with many other coding schemes, but at the cost of a large codeword size. In this work, the CCAEC code is compared with another well-known scheme, the Horizontal-Vertical-Diagonal (HVD) error detecting and correcting code, through a reliability analysis: a new, accurate mathematical model for the probability of residual error P_res is derived for both schemes and confirmed by simulation results. The results show that the HVD code corrects all single, double, and triple errors and fails on only 3.3% of quadruple-error patterns, whereas the CCAEC code corrects all single errors but fails in 1.5%, 7.2%, and 16.4% of double, triple, and quadruple errors, respectively. As a result, the HVD code offers better reliability than CCAEC with lower overhead, making it a promising coding scheme for handling reliability issues in NoCs.
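As an illustration of how such a residual-error model can be evaluated, the sketch below combines a binomial bit-error model with the per-multiplicity failure fractions quoted above. The codeword length and bit-error probability are placeholder values, not figures from the paper, and multiplicities beyond four errors are pessimistically treated as uncorrectable.

```python
# Sketch: residual-error probability from per-multiplicity failure fractions.
# Assumes independent bit errors with probability eps over an n-bit codeword;
# n and eps are placeholders, and failure fractions beyond 4 errors default to 1.
from math import comb

def residual_error_probability(n, eps, fail_fraction):
    """P_res = sum_k C(n,k) eps^k (1-eps)^(n-k) * f_k,
    where f_k is the fraction of k-error patterns the decoder fails to correct."""
    p_res = 0.0
    for k in range(1, n + 1):
        f_k = fail_fraction.get(k, 1.0)          # unknown multiplicities: assume uncorrectable
        p_k = comb(n, k) * eps**k * (1 - eps)**(n - k)
        p_res += f_k * p_k
    return p_res

# Failure fractions reported above (error multiplicity -> fraction of failing patterns)
hvd_fail = {1: 0.0, 2: 0.0, 3: 0.0, 4: 0.033}
ccaec_fail = {1: 0.0, 2: 0.015, 3: 0.072, 4: 0.164}

n, eps = 32, 1e-3    # hypothetical codeword length and bit-error probability
print("HVD  :", residual_error_probability(n, eps, hvd_fail))
print("CCAEC:", residual_error_probability(n, eps, ccaec_fail))
```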
Specialized hardware implementations of Artificial Neural Networks (ANNs) can offer faster execution than general-purpose microprocessors by taking advantage of reusable modules, parallel processing, and specialized computational components. Modern high-density Field Programmable Gate Arrays (FPGAs) offer the required flexibility and fast design-to-implementation time, with the possibility of exploiting highly parallel computations such as those required by ANNs in hardware. However, the bounded width of the data in FPGA-based ANNs adds an additional error to the network output. This paper derives equations for the additional error that arises from the bounded data width and proposes a method to reduce its effect.
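A minimal sketch of the effect being quantified, assuming a hypothetical fixed-point format and a single tanh neuron (neither is claimed to match the paper's configuration): quantizing weights and inputs to a bounded width perturbs the output relative to full precision.

```python
# Sketch: error introduced by bounded (fixed-point) data width in an ANN neuron.
# The Q-format (16 bits total, 8 fractional) and layer size are illustrative only.
import numpy as np

def quantize(x, frac_bits=8, total_bits=16):
    """Round to a signed fixed-point grid with `frac_bits` fractional bits."""
    scale = 2 ** frac_bits
    lo, hi = -(2 ** (total_bits - 1)), 2 ** (total_bits - 1) - 1
    return np.clip(np.round(x * scale), lo, hi) / scale

rng = np.random.default_rng(0)
w, x = rng.normal(size=16), rng.normal(size=16)   # weights and inputs of one neuron

y_exact = np.tanh(w @ x)                          # full-precision reference
y_fixed = np.tanh(quantize(w) @ quantize(x))      # bounded-width computation
print("additional output error:", abs(y_exact - y_fixed))
```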
This paper presents two main parts. The first part involves manufacturing composite-material specimens for mechanical testing (tensile, flexural, and fatigue tests), then designing a custom foot orthosis (CFO) and manufacturing it from a composite lamination (3 nylglass / 2 carbon fiber / 3 nylglass) for a patient suffering from flexible flat foot since birth and over-pronation. The second part involves designing a model of the custom foot orthosis in SolidWorks 2018 and then analyzing it in the engineering analysis program ANSYS V18.2. The applied pressure in the boundary conditions was adopted from a force-sensing resistor (FSR 402) placed at various regions of the foot after wearing the composite CFO.
This research reviews the least absolute deviations method, based on linear programming, for estimating the parameters of a simple linear regression model, and gives an overview of this model. The absolute-deviations approach is modeled using a proposed measure of dispersion, and a simple linear regression model is constructed based on the proposed measure. The objective of the work is to obtain estimates that are not affected by abnormal (outlying) values, using a numerical method with the lowest possible number of iterations.
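The least absolute deviations fit can be posed as a linear program, which is the kind of formulation referred to above; the sketch below is a generic version with illustrative data (including one abnormal value), not the paper's implementation.

```python
# Sketch: least absolute deviations (LAD) fit of y = a + b*x via linear programming.
# Variables are [a, b, u_1..u_n] with u_i >= |y_i - a - b*x_i|; minimize sum(u_i).
import numpy as np
from scipy.optimize import linprog

def lad_fit(x, y):
    n = len(x)
    c = np.concatenate([[0.0, 0.0], np.ones(n)])           # minimize sum of u_i
    # Constraints:  a + b*x_i - u_i <= y_i   and   -a - b*x_i - u_i <= -y_i
    A1 = np.column_stack([np.ones(n), x, -np.eye(n)])
    A2 = np.column_stack([-np.ones(n), -x, -np.eye(n)])
    A_ub = np.vstack([A1, A2])
    b_ub = np.concatenate([y, -y])
    bounds = [(None, None), (None, None)] + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[0], res.x[1]                               # intercept a, slope b

x = np.array([1, 2, 3, 4, 5, 6], dtype=float)
y = 2.0 + 0.5 * x + np.array([0.1, -0.2, 0.0, 0.1, 5.0, -0.1])  # one abnormal value
print(lad_fit(x, y))
```

Because the objective penalizes absolute rather than squared residuals, the single abnormal value barely moves the fitted line, which is the robustness property emphasized above.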
In this work, the performance of the receiver in a quantum cryptography system based on the BB84 protocol is evaluated by calculating the Quantum Bit Error Rate (QBER) of the receiver. To apply this performance test, an optical setup was arranged and a circuit was designed and implemented to calculate the QBER. This electronic circuit counts the number of events per second generated by the avalanche photodiodes in the receiver, and these counts are then used to calculate the QBER, which gives an indication of the receiver's performance. A minimum QBER of 6% was obtained with an avalanche photodiode excess voltage of 2 V and a laser diode power of 3.16 nW at an avalanche photodiode temperature of -10 °C.
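For context, the receiver QBER quoted above reduces to the ratio of erroneous sifted detections to total sifted detections; the sketch below uses illustrative counts, not the measured ones.

```python
# Sketch: QBER estimate from detector counts, assuming counts are accumulated
# per second for matching-basis (sifted) events; the numbers are illustrative only.
def qber(error_counts, sifted_counts):
    """Quantum Bit Error Rate = erroneous sifted bits / total sifted bits."""
    return error_counts / sifted_counts if sifted_counts else float("nan")

# Example: 60 erroneous detections out of 1000 sifted detections -> 6% QBER,
# comparable to the minimum value reported above.
print(f"QBER = {qber(60, 1000):.1%}")
```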
Horizontal wells have revolutionized hydrocarbon production by enhancing recovery efficiency and reducing environmental impact. This paper presents an enhanced Black Oil Model simulator, written in Visual Basic, for three-dimensional, two-phase (oil and water) flow through porous media. Unlike most existing tools, this simulator is customized for horizontal well modeling and calibrated using extensive historical data from the South Rumaila Oilfield, Iraq. The simulator first achieves a strong match with historical pressure data (1954–2004) using vertical wells, with an average deviation of less than 5% from observed pressures, and is then applied to forecast the performance of hypothetical horizontal wells (2008–2011).
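The history-matching criterion mentioned above amounts to an average-deviation check between simulated and observed pressures; the sketch below shows that computation with placeholder values rather than field data.

```python
# Sketch: average deviation between simulated and observed pressures,
# the matching metric implied above; pressure arrays are placeholders.
import numpy as np

p_obs = np.array([4200.0, 4150.0, 4080.0, 4010.0])   # observed pressures (illustrative)
p_sim = np.array([4185.0, 4160.0, 4095.0, 3990.0])   # simulated pressures (illustrative)

avg_dev = np.mean(np.abs(p_sim - p_obs) / p_obs) * 100
print(f"average deviation = {avg_dev:.2f}%")          # matching target: below 5%
```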
Cholelithiasis is one of the commonest surgical problems and one of the most common gastrointestinal diseases throughout the world, but its pathogenesis remains unclear. Many theories have been put forward to explain the mechanism of stone formation. It is not fully clear whether symptomatic gallstone disease is associated with a specific pattern of biochemical abnormalities, such as the lipid profile and fasting blood sugar in the serum of patients.
This study was designed to estimate lipid profile and fasting blood sugar in the sera of patients with cholelithiasis in comparison with normal individuals (control).
In this study, 104 subjects (16 males, 88 females) were symptomatic gallstone patients (aged 42.79 ± 12.18 years), and 38 (6 males, …) were normal individuals serving as controls.
The interplay of predation, competition between species, and harvesting is one of the most critical aspects of the environment. This paper explores the dynamics of the interactions of four species. The system includes two competing prey and two predators; the first prey is preyed upon by the first predator, for which it represents an additional food source. The second prey is not exposed to predation but is instead subject to harvesting. The possible equilibria are found, and conditions for their local and global stability are derived. To corroborate our findings, we constructed time series to illustrate the existence and stability of the equilibria numerically by varying the parameter values.
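As a rough sketch of how such numerical stability checks are produced, the code below integrates a hypothetical four-species system sharing the stated features (two competing prey, predation on the first prey, harvesting of the second). The second predator's food source and all coefficients are placeholder assumptions, not the paper's model.

```python
# Sketch: illustrating equilibrium stability numerically with time series.
# The second predator's linkage is NOT specified above; it is given a placeholder
# food chain (it feeds on the first predator) purely for illustration.
import numpy as np
from scipy.integrate import solve_ivp

def food_web(t, u, a12=0.4, a21=0.5, p=0.8, c1=0.5, d1=0.2, c2=0.4, d2=0.1, h=0.3):
    x1, x2, y1, y2 = u
    dx1 = x1 * (1 - x1 - a12 * x2) - p * x1 * y1      # prey 1: competition + predation
    dx2 = x2 * (1 - x2 - a21 * x1) - h * x2           # prey 2: competition + harvesting
    dy1 = y1 * (-d1 + c1 * x1) - c2 * y1 * y2         # predator 1 feeds on prey 1
    dy2 = y2 * (-d2 + 0.5 * c2 * y1)                  # placeholder: predator 2 feeds on predator 1
    return [dx1, dx2, dy1, dy2]

sol = solve_ivp(food_web, (0, 500), [0.5, 0.5, 0.3, 0.2])
print("state near t = 500:", sol.y[:, -1])            # settling values hint at a stable equilibrium
```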
Adverse drug reaction (ADR) mentions are important information for verifying a patient's view of a particular drug. Regular user comments and reviews were considered during the data collection process to extract ADR mentions, i.e., cases in which a user reported a side effect after taking a specific medication. In the literature, most researchers have focused on machine learning techniques to detect ADRs; these methods train a classification model using annotated medical review data. Yet many challenging issues still face ADR extraction, especially the accuracy of detection. The main aim of this study is to propose Latent Semantic Analysis (LSA) with Artificial Neural Network (ANN) classifiers for ADR detection. The findings show the effectiveness of utilizing LSA with ANN in extracting ADR mentions.
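A minimal sketch of this kind of pipeline, built from scikit-learn stand-ins (TruncatedSVD for the LSA step, MLPClassifier for the ANN) and a toy set of labelled reviews rather than the study's dataset:

```python
# Sketch: LSA features feeding an ANN classifier for ADR detection.
# The tiny labelled reviews below are illustrative, not the study's data.
from sklearn.pipeline import make_pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.neural_network import MLPClassifier

reviews = [
    "this medication gave me severe headaches and nausea",   # ADR mention
    "started the drug last week and felt dizzy afterwards",  # ADR mention
    "the tablets are easy to swallow and work well",         # no ADR
    "no problems so far, my doctor adjusted the dose",       # no ADR
]
labels = [1, 1, 0, 0]

model = make_pipeline(
    TfidfVectorizer(stop_words="english"),
    TruncatedSVD(n_components=2, random_state=0),            # LSA: low-rank semantic space
    MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0),
)
model.fit(reviews, labels)
print(model.predict(["I had terrible nausea after taking this pill"]))
```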
This paper presents a comparative study of different oil production enhancement scenarios in the Saadi tight oil reservoir, located in the Halfaya Iraqi oil field. The reservoir exhibits poor petrophysical characteristics, including medium pore size, low permeability (approaching zero in some areas), and high porosity of up to 25%. Previous stimulation techniques such as acid fracturing and matrix acidizing have yielded low oil production in this reservoir. Therefore, the feasibility of hydraulic fracturing stimulation and/or horizontal well drilling scenarios was assessed to increase the production rate. While horizontal drilling and hydraulic fracturing can improve well performance, they come at high cost.
In general, the importance of cluster analysis is that it allows elements to be evaluated by grouping multiple homogeneous data; the main objective of this analysis is to partition the elements into homogeneous groups according to many variables. This method of analysis is used to reduce data, to generate and test hypotheses, and to predict and match models. The research aims to evaluate fuzzy cluster analysis, a special case of cluster analysis, and to compare the two methods: classical and fuzzy cluster analysis. The study was applied to government and private hospitals; the sample comprised 288 patients being treated in 10 hospitals.
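A compact sketch of the fuzzy c-means update at the core of fuzzy cluster analysis, run on synthetic two-dimensional data; the hospital patient variables themselves are not reproduced here.

```python
# Sketch: minimal fuzzy c-means (FCM) on synthetic 2-D data. Patient records
# would replace the toy features; c, m, and the data are illustrative only.
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)                 # random fuzzy memberships
    for _ in range(iters):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]  # weighted cluster centres
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1))
        U = inv / inv.sum(axis=1, keepdims=True)      # standard FCM membership update
    return centers, U

rng = np.random.default_rng(3)
X = np.vstack([rng.normal(0.0, 0.5, (20, 2)), rng.normal(3.0, 0.5, (20, 2))])
centers, U = fuzzy_c_means(X)
print(centers)                 # two cluster centres
print(U[:3].round(2))          # graded memberships instead of hard labels
```

Unlike classical (hard) clustering, each patient record would receive a degree of membership in every cluster, which is the feature being evaluated in the comparison above.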