The undetected error probability is an important measure for assessing the communication reliability provided by any error coding scheme. Two error coding schemes, namely Joint crosstalk avoidance and Triple Error Correction (JTEC) and JTEC with Simultaneous Quadruple Error Detection (JTEC-SQED), provide both crosstalk reduction and multi-bit error correction/detection. The available undetected error probability model yields only an upper bound, which does not give an accurate estimate of the reliability provided. This paper presents an improved mathematical model to estimate the undetected error probability of these two joint coding schemes. According to the decoding algorithm, the errors are classified into patterns and their decoding results are checked for failures. The probabilities of the failing patterns are used to build the new models. The improved models have less than 1% error.
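As a rough illustration of this pattern-based accounting (not the paper's model), the sketch below sums the occurrence probabilities of hypothetical failing error patterns on a binary symmetric channel; the codeword length, bit-error rate, and pattern counts are assumed values for illustration only.

# Minimal sketch (not the JTEC/JTEC-SQED model): estimating undetected error
# probability on a binary symmetric channel with bit-error rate p, given a
# hypothetical table of how many weight-w error patterns make the decoder fail.

def undetected_error_probability(n, p, failing_pattern_counts):
    """n: codeword length in bits.
    p: bit-error probability of the channel.
    failing_pattern_counts: dict mapping error weight w -> number of
        weight-w patterns whose decoding fails silently (assumed known
        from the decoder's error classification)."""
    total = 0.0
    for w, count in failing_pattern_counts.items():
        # each weight-w pattern occurs with probability p^w * (1-p)^(n-w)
        total += count * (p ** w) * ((1.0 - p) ** (n - w))
    return total

# illustrative numbers only, not taken from the papers
print(undetected_error_probability(n=32, p=1e-3,
                                   failing_pattern_counts={4: 120, 5: 980}))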
In this work, the effect of choosing a tri-circular tube section was addressed to minimize the end effector's error. A comparison was made between the tri-tube section and the traditional square cross section for a robot arm. The study shows that, for the same weight of the square section and the tri-tube section, the error may be reduced by about 33%.
A program was built using MathCAD software to calculate the minimum weight of a square-section robot arm that could withstand a given payload with minimum deflection. The second part of the program performs an optimization of the cross-section dimensions and gives the dimensions of the tri-circular tube cross section that have the same weight of
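The kind of equal-weight stiffness comparison described above can be illustrated with a short sketch; the geometry (three tangent tubes in an equilateral arrangement), material constants, and all dimensions below are assumptions for illustration, not values from the study.

# Rough comparison sketch (assumed geometry, not the paper's exact model):
# tip deflection of a cantilever robot arm with (a) a hollow square section and
# (b) three circular tubes of the same total cross-sectional area (same weight),
# assumed tangent to each other in an equilateral arrangement.
import math

E = 70e9      # Young's modulus, Pa (assumed aluminium)
L = 1.0       # arm length, m (assumed)
F = 100.0     # tip payload, N (assumed)
t = 0.002     # wall thickness, m (assumed, same for both sections)

# (a) hollow square section, outer side a
a = 0.05
A_sq = a**2 - (a - 2*t)**2
I_sq = (a**4 - (a - 2*t)**4) / 12.0

# (b) three tubes with the same total area: 3*pi*t*(D - t) = A_sq
D = A_sq / (3 * math.pi * t) + t
A_tube = math.pi / 4 * (D**2 - (D - 2*t)**2)
I_tube = math.pi / 64 * (D**4 - (D - 2*t)**4)
r = D / math.sqrt(3)                      # centroid-to-tube-centre distance
I_tri = 3 * I_tube + A_tube * 1.5 * r**2  # parallel-axis theorem

def tip_deflection(I):
    # cantilever with a concentrated end load F
    return F * L**3 / (3 * E * I)

print("square section deflection :", tip_deflection(I_sq))
print("tri-tube section deflection:", tip_deflection(I_tri))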
In this paper, we focus on one of the recent applications of PU-algebras in coding theory, namely the construction of codes by soft set PU-valued functions. First, we introduce the notion of soft set PU-valued functions on a PU-algebra and investigate some of their related properties. Moreover, the codes generated by a soft set PU-valued function are constructed and several examples are given. Furthermore, an example with graphs of a binary block code constructed from a soft set PU-valued function is presented.
Ischemic stroke is a significant cause of morbidity and mortality worldwide. Autophagy, a process of intracellular degradation, has been shown to play a crucial role in the pathogenesis of ischemic stroke. Long non-coding RNAs (lncRNAs) have emerged as essential regulators of autophagy in various diseases, including ischemic stroke. Recent studies have identified several lncRNAs that modulate autophagy in ischemic stroke, including MALAT1, MIAT, SNHG12, H19, AC136007.2, C2dat2, MEG3, KCNQ1OT1, SNHG3, and RMRP. These lncRNAs regulate autophagy by interacting with key proteins involved in the autophagic process, such as Beclin-1, ATG7, and LC3. Understanding the role of lncRNAs in regulating autophagy
The alternating direction implicit (ADI) method is a classical numerical method that was first introduced to solve the heat equation in two or more spatial dimensions and can also be used to solve parabolic and elliptic partial differential equations. In this paper, we introduce an improvement to the ADI method that yields a scheme equivalent to the Crank–Nicolson difference scheme in two dimensions while retaining the main feature of the ADI method. The new scheme can be solved by a similar ADI algorithm with some modifications. A numerical example is provided to support the theoretical results.
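For context, a minimal sketch of the classical Peaceman–Rachford ADI scheme for the 2-D heat equation is given below; it is the standard method the paper builds on, not the modified scheme it proposes, and the grid size, time step, and initial condition are arbitrary illustrative choices.

# Classical Peaceman-Rachford ADI for u_t = alpha*(u_xx + u_yy) with zero
# Dirichlet boundaries; each time step is split into two tridiagonal solves,
# first implicit in x and explicit in y, then implicit in y and explicit in x.
import numpy as np

def adi_step(u, alpha, dt, h):
    """Advance the interior of u by one time step dt on a uniform grid of spacing h."""
    n = u.shape[0]
    r = alpha * dt / (2 * h * h)
    m = n - 2                                   # interior nodes per line
    # tridiagonal operator (I - r*delta^2): (1+2r) on the diagonal, -r off it
    A = (np.diag((1 + 2*r) * np.ones(m))
         + np.diag(-r * np.ones(m - 1), 1)
         + np.diag(-r * np.ones(m - 1), -1))
    half = u.copy()
    # half step: implicit in x, explicit in y (one solve per grid column)
    for j in range(1, n - 1):
        rhs = u[1:-1, j] + r * (u[1:-1, j+1] - 2*u[1:-1, j] + u[1:-1, j-1])
        half[1:-1, j] = np.linalg.solve(A, rhs)
    new = half.copy()
    # half step: implicit in y, explicit in x (one solve per grid row)
    for i in range(1, n - 1):
        rhs = half[i, 1:-1] + r * (half[i+1, 1:-1] - 2*half[i, 1:-1] + half[i-1, 1:-1])
        new[i, 1:-1] = np.linalg.solve(A, rhs)
    return new

# usage: a hot square in the middle of a 21x21 grid diffusing outwards
n, h, dt, alpha = 21, 1.0 / 20, 1e-3, 1.0
u = np.zeros((n, n))
u[8:13, 8:13] = 1.0
for _ in range(50):
    u = adi_step(u, alpha, dt, h)
print(round(u[10, 10], 4))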
This paper deals with constructing a mixed probability distribution from mixing exponential
In this paper, a method is proposed to increase the compression ratio for color images by dividing the image into non-overlapping blocks and applying a different compression ratio to each block depending on the importance of the information it contains. In regions that contain important information, the compression ratio is reduced to prevent loss of that information, while in smooth regions that contain no important information, a high compression ratio is used. The proposed method shows better results when compared with classical methods (wavelet and DCT).
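A minimal sketch of this kind of block-adaptive scheme is shown below; it uses a grayscale image, 8x8 blocks, block variance as the importance measure, and DCT coefficient truncation, all of which are simplifying assumptions rather than the paper's actual algorithm.

# Illustrative block-adaptive compression: detailed (high-variance) blocks keep
# more DCT coefficients, smooth blocks keep fewer and are compressed harder.
import numpy as np
from scipy.fft import dctn, idctn

def compress_block_adaptive(img, block=8, var_threshold=100.0):
    """img: 2-D grayscale array whose shape is divisible by `block`."""
    out = np.empty_like(img, dtype=float)
    h, w = img.shape
    for y in range(0, h, block):
        for x in range(0, w, block):
            tile = img[y:y+block, x:x+block].astype(float)
            coeffs = dctn(tile, norm='ortho')
            # important block: keep a 6x6 corner of low-frequency coefficients;
            # smooth block: keep only a 2x2 corner (higher compression ratio)
            keep = 6 if tile.var() > var_threshold else 2
            mask = np.zeros_like(coeffs)
            mask[:keep, :keep] = 1.0
            out[y:y+block, x:x+block] = idctn(coeffs * mask, norm='ortho')
    return out

# usage on a synthetic 64x64 image: detailed centre, smooth background
rng = np.random.default_rng(0)
img = np.full((64, 64), 128.0)
img[24:40, 24:40] += rng.normal(0, 40, (16, 16))   # "important" detailed region
reconstructed = compress_block_adaptive(img)
print(float(np.abs(img - reconstructed).mean()))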
Bivariate time series modeling and forecasting have become a promising field of applied studies in recent times. For this purpose, the linear Autoregressive Moving Average with exogenous variables (ARMAX) model has been the most widely used technique over the past few years for modeling and forecasting this type of data. The most important assumptions of this model are linearity and homogeneity of the random error variance of the fitted model. In practice, these two assumptions are often violated, so the Autoregressive Conditional Heteroscedasticity (ARCH) and Generalized ARCH (GARCH) models with exogenous variables
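The practical point about heteroscedastic errors can be illustrated with a toy simulation; the sketch below generates an ARMAX(1,1) series whose errors follow a GARCH(1,1) process, with all coefficients being assumed illustrative values rather than estimates from any data set.

# Toy simulation of the model family described above: an ARMAX(1,1) mean
# equation whose errors follow a GARCH(1,1) variance equation, showing why the
# constant-variance assumption of plain ARMAX can be violated in practice.
import numpy as np

rng = np.random.default_rng(1)
n = 500
x = rng.normal(size=n)                            # exogenous regressor (assumed)

# assumed parameters for illustration only
c, phi, beta, theta = 0.2, 0.6, 0.5, 0.3          # ARMAX(1,1) coefficients
omega, alpha, gamma = 0.1, 0.15, 0.8              # GARCH(1,1), alpha + gamma < 1

y = np.zeros(n)
eps = np.zeros(n)
sigma2 = np.full(n, omega / (1 - alpha - gamma))  # start at unconditional variance
for t in range(1, n):
    sigma2[t] = omega + alpha * eps[t-1]**2 + gamma * sigma2[t-1]
    eps[t] = np.sqrt(sigma2[t]) * rng.normal()
    y[t] = c + phi * y[t-1] + beta * x[t] + eps[t] + theta * eps[t-1]

# heteroscedasticity shows up as volatility clustering in the residuals
print("residual variance, first half :", eps[:n//2].var().round(3))
print("residual variance, second half:", eps[n//2:].var().round(3))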