Achieving reliable operation under deep-submicrometer noise sources, including crosstalk noise at low-voltage operation, is a major challenge for network-on-chip links. In this paper, we propose a coding scheme that simultaneously mitigates crosstalk effects on signal delay and detects up to seven random errors through wire duplication and simple parity checks computed over the rows and columns of the two-dimensional data. This high error-detection capability enables the operating voltage on the wires to be reduced, leading to energy savings. The results show that the proposed scheme reduces energy consumption by up to 53% compared with other schemes at iso-reliability, despite the increase in the number of overhead wires. In addition, it imposes only a small penalty on network performance, measured by the average latency, and incurs a codec area overhead comparable to other schemes.
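To illustrate the idea, the sketch below arranges a data word into a two-dimensional array, duplicates each data bit (modeling wire duplication), and appends row and column parities. The array shape, helper names, and detection logic are illustrative assumptions, not the paper's exact encoder.

```python
import numpy as np

def encode_2d_parity(bits, rows, cols):
    """Hedged sketch: arrange `bits` into a rows x cols array, duplicate each
    bit (wire duplication), and append per-row and per-column parity bits."""
    data = np.array(bits, dtype=np.uint8).reshape(rows, cols)
    row_parity = data.sum(axis=1) % 2          # one parity bit per row
    col_parity = data.sum(axis=0) % 2          # one parity bit per column
    duplicated = np.repeat(data, 2, axis=1)    # each data bit driven on two wires
    return duplicated, row_parity, col_parity

def check_2d_parity(duplicated, row_parity, col_parity):
    """Detect errors by comparing the duplicated wires and re-checking parities."""
    a, b = duplicated[:, 0::2], duplicated[:, 1::2]
    dup_mismatch = np.any(a != b)
    data = a                                    # take one copy for parity re-check
    parity_fail = (np.any(data.sum(axis=1) % 2 != row_parity) or
                   np.any(data.sum(axis=0) % 2 != col_parity))
    return bool(dup_mismatch or parity_fail)

# Example: 16 data bits arranged as a 4x4 block
word = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1, 0, 1, 1, 0]
dup, rp, cp = encode_2d_parity(word, rows=4, cols=4)
dup[2, 5] ^= 1                                  # inject a single-wire error
print("error detected:", check_2d_parity(dup, rp, cp))
```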
The matching principle requires matching revenues to expenditures by linking efforts to achievements, together with disclosure sufficient to reflect the results of activity. When an effort is expected to generate future benefits, it appears as an asset in the balance sheet, alongside the rest of the accounting unit's assets, to reflect the strength of its financial position. In the absence of future benefits, the effort is charged to the result accounts that reflect the outcome of activity during a specific period, whether a month or a fiscal year.
The researcher reached the following conclusions:
1- It is difficult to control cash inflows and outflows as a result of the multiplicity of funding sources.
2- wea
Most companies use social media data for business. Sentiment analysis automatically gathers, analyses, and summarizes this type of data. Managing unstructured social media data is difficult, and noisy data poses a challenge to sentiment analysis. Since over 50% of the sentiment analysis process is data pre-processing, processing big social media data is also challenging. If pre-processing is carried out correctly, data accuracy may improve. Moreover, the sentiment analysis workflow is highly dependent on pre-processing. Because no pre-processing technique works well in all situations or with all data sources, choosing the most important ones is crucial, and prioritization is an excellent technique for doing so. As one of many Multi-Criteria Decision Mak
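As a concrete illustration of the pre-processing step discussed above, the following sketch applies a few common cleaning operations (lowercasing, URL and mention removal, stop-word filtering) to a raw post. The specific steps and the small stop-word list are assumptions chosen for illustration, not the techniques ranked in the paper.

```python
import re

# Minimal illustrative stop-word list; a real pipeline would use a full list.
STOP_WORDS = {"the", "a", "an", "is", "are", "and", "or", "to", "of"}

def preprocess(post):
    """Hedged sketch of typical social-media text pre-processing."""
    text = post.lower()                                  # normalize case
    text = re.sub(r"https?://\S+", " ", text)            # strip URLs
    text = re.sub(r"[@#]\w+", " ", text)                 # strip mentions/hashtags
    text = re.sub(r"[^a-z\s]", " ", text)                # drop punctuation/digits
    return [t for t in text.split() if t not in STOP_WORDS]

print(preprocess("Loving the new phone!! Battery is GREAT :) https://example.com @brand"))
# -> ['loving', 'new', 'phone', 'battery', 'great']
```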
This research reports an error analysis of close-range measurements from a Stonex X300 laser scanner, addressing range-uncertainty behavior through indoor experiments under fixed environmental conditions. The analysis includes procedures for estimating the precision and accuracy of the observational errors in the Stonex X300 observations, conducted at 5 m intervals over a range of 5 to 30 m. The 3D point-cloud data of the individual scans is subjected to a roughness analysis prior to a Levenberg–Marquardt iterative closest point (LM-ICP) registration. This leads to identifying the level of roughness encountered due to the range-finder's limitations at close range.
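The sketch below shows one conventional way to estimate precision (standard deviation of repeated range readings) and accuracy (mean offset from a reference distance) at each test station. The data layout and reference values are assumptions for illustration, not the paper's measurements.

```python
import numpy as np

def range_error_stats(observations, reference):
    """Hedged sketch: precision and accuracy of repeated range observations.

    observations : repeated range readings at one station (metres)
    reference    : known reference distance for that station (metres)
    """
    obs = np.asarray(observations, dtype=float)
    precision = obs.std(ddof=1)            # repeatability of the readings
    accuracy = obs.mean() - reference      # systematic offset from the reference
    return precision, accuracy

# Illustrative readings at the 10 m station (values are made up).
readings_10m = [10.004, 10.006, 10.003, 10.007, 10.005]
prec, acc = range_error_stats(readings_10m, reference=10.000)
print(f"precision = {prec*1000:.2f} mm, accuracy = {acc*1000:.2f} mm")
```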
The purpose of the present work is to calculate the expectation value of the potential energy for different spin states (??? ? ???, ??? ? ???) and compare it with the spin states (???, ???) for the lithium excited state (1s2s3s) and Li-like ions (Be+, B+2), using a Hartree-Fock wave function with the partitioning technique. The inter-particle expectation values show linear behaviour with atomic number, and for each atom and ion they follow the trend ??? < ??? < ??? < ???.
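For reference, the inter-particle expectation values and the potential-energy expectation value discussed here are conventionally written as follows (in atomic units); this is the standard textbook form, not necessarily the exact partitioning expression used in the paper.

```latex
\[
  \langle r_{ij}^{\,n} \rangle = \int_{0}^{\infty} r_{ij}^{\,n}\, D(r_{ij})\, dr_{ij},
  \qquad
  \langle V \rangle = -\,Z \sum_{i} \left\langle \frac{1}{r_i} \right\rangle
  + \sum_{i<j} \left\langle \frac{1}{r_{ij}} \right\rangle ,
\]
```

where D(r_ij) is the inter-particle distribution function and Z is the nuclear charge.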
The energy expectation values for Li and Li-like ions ( , and ) have been calculated and examined within the ground state and the excited state in position space. The partitioning technique of Hartree-Fock (H-F) has been used for existing wave functions.
Exchanging information over communication channels can be unsafe. Since communication media are not secure for sending sensitive information, it is necessary to protect information from disclosure to unauthorized persons. This research presents a method for information security in which information is hidden in a cover image using the least significant bit (LSB) technique, after a text file is first encrypted using a secret sharing scheme. The hiding positions in the cover image are then generated in a random manner, which makes the hidden data difficult to detect by image or statistical analysis. The method therefore provides two levels of information security: encryption of the text file with the secret sharing scheme, and random hiding of the encrypted data in the cover image.
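A minimal sketch of the hiding step is shown below, assuming a grayscale cover image held as a NumPy array and a shared integer key used to seed the position generator; the key handling and the secret-sharing encryption step are outside this sketch, and the function names are illustrative.

```python
import numpy as np

def hide_bits_lsb(cover, payload_bits, key):
    """Hedged sketch: embed payload bits in randomly chosen pixel LSBs.

    cover        : 2-D uint8 array (grayscale cover image)
    payload_bits : list of 0/1 values (already-encrypted message bits)
    key          : shared integer seed that determines the hiding positions
    """
    stego = cover.copy().ravel()
    rng = np.random.default_rng(key)                    # key-driven position generator
    positions = rng.permutation(stego.size)[:len(payload_bits)]
    for pos, bit in zip(positions, payload_bits):
        stego[pos] = (stego[pos] & 0xFE) | bit          # overwrite the LSB only
    return stego.reshape(cover.shape)

def extract_bits_lsb(stego, n_bits, key):
    """Recover the bits using the same key (and hence the same positions)."""
    flat = stego.ravel()
    rng = np.random.default_rng(key)
    positions = rng.permutation(flat.size)[:n_bits]
    return [int(flat[pos] & 1) for pos in positions]

# Round-trip check on a random cover image.
cover = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)
bits = [1, 0, 1, 1, 0, 0, 1, 0]
stego = hide_bits_lsb(cover, bits, key=1234)
assert extract_bits_lsb(stego, len(bits), key=1234) == bits
```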
Several problems must be solved in image compression to make the process practical and more efficient. Much work has been done in the field of lossy image compression based on the wavelet and Discrete Cosine Transform (DCT). In this paper, an efficient image compression scheme is proposed, based on a combined transform-coding scheme. It consists of the following steps: 1) a bi-orthogonal (tap 9/7) wavelet transform to split the image data into sub-bands, 2) a DCT to de-correlate the data, 3) scalar quantization of the combined transform output, with the quantized values mapped to positive integers, and 4) LZW encoding to produce the compressed data. The peak signal-to-noise ratio (PSNR), compression ratio (CR), and compression gain (CG) measures were used to evaluate the proposed scheme.
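The first three steps can be sketched as follows, assuming pywt's 'bior4.4' filters as a stand-in for the 9/7 wavelet and a simple uniform quantizer; the LZW stage and the exact quantizer of the paper are not reproduced here.

```python
import numpy as np
import pywt
from scipy.fftpack import dct

def transform_and_quantize(image, q_step=8.0):
    """Hedged sketch of steps 1-3: wavelet split, DCT de-correlation,
    scalar quantization, and mapping to non-negative integers."""
    # 1) one-level bi-orthogonal wavelet transform (bior4.4 ~ 9/7 filters)
    LL, (LH, HL, HH) = pywt.dwt2(image.astype(float), 'bior4.4')

    # 2) 2-D DCT applied to each sub-band to further de-correlate the data
    def dct2(band):
        return dct(dct(band, axis=0, norm='ortho'), axis=1, norm='ortho')
    bands = [dct2(b) for b in (LL, LH, HL, HH)]

    # 3) uniform scalar quantization, then map signed levels to non-negative
    #    codes (even for >= 0, odd for < 0) ready for the entropy coder
    quantized = []
    for b in bands:
        q = np.round(b / q_step).astype(np.int64)
        quantized.append(np.where(q >= 0, 2 * q, -2 * q - 1))
    return quantized

# Example on a random test image; step 4 (LZW) would consume `coeff_blocks`.
img = np.random.randint(0, 256, size=(128, 128))
coeff_blocks = transform_and_quantize(img)
print([blk.shape for blk in coeff_blocks])
```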
We explore the transform coefficients of fractal coding and develop a new method to improve the compression capability of these schemes. In most standard encoder/decoder systems, quantization and de-quantization are managed as separate steps; here we introduce a new method in which they are handled simultaneously with the transform coding. This method achieves additional compression while maintaining high image quality, as shown later.
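To make the distinction concrete, the sketch below contrasts a conventional separate quantize-then-encode pass with a combined pass that quantizes each coefficient at the moment it is encoded. The coefficient source and the simple uniform quantizer are illustrative assumptions, not the paper's fractal coder.

```python
import numpy as np

def separate_pipeline(coeffs, q_step):
    """Conventional flow: quantize everything first, then encode the levels."""
    levels = np.round(coeffs / q_step).astype(int)    # separate quantization pass
    return [int(v) for v in levels.ravel()]           # stand-in for the encoder

def combined_pipeline(coeffs, q_step):
    """Combined flow: each coefficient is quantized as it is encoded,
    so no intermediate level array is materialized."""
    encoded = []
    for c in coeffs.ravel():
        encoded.append(int(np.round(c / q_step)))     # quantize-and-encode together
    return encoded

coeffs = np.random.randn(4, 4) * 10.0
assert separate_pipeline(coeffs, 2.0) == combined_pipeline(coeffs, 2.0)
```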