Achieving reliable operation under deep-submicrometer noise sources, including crosstalk noise at low-voltage operation, is a major challenge for network-on-chip links. In this paper, we propose a coding scheme that simultaneously addresses crosstalk effects on signal delay and detects up to seven random errors through wire duplication and simple parity checks calculated over the rows and columns of the two-dimensional data. This high error-detection capability allows the operating voltage on the wire to be lowered, saving energy. The results show that the proposed scheme reduces energy consumption by up to 53% compared to other schemes at iso-reliability, despite the increase in the number of overhead wires. In addition, it imposes only a small penalty on network performance, measured by average latency, and its codec area overhead is comparable to that of other schemes.
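The row-and-column parity idea can be illustrated with a minimal sketch. This is a simplification, not the paper's scheme: it shows plain two-dimensional parity detection only, omitting the wire-duplication component, and all names are illustrative:

```python
import numpy as np

def encode_2d_parity(bits):
    """Compute one parity bit per row and per column of a 2D bit array."""
    row_parity = bits.sum(axis=1) % 2
    col_parity = bits.sum(axis=0) % 2
    return row_parity, col_parity

def detect_errors(bits, row_parity, col_parity):
    """Recompute parities at the receiver; any mismatch flags an error."""
    r, c = encode_2d_parity(bits)
    return not (np.array_equal(r, row_parity) and np.array_equal(c, col_parity))

data = np.array([[1, 0, 1, 1],
                 [0, 1, 1, 0],
                 [1, 1, 0, 0]])
rp, cp = encode_2d_parity(data)

corrupted = data.copy()
corrupted[1, 2] ^= 1                     # flip one bit "in transit"
print(detect_errors(data, rp, cp))       # False: parities match
print(detect_errors(corrupted, rp, cp))  # True: error detected
```

A single bit flip changes exactly one row parity and one column parity, which is what lets 2D parity localize as well as detect simple errors; the paper's combination with duplicated wires is what pushes detection up to seven random errors.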
The simulation study has been conducted for the harmonics of the Nd:YAG laser, namely second-harmonic generation (SHG), third-harmonic generation (THG), and fourth-harmonic generation (FHG). Determining the beam expander's expansion ratio for a specific wavelength and a given detection range is the key step in beam expander design, as it sets the minimum laser spot size at the target. Knowing the optimum expansion ratio reduces the receiving unit's dimensions and increases its efficiency. Simulation of the above-mentioned parameters is conducted for the two types of refractive beam expander, Keplerian and Galilean. Ideal refractive indices for the lenses are chosen to suit the Nd:YAG laser harmonic wavelengths, so that increasing transm
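Why an optimum expansion ratio exists can be shown with a back-of-the-envelope calculation. This is a standard small-angle approximation with made-up example numbers, not the paper's simulation: an M× expander scales the exit diameter up by M and the divergence down by 1/M, so the far-field spot first shrinks and then grows with M:

```python
import math

def spot_diameter(d_in_mm, div_in_mrad, M, range_m):
    """Approximate far-field spot diameter (mm) after an M-times beam expander.

    The expander multiplies the beam diameter by M and divides the
    divergence by M; at range R the spot is roughly exit diameter plus
    divergence growth (small-angle, and mrad * m = mm).
    """
    d_out = d_in_mm * M            # expanded exit diameter, mm
    div_out = div_in_mrad / M      # reduced full divergence, mrad
    return d_out + div_out * range_m

# hypothetical numbers: 6 mm beam, 1 mrad divergence, 1 km range
for M in (1, 5, 10, 20):
    print(M, round(spot_diameter(6.0, 1.0, M, 1000.0), 1))

# minimizing d_in*M + div*R/M gives M_opt = sqrt(div*R / d_in)
print(round(math.sqrt(1.0 * 1000.0 / 6.0), 1))   # about 12.9
```

The trade-off between exit aperture and residual divergence is exactly why the expansion ratio must be matched to the detection range, as the abstract states.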
The purpose of this study is to evaluate the hydraulic performance and efficiency of direction diverting blocks (DDBs), fixed on the surface of an Ogee spillway, in reducing the acceleration and dissipating the energy of the incoming supercritical flow. Fifteen DDB models, made from wood in a triangular shape and in different sizes, were used. Tests of the pressure distribution at the DDB boundaries were carried out to ensure that no negative pressures develop that could cause cavitation. In these tests, thirty-six runs were accomplished using six types of blocks of the same size but differing in apex angle. The results of these tests showed no negative pressures developed at the boundaries
The holmium plasma induced by a 1064-nm Q-switched Nd:YAG laser in air was investigated, both theoretically and experimentally. The Cowan code was used to obtain the emission spectra for different transitions of the holmium target. In the experimental work, the evolution of the plasma was studied by acquiring spectral images at different laser pulse energies (600, 650, 700, 750, and 800 mJ) and at repetition rates of 1 Hz and 10 Hz, in the UV region (200–400 nm). The results indicate that the emission line intensities increase with increasing laser pulse energy and repetition rate. The strongest emission spectra appeared at a laser pulse energy of 800 mJ and a 10 Hz repetition rate, at λ = 345.64 nm, with the maximum intensi
This study proposes a hybrid predictive maintenance framework that integrates the Kolmogorov-Arnold Network (KAN) with the Short-Time Fourier Transform (STFT) for intelligent fault diagnosis in industrial rotating machinery. The method is designed to address challenges posed by non-linear and non-stationary vibration signals under varying operational conditions. Experimental validation using the FALEX multispecimen test bench demonstrated a high classification accuracy of 97.5%, outperforming traditional models such as SVM, Random Forest, and XGBoost. The approach maintained robust performance across dynamic load scenarios and noisy environments, with precision and recall exceeding 95%. Key contributions include a hardware-accelerated K
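The STFT front end described above can be sketched in a few lines. This is an illustrative NumPy implementation under assumed parameters (Hann window, 256-sample frames, 50% overlap, a synthetic vibration signal), not the authors' pipeline:

```python
import numpy as np

def stft_mag(x, frame_len=256, hop=128):
    """Hann-windowed, overlapping FFT frames -> magnitude spectrogram.

    Returns an array of shape (n_frames, frame_len // 2 + 1); each row is
    the spectrum of one time slice, the 2D map a classifier would consume.
    """
    window = np.hanning(frame_len)
    n_frames = 1 + (len(x) - frame_len) // hop
    frames = np.stack([x[i * hop : i * hop + frame_len] * window
                       for i in range(n_frames)])
    return np.abs(np.fft.rfft(frames, axis=1))

fs = 10_000                                  # assumed sampling rate, Hz
t = np.arange(0, 1.0, 1 / fs)
# synthetic vibration: 120 Hz shaft tone plus a growing 1.8 kHz "fault" band
x = np.sin(2 * np.pi * 120 * t) + 0.5 * t * np.sin(2 * np.pi * 1800 * t)
S = stft_mag(x)
print(S.shape)                               # (77, 129)
```

Because the fault component is non-stationary (its amplitude grows over time), it only becomes separable in the time-frequency plane, which is why an STFT stage precedes the KAN classifier.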
Nowadays, people's expression on the Internet is no longer limited to text; with the rise of short video in particular, large volumes of multi-modal data such as text, pictures, audio, and video have emerged. Compared to single-mode data, multi-modal data carries far more information, and mining it can help computers better understand human emotional characteristics. However, because multi-modal data shows clear dynamic time-series features, the fusion process must resolve the dynamic correlations both within a single modality and between different modalities in the same application scene. To solve this problem, in this paper, a feature extraction framework of
Human Interactive Proofs (HIPs) are automatic inverse Turing tests intended to differentiate between people and malicious computer programs. Building a good HIP system is a challenging task, since the resulting HIP must be secure against attacks while remaining practical for humans. Text-based HIPs are one of the most popular HIP types; they exploit the ability of humans to read text images better than Optical Character Recognition (OCR) software. However, current text-based HIPs have not kept pace with the rapid development of computer vision techniques: they are either very easily passed or very hard to solve, and this motivates
In recent years, the number of applications using mobile wireless sensor networks (WSNs) has increased, with localization used for monitoring and for obtaining data from hazardous areas. The location of an event is critical in a WSN, as sensed data is almost meaningless without location information. In this paper, two Monte Carlo based localization schemes, termed MCL and MSL*, are studied. MCL obtains its location estimate through anchor nodes, whereas MSL* uses both anchor nodes and normal nodes. Using normal nodes increases accuracy and reduces dependency on anchor nodes, but increases communication cost. For this reason, we introduce a new approach, called low-communication-cost schemes, to reduce communication
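One iteration of a Monte Carlo localization scheme of the MCL family can be sketched as follows. This is a generic predict/filter/resample illustration with assumed parameters (uniform motion model, circular radio range), not the MCL or MSL* algorithms as published:

```python
import numpy as np

rng = np.random.default_rng(1)

def mcl_step(particles, anchors, radio_range, max_speed):
    """One MCL iteration: predict particle motion, then keep only
    particles consistent with hearing every one-hop anchor."""
    # prediction: the node moved at most max_speed since the last step
    moved = particles + rng.uniform(-max_speed, max_speed, particles.shape)
    # filtering: a valid position lies within radio_range of each heard anchor
    dists = np.linalg.norm(moved[:, None, :] - anchors[None, :, :], axis=2)
    survivors = moved[np.all(dists <= radio_range, axis=1)]
    if len(survivors) == 0:
        return moved          # degenerate case: keep all, retry next round
    # resampling: draw back up to the original particle count
    idx = rng.integers(len(survivors), size=len(particles))
    return survivors[idx]

anchors = np.array([[0.0, 0.0], [10.0, 0.0]])   # two anchors with known positions
particles = rng.uniform(0, 10, size=(200, 2))   # initial position guesses
particles = mcl_step(particles, anchors, radio_range=12.0, max_speed=1.0)
estimate = particles.mean(axis=0)               # node's position estimate
print(particles.shape)                          # (200, 2)
```

The filtering step is where anchor messages enter; MSL*'s use of normal nodes adds further constraints (and hence accuracy) at the cost of the extra messages the abstract mentions.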
Classification of imbalanced data is an important issue. Many algorithms have been developed for classification, such as Back Propagation (BP) neural networks, decision trees, and Bayesian networks, and they have been used repeatedly in many fields. These algorithms suffer from the imbalanced-data problem, in which some classes have far more instances than others; imbalanced data lead to poor performance and a bias toward the majority class at the expense of the others. In this paper, we propose three techniques based on Over-Sampling (O.S.) for processing an imbalanced dataset, redistributing it, and converting it into a balanced one. These techniques are Improved Synthetic Minority Over-Sampling Technique (Improved SMOTE), Border
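The core SMOTE idea that these techniques build on can be sketched briefly. This is a minimal SMOTE-style interpolation for illustration only (names and parameters are assumptions), not the paper's improved variants:

```python
import numpy as np

rng = np.random.default_rng(0)

def smote_like(minority, n_new, k=3):
    """Generate synthetic minority samples by interpolating each chosen
    point toward one of its k nearest minority neighbours (SMOTE-style)."""
    synthetic = []
    for _ in range(n_new):
        i = rng.integers(len(minority))
        x = minority[i]
        d = np.linalg.norm(minority - x, axis=1)   # distances within the class
        neighbours = np.argsort(d)[1 : k + 1]      # skip the point itself
        nb = minority[rng.choice(neighbours)]
        # new sample lies on the segment between x and its neighbour
        synthetic.append(x + rng.random() * (nb - x))
    return np.array(synthetic)

minority = rng.normal(size=(10, 2))   # toy minority class
new = smote_like(minority, n_new=20)  # grow it to rebalance the dataset
print(new.shape)                      # (20, 2)
```

Because samples are interpolated rather than duplicated, the minority region is filled in instead of just reweighted, which is what reduces the majority-class bias described above.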