Virtual decomposition control (VDC) is an efficient tool for handling the full-dynamics-based control problem of complex robots. However, the regressor-based adaptive control that VDC uses to control every subsystem and to estimate the unknown parameters demands specific knowledge of the system physics. Therefore, in this paper we focus on reformulating the VDC equations for a serial-chain manipulator using the adaptive function approximation technique (FAT), which does not require such specific knowledge. The dynamic matrices of each subsystem's dynamic equation (e.g., link and joint) are approximated by orthogonal functions, which keep the approximation error small. The control law, the virtual stability of every subsystem, and the stability of the entire robotic system are proved in this work. The computational complexity of the FAT is then compared with that of the regressor-based approach. Despite the apparent advantage of FAT in avoiding the regressor matrix, its computational complexity can hinder implementation because the dynamic matrices of the link subsystem are represented by two large sparse matrices. Consequently, FAT-based adaptive VDC requires further work to improve the representation of the dynamic matrices of the target subsystem. Two case studies, a 2-R manipulator and a 6-DOF planar biped robot, are simulated in Matlab/Simulink for verification.
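As a rough illustration of the FAT idea, each entry of an unknown dynamic matrix can be written as a weighted sum of orthogonal basis functions with adaptively updated weights. The sketch below is not the paper's formulation; the choice of Chebyshev polynomials, the number of terms, and the gradient-type update gain are illustrative assumptions.

```python
import numpy as np

def chebyshev_basis(q, n_terms=8):
    """First n_terms Chebyshev polynomials T_k(q) evaluated at a scalar q in [-1, 1]."""
    T = np.zeros(n_terms)
    T[0] = 1.0
    if n_terms > 1:
        T[1] = q
    for k in range(2, n_terms):
        T[k] = 2.0 * q * T[k - 1] - T[k - 2]  # recurrence: T_k = 2q*T_{k-1} - T_{k-2}
    return T

def approximate_entry(weights, q):
    """FAT-style approximation of one dynamic-matrix entry: M_ij(q) ~ w^T phi(q)."""
    return weights @ chebyshev_basis(q, len(weights))

def adapt_weights(weights, q, measured, gain=0.05):
    """Toy gradient-type adaptation of the weights toward a measured entry value."""
    phi = chebyshev_basis(q, len(weights))
    return weights + gain * (measured - weights @ phi) * phi
```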
Steganography is defined as hiding confidential information in some chosen media without leaving any clear evidence that the media's features have been changed. Most traditional hiding methods embed the message directly in the cover media (text, image, audio, or video). Some hiding techniques have a negative effect on the cover image, so the change in the carrier medium can sometimes be detected by humans or machines. The purpose of information hiding is to make this change undetectable. The current research focuses on a method based on a spiral search to prevent the detection of hidden information by humans and machines; the Structural Similarity Index (SSIM) measure is used to assess the accuracy and quality …
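The abstract does not spell out the spiral search, so the following is only a plausible sketch: pixel coordinates of the cover image are visited in an outward-in spiral order to select candidate embedding positions. The function name and traversal details are assumptions.

```python
def spiral_coords(rows, cols):
    """Yield (row, col) coordinates of a rows x cols image in an outward-in spiral order,
    starting from the top-left corner; a candidate ordering for choosing embedding pixels."""
    top, bottom, left, right = 0, rows - 1, 0, cols - 1
    while top <= bottom and left <= right:
        for c in range(left, right + 1):
            yield top, c
        for r in range(top + 1, bottom + 1):
            yield r, right
        if top < bottom:
            for c in range(right - 1, left - 1, -1):
                yield bottom, c
        if left < right:
            for r in range(bottom - 1, top, -1):
                yield r, left
        top, bottom, left, right = top + 1, bottom - 1, left + 1, right - 1
```

The SSIM comparison between the cover and stego images can then be computed with an off-the-shelf implementation such as skimage.metrics.structural_similarity.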
Classification of imbalanced data is an important issue. Many algorithms have been developed for classification, such as Back Propagation (BP) neural networks, decision trees, and Bayesian networks, and they have been used repeatedly in many fields. These algorithms suffer from the problem of imbalanced data, where some classes contain far more instances than others. Imbalanced data result in poor performance and bias toward one class at the expense of the others. In this paper, we propose three techniques based on Over-Sampling (O.S.) for processing an imbalanced dataset, redistributing it, and converting it into a balanced dataset. These techniques are the Improved Synthetic Minority Over-Sampling Technique (Improved SMOTE), Border…
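The paper's Improved SMOTE is not detailed in the abstract; below is a minimal sketch of the classic SMOTE interpolation step it builds on, assuming numeric features and an in-memory minority-class array.

```python
import numpy as np

def smote_oversample(X_minority, n_new, k=5, rng=None):
    """Generate n_new synthetic minority samples by interpolating between each chosen
    sample and one of its k nearest minority-class neighbours (classic SMOTE step)."""
    rng = rng or np.random.default_rng(0)
    n = len(X_minority)
    # pairwise distances within the minority class
    d = np.linalg.norm(X_minority[:, None, :] - X_minority[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    neighbours = np.argsort(d, axis=1)[:, :k]
    synthetic = []
    for _ in range(n_new):
        i = rng.integers(n)
        j = neighbours[i, rng.integers(k)]
        gap = rng.random()  # interpolation factor in [0, 1)
        synthetic.append(X_minority[i] + gap * (X_minority[j] - X_minority[i]))
    return np.array(synthetic)
```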
Free Space Optical (FSO) technology offers highly directional, high-bandwidth communication channels and can provide fiber-like data rates over short distances. To improve the security of data transmission in FSO networks, a secure communication method based on a chaotic technique is presented. We focus on a specific class of piecewise-linear one-dimensional chaotic maps. Simulation results indicate that this approach has the advantage of excellent correlation properties. We examine the security vulnerabilities of single FSO links and propose a solution by implementing the chaotic signal generator "reconfigurable tent map". As synchronization …
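As a minimal sketch of the kind of generator mentioned above, a skew tent map can be iterated to produce a chaotic sequence that drives a keystream; the map parameter, seed, and byte mapping below are illustrative assumptions, not the paper's reconfigurable design.

```python
def tent_map_sequence(x0, mu, length):
    """Iterate the skew tent map x_{n+1} = x/mu if x < mu else (1-x)/(1-mu),
    producing a chaotic sequence in (0, 1)."""
    assert 0.0 < x0 < 1.0 and 0.0 < mu < 1.0
    seq, x = [], x0
    for _ in range(length):
        x = x / mu if x < mu else (1.0 - x) / (1.0 - mu)
        seq.append(x)
    return seq

# Example: derive a byte keystream that could XOR-mask a data stream.
keystream = bytes(int(v * 256) % 256 for v in tent_map_sequence(0.37, 0.499, 64))
```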
The phenomenon of extreme values (maximum or rare values) is an important one, and two sampling techniques are used to deal with it: the peaks-over-threshold (POT) sampling technique and the annual maximum (AM) sampling technique, with the extreme value (Gumbel) distribution fitted to the AM sample and the generalized Pareto and exponential distributions fitted to the POT sample. The cross-entropy algorithm was applied in two of its methods: the first estimates using order statistics, and the second using order statistics and the likelihood ratio. A third method is proposed by the researcher. The MSE comparison criterion of the estimated parameters and the probability density function for each of the distributions were …
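For reference, a minimal sketch of two ingredients named above: the Gumbel density used for AM samples and the MSE criterion used to compare estimators. The cross-entropy estimation methods themselves are not reproduced here.

```python
import numpy as np

def gumbel_pdf(x, mu, beta):
    """Gumbel (extreme value type I) density, used for annual-maximum (AM) samples."""
    z = (x - mu) / beta
    return np.exp(-(z + np.exp(-z))) / beta

def mse(estimates, true_value):
    """Mean squared error of repeated parameter estimates against a known true value,
    the comparison criterion of the simulation study."""
    estimates = np.asarray(estimates, dtype=float)
    return np.mean((estimates - true_value) ** 2)
```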
This paper studies adaptive coded modulation for a coded OFDM system using punctured convolutional codes, channel estimation, equalization, and SNR estimation. Channel estimation based on a block-type pilot arrangement is performed by sending pilots on every subcarrier and using this estimate for a specific number of following symbols. The signal-to-noise ratio is estimated at the receiver and then sent to the transmitter through a feedback channel; according to the estimated SNR, the transmitter selects the modulation scheme and coding rate that keep the bit error rate below the requested BER. Simulation results show that better performance is confirmed for a target bit error rate (BER) of 10⁻³ as compared to c…
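A minimal sketch of the transmitter-side decision described above: the estimated SNR is mapped to the highest-rate modulation and coding pair whose threshold it exceeds. The thresholds and mode set are illustrative assumptions, not the values used in the paper.

```python
# Illustrative SNR-to-mode lookup for adaptive coded modulation.
MODES = [
    (6.0,  "QPSK",   "1/2"),
    (12.0, "16-QAM", "1/2"),
    (18.0, "16-QAM", "3/4"),
    (24.0, "64-QAM", "3/4"),
]

def select_mode(snr_db):
    """Return the highest-rate (modulation, coding rate) whose SNR threshold is met;
    fall back to the most robust mode otherwise."""
    chosen = ("BPSK", "1/2")  # most robust fallback
    for threshold, modulation, rate in MODES:
        if snr_db >= threshold:
            chosen = (modulation, rate)
    return chosen

print(select_mode(15.2))  # -> ('16-QAM', '1/2')
```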
This paper proposes a new password generation technique based on mouse motion and special locations identified by the number of clicks, to protect sensitive data for different companies. Two or three special click-point locations per user are proposed to increase password complexity. Unlike other currently available random password generators, the path and the number of clicks are set by the administrator, and authorized users must be trained on them.
This method aims to increase the number of combinations for graphical password generation using mouse motion for a limited number of users. A mathematical model is developed to calculate the performance …
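A hypothetical sketch of one way such a scheme could derive a password from an ordered sequence of click coordinates: each click is quantized to a grid cell and the cell sequence is hashed into characters. The grid size, hash, and alphabet are assumptions, not the paper's mathematical model.

```python
import hashlib

def password_from_clicks(clicks, grid=32, length=12):
    """Derive a deterministic password from ordered mouse-click coordinates by
    quantizing each click to a grid cell and hashing the cell sequence."""
    cells = [(x // grid, y // grid) for x, y in clicks]
    digest = hashlib.sha256(repr(cells).encode()).hexdigest()
    alphabet = "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789"
    return "".join(alphabet[int(digest[i:i + 2], 16) % len(alphabet)]
                   for i in range(0, 2 * length, 2))

print(password_from_clicks([(120, 340), (560, 75), (310, 310)]))
```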
A new technique for embedding image data into another BMP image is presented. The image data to be embedded is referred to as the signature image, while the image into which it is embedded is referred to as the host image. The host and signature images are first partitioned into 8x8 blocks and discrete cosine transformed (DCT); only the significant coefficients are retained, and the retained coefficients are then inserted into the transformed block along forward and backward zigzag scan directions. The result is then inversely transformed and saved as a BMP image file. The peak signal-to-noise ratio (PSNR) is used to evaluate the objective visual quality of the host image compared with the original image.
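Two of the building blocks above can be sketched directly: the forward zigzag ordering of an 8x8 DCT block and the PSNR measure used for quality evaluation. The backward scan is simply this ordering reversed; the exact coefficient-insertion rule is not reproduced here.

```python
import numpy as np

def zigzag_indices(n=8):
    """Return the (row, col) order of a forward zigzag scan over an n x n DCT block."""
    return sorted(((r, c) for r in range(n) for c in range(n)),
                  key=lambda rc: (rc[0] + rc[1],
                                  rc[0] if (rc[0] + rc[1]) % 2 else rc[1]))

def psnr(original, modified, peak=255.0):
    """Peak signal-to-noise ratio (dB) between the original and modified host images."""
    mse = np.mean((original.astype(float) - modified.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)
```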
In this research, we present the signature as a key for a biometric authentication technique. Moment invariants are used as a tool to decide whether a given signature belongs to a certain person or not. Eighteen volunteers provided 108 signatures as samples to test the proposed system, six from each person. Moment invariants are used to build a feature vector stored in the system. The Euclidean distance measure is used to compute the distance between the signatures of persons saved in the system and new samples acquired from the same persons, in order to make a decision about the new signature. Each signature is acquired by a scanner in JPG format at 300 DPI. Matlab is used to implement the system.
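A minimal sketch of the verification pipeline described above, using OpenCV's Hu moment invariants as the feature vector and a Euclidean distance threshold for the accept/reject decision; the log-scaling and the threshold value are assumptions rather than the paper's settings.

```python
import cv2
import numpy as np

def signature_features(path):
    """Read a scanned signature image, binarize it, and return the seven Hu moment
    invariants (log-scaled) as the feature vector."""
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    _, binary = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    hu = cv2.HuMoments(cv2.moments(binary)).flatten()
    return -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)  # log scale for comparable magnitudes

def is_same_person(stored_vector, new_vector, threshold=0.5):
    """Accept the new signature if its Euclidean distance to the stored template
    is below the (illustrative) threshold."""
    return np.linalg.norm(stored_vector - new_vector) < threshold
```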