Proxy-based sliding mode control (PSMC) is an improved version of PID control that combines the features of PID control and sliding mode control (SMC) with continuously dynamic behaviour. However, the stability of the control architecture may not be well addressed. Consequently, this work focuses on modifying the original PSMC by adding an adaptive approximation compensator (AAC) term for vibration control of an Euler-Bernoulli beam. The role of the AAC term is to compensate for unmodelled dynamics and to simplify the stability proof. The stability of the proposed control algorithm is systematically proved using Lyapunov theory. A multi-modal equation of motion is derived using the Galerkin method; the state variables of the multi-modal equation are expressed in terms of modal amplitudes, which the proposed control system must regulate. The proposed control structure is implemented on a simply supported beam with two piezo-patches whose locations on the beam are optimally chosen, and the simulation experiments are performed using the MATLAB/SIMULINK package. A detailed comparison study covering three scenarios is carried out. Scenario 1 disturbs the smart beam while no feedback loop is established (open-loop system); in scenario 2, a PD controller is applied to the vibrating beam; and scenario 3 implements the PSMC+AAC. For all three scenarios, two types of disturbance are applied separately: 1) an impulse force of 1 N peak and 1 s pulse width, and 2) a sinusoidal disturbance of 0.5 N amplitude and 20 Hz frequency. For the impulse disturbance, the results show the superiority of the PSMC+AAC over the conventional PD control, whereas both the PSMC+AAC and the PD control work well for the sinusoidal disturbance and the superiority of the PSMC is not clear.
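As a minimal sketch of the simulation scenarios, the following compares the open-loop beam (scenario 1) with PD control (scenario 2) on a single Galerkin mode. The modal frequency, damping ratio, and PD gains are hypothetical placeholders, and the PSMC+AAC law itself is not reproduced here.

```python
# Single-mode sketch of the beam vibration scenarios (assumed parameters).
# One-mode Galerkin truncation gives  q'' + 2*zeta*w*q' + w^2*q = d(t) + u(t).
import numpy as np
from scipy.integrate import solve_ivp

omega, zeta = 2 * np.pi * 20.0, 0.01   # hypothetical first-mode parameters
kp, kd = 400.0, 15.0                   # hypothetical PD gains

def disturbance(t):
    # Disturbances from the study: 1 N impulse of 1 s width, or a
    # 0.5 N, 20 Hz sinusoid (swap the return line to try it).
    return 1.0 if t < 1.0 else 0.0
    # return 0.5 * np.sin(2 * np.pi * 20.0 * t)

def rhs(t, x, closed_loop):
    q, qd = x
    u = -(kp * q + kd * qd) if closed_loop else 0.0  # scenario 2 vs. scenario 1
    return [qd, disturbance(t) + u - 2 * zeta * omega * qd - omega**2 * q]

t_eval = np.linspace(0.0, 5.0, 2000)
open_loop = solve_ivp(rhs, (0, 5), [0, 0], args=(False,), t_eval=t_eval)
pd_loop = solve_ivp(rhs, (0, 5), [0, 0], args=(True,), t_eval=t_eval)
print("peak |q|, open loop:  ", np.abs(open_loop.y[0]).max())
print("peak |q|, PD control: ", np.abs(pd_loop.y[0]).max())
```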
Beef and chicken meat were used to obtain sarcoplasm, and the chicken sarcoplasm was injected into a rabbit to prepare an antibody against it. The antiserum titre was 1/32, as determined by the immune double diffusion test. The same test showed the ability of some antisera to react with beef sarcoplasm, meaning that some proteins are shared between beef and chicken meat, which makes it difficult to depend on this immune method to detect the adulteration of beef with chicken meat. Therefore, the antibodies against beef sarcoplasm were removed from the serum by an immune absorption step to produce a specific serum against chicken sarcoplasm, which was used in the immune double diffusion test to qualitatively detect the adulteration of beef with at least 5% chicken meat, and the …
The aim of the study is to assess the risk factors that lead to myocardial infarction and their relation to some variables. The field study was carried out from the 1st of April to the end of September 2005. The sample of the study consisted of (100) patients in Ibn-Albeetar and Baghdad Teaching Hospital. The results of the study indicated the following: 45% of patients in the age group (41-50) were more exposed to the disease, and no significant difference was seen in the level of education, marital status, weight, or height. The results show that there are significant differences in risk factors such as hypertension, blood cholesterol level, and diabetes when analysed by t-test at the level of P < 0.01, and there are significant differences in smoking …
The analysis of survival and reliability is considered among the important topics and methods of vital statistics at the present time because of its importance in various demographic, medical, industrial, and engineering fields. This research focuses on generating random sample data from the Generalized Gamma (GG) probability distribution using the Inverse Transformation Method (ITM). Since the distribution function involves the incomplete Gamma integral, classical estimation becomes more difficult, so a numerical approximation method is illustrated and then the survival function is estimated. The survival function was estimated by Monte Carlo simulation. The Entropy method was used for the …
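A small sketch of the inverse-transformation method for the Generalized Gamma distribution and a Monte Carlo estimate of its survival function follows. The shape parameters and sample size are assumed for illustration only.

```python
# Inverse-transformation sampling from the Generalized Gamma distribution
# and a Monte Carlo survival-function estimate (assumed parameters).
import numpy as np
from scipy.stats import gengamma

rng = np.random.default_rng(0)
a, c, n = 2.0, 1.5, 5000            # hypothetical GG shape parameters

# Inverse transformation: u ~ U(0,1), x = F^{-1}(u).  The quantile function
# requires numerically inverting the incomplete Gamma integral, which is the
# difficulty the abstract mentions; scipy performs that inversion internally.
u = rng.uniform(size=n)
x = gengamma.ppf(u, a, c)

# Monte Carlo survival estimate S(t) = P(X > t) versus the exact value.
for t in (0.5, 1.0, 2.0):
    print(f"t={t}: empirical S={np.mean(x > t):.4f}, "
          f"exact S={gengamma.sf(t, a, c):.4f}")
```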
As a result of the significance of image compression in reducing the volume of data, this compression is permanently necessary: the data can then be transferred more quickly over communication channels and kept in less memory space. In this study, an efficient compression system is suggested; it depends on using transform coding (the Discrete Cosine Transform or the bi-orthogonal (tap-9/7) wavelet transform) and the LZW compression technique. The suggested scheme was applied to colour and grey models, and the transform coding was applied to decompose each colour and grey sub-band individually. The quantization process is performed, followed by LZW coding to compress the images. The suggested system was applied to a set of seven stand…
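The pipeline can be sketched on one grey-scale block: 2-D DCT, uniform quantization, then LZW coding. The block size, quantization step, and the toy LZW coder below are illustrative assumptions, not the paper's implementation (the wavelet branch is omitted).

```python
# Transform coding + quantization + LZW, sketched on a single 8x8 block.
import numpy as np
from scipy.fft import dctn

def lzw_encode(symbols):
    # Basic LZW over a sequence of integer symbols rendered as strings.
    dictionary = {str(s): i for i, s in enumerate(sorted(set(symbols)))}
    w, codes = "", []
    for s in map(str, symbols):
        ws = w + "," + s if w else s
        if ws in dictionary:
            w = ws
        else:
            codes.append(dictionary[w])
            dictionary[ws] = len(dictionary)
            w = s
    if w:
        codes.append(dictionary[w])
    return codes

block = np.random.default_rng(1).integers(0, 256, (8, 8)).astype(float)
coeffs = dctn(block, norm="ortho")             # transform coding step (DCT)
quantized = np.round(coeffs / 16).astype(int)  # uniform quantization, step 16
codes = lzw_encode(quantized.flatten().tolist())
print(f"{quantized.size} coefficients -> {len(codes)} LZW codes")
```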
In this research, a health phenomenon that has a significant impact on different age groups in the community, tonsillitis, is treated. First, a seasonal autoregressive moving average (SARMA) model is fitted, using the Box-Jenkins methodology, to the numbers of tonsillitis cases in the city of Mosul for the period 2004-2009, and these numbers are predicted for the coming twelve months. The model found to best represent the data of the phenomenon is SARMA(1,1)×(2,1)12. On the other side, explanatory variables were used, namely the maximum temperature, the minimum temperature, and sol…
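A hedged sketch of fitting the SARMA(1,1)×(2,1)12 model named in the abstract, in the Box-Jenkins spirit, is shown below; the monthly case counts are synthetic placeholders, since the Mosul tonsillitis series is not reproduced here.

```python
# Fit SARMA(1,1)x(2,1)12 to a synthetic monthly series and forecast 12 months.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(2)
months = pd.date_range("2004-01", periods=72, freq="MS")   # 2004-2009
seasonal = 50 + 20 * np.sin(2 * np.pi * np.arange(72) / 12)
cases = pd.Series(seasonal + rng.normal(0, 5, 72), index=months)

# SARMA(1,1)x(2,1)12: ARMA orders (p=1, q=1), seasonal (P=2, Q=1), s=12,
# with no differencing (d = D = 0) since the model is ARMA, not ARIMA.
model = SARIMAX(cases, order=(1, 0, 1), seasonal_order=(2, 0, 1, 12))
fit = model.fit(disp=False)
print(fit.forecast(steps=12))   # prediction for the coming twelve months
```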
In this study, we made a comparison between the LASSO and SCAD methods, which are two special methods for dealing with models in partial quantile regression. The Nadaraya-Watson kernel was used to estimate the non-parametric part; in addition, the rule-of-thumb method was used to estimate the smoothing bandwidth (h). The penalty methods proved efficient in estimating the regression coefficients, but the SCAD method was the best according to the mean squared error (MSE) criterion, after estimating the missing data using the mean imputation method.
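Two ingredients the abstract names can be sketched briefly: a Nadaraya-Watson kernel estimate with a rule-of-thumb bandwidth, and an L1 (LASSO-type) penalized median regression for the parametric part. The SCAD penalty needs a custom solver and is omitted; all data below are synthetic, and this is not the authors' code.

```python
# Nadaraya-Watson smoothing plus LASSO-penalized quantile regression (sketch).
import numpy as np
from sklearn.linear_model import QuantileRegressor

rng = np.random.default_rng(3)
x = rng.uniform(0, 1, 200)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.3, 200)

# Rule-of-thumb bandwidth h = 1.06 * sigma * n^(-1/5) (Silverman's rule).
h = 1.06 * x.std() * len(x) ** (-0.2)

def nadaraya_watson(x0):
    # Gaussian-kernel weighted average of y around x0.
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)
    return np.sum(w * y) / np.sum(w)

print("m_hat(0.25) ~", nadaraya_watson(0.25))

# LASSO-penalized median (quantile 0.5) regression; alpha is the L1 weight.
X = rng.normal(size=(200, 5))
yq = X @ np.array([1.5, 0.0, -2.0, 0.0, 0.5]) + rng.normal(0, 1, 200)
qr = QuantileRegressor(quantile=0.5, alpha=0.1).fit(X, yq)
print("penalized coefficients:", np.round(qr.coef_, 3))
```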
Image classification is the process of finding common features in images from various classes and applying them to categorize and label the images. The main problems of the image classification process are the abundance of images, the high complexity of the data, and the shortage of labelled data, which present the key obstacles. The cornerstone of image classification is evaluating the convolutional features retrieved from deep learning models and training machine learning classifiers on them. This study proposes a new approach of "hybrid learning" by combining deep learning with machine learning for image classification, based on convolutional feature extraction using the VGG-16 deep learning model and seven class…
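The hybrid-learning idea can be sketched as VGG-16 used as a frozen convolutional feature extractor feeding a classical machine-learning classifier. The images and labels below are placeholders, and an SVM stands in for whichever of the seven classifiers the study used.

```python
# VGG-16 convolutional features + a classical ML classifier (sketch).
import numpy as np
from tensorflow.keras.applications import VGG16
from tensorflow.keras.applications.vgg16 import preprocess_input
from sklearn.svm import SVC

# Frozen VGG-16 without its dense head; global pooling yields 512-d features.
extractor = VGG16(weights="imagenet", include_top=False, pooling="avg")

images = np.random.rand(20, 224, 224, 3).astype("float32") * 255  # placeholder batch
labels = np.random.randint(0, 2, 20)                              # placeholder labels

features = extractor.predict(preprocess_input(images), verbose=0)  # (20, 512)
clf = SVC(kernel="rbf").fit(features, labels)                      # ML classifier stage
print("train accuracy:", clf.score(features, labels))
```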
A hand gesture recognition system provides a robust and innovative solution for nonverbal communication through human-computer interaction. Deep learning models have excellent potential for use in recognition applications. To overcome related issues, most previous studies have proposed new model architectures or have fine-tuned pre-trained models. Furthermore, these studies relied on one standard dataset for both training and testing; thus, their accuracy is reasonable. Unlike these works, the current study investigates two deep learning models with intermediate layers to recognize static hand gesture images. Both models were tested on different datasets, adjusted to suit each dataset, and then trained under different m…
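Since the study's two models are not identified in this truncated abstract, the pattern can only be illustrated: a pre-trained network with an intermediate layer feeding a new classification head for static gesture images. MobileNetV2 and the 10-class output below are stand-in assumptions.

```python
# Pre-trained backbone, intermediate-layer tap, new gesture head (sketch).
import tensorflow as tf
from tensorflow.keras.applications import MobileNetV2

base = MobileNetV2(weights="imagenet", include_top=False,
                   input_shape=(224, 224, 3))
base.trainable = False                      # keep pre-trained features fixed

# Tap an intermediate layer rather than only the final convolutional block.
intermediate = base.get_layer("block_13_expand_relu").output
x = tf.keras.layers.GlobalAveragePooling2D()(intermediate)
x = tf.keras.layers.Dense(128, activation="relu")(x)
outputs = tf.keras.layers.Dense(10, activation="softmax")(x)  # 10 assumed gestures

model = tf.keras.Model(base.input, outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
# model.fit(train_images, train_labels, epochs=5)  # dataset-specific training
```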