As the smallest functional parts of the muscle, motor units (MUs) are considered the basic building blocks of the neuromuscular system. Monitoring MU recruitment, de-recruitment, and firing rate (by either invasive or surface techniques) leads to an understanding of motor control strategies and of their pathological alterations. EMG signal decomposition is the process of identifying and classifying individual motor unit action potentials (MUAPs) in the interference pattern detected with either intramuscular or surface electrodes. Signal processing techniques are used in EMG signal decomposition to address fundamental and physiological questions, and many techniques have been developed to decompose intramuscularly detected signals with various degrees of automation. This paper investigates the application of the autocorrelation function (ACF) method to decompose EMG signals into their frequency components. The proposed method was found to give considerably better frequency resolution than the short-time fast Fourier transform (STFFT), so more MUs can be distinguished.
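The frequency-resolution comparison can be illustrated with a small sketch. The signal, sampling rate, and component frequencies below are hypothetical stand-ins, not the paper's EMG recordings; the sketch only shows why a spectrum computed from the full-length autocorrelation (via the Wiener-Khinchin relation) resolves closely spaced components that a short-window STFT cannot.

```python
import numpy as np
from scipy.signal import stft

# Illustrative sketch (hypothetical signal, not the paper's EMG data): compare the
# frequency resolution of a spectrum computed from the full-length autocorrelation
# with a short-time FFT, for two closely spaced tones standing in for two MUs.
fs = 1000.0                                   # sampling rate, Hz (assumed)
t = np.arange(0, 2.0, 1 / fs)
x = np.sin(2 * np.pi * 20 * t) + np.sin(2 * np.pi * 23 * t)

# Autocorrelation over the whole record, then FFT of the ACF
# (Wiener-Khinchin: the transform of the ACF gives the power spectrum).
acf = np.correlate(x, x, mode="full")[len(x) - 1:]
f_acf = np.fft.rfftfreq(len(acf), 1 / fs)     # bin spacing = fs / len(acf) = 0.5 Hz
psd_acf = np.abs(np.fft.rfft(acf))
top2 = np.sort(f_acf[np.argsort(psd_acf)[-2:]])

# Short-time FFT with a 128-sample window: bin spacing fs / 128 ~ 7.8 Hz,
# too coarse to separate the 20 Hz and 23 Hz components.
f_stft, _, Zxx = stft(x, fs=fs, nperseg=128)

print(f"ACF-spectrum bin width: {f_acf[1] - f_acf[0]:.2f} Hz, peaks near {top2} Hz")
print(f"STFT bin width:         {f_stft[1] - f_stft[0]:.2f} Hz")
```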
This research deals with an unusual approach to analysing simple linear regression via linear programming using the two-phase method known in Operations Research (O.R.). The estimate here is found by solving an optimization problem after adding artificial variables Ri. Another method for analysing simple linear regression is also introduced, in which the conditional median of (y) is considered by minimizing the sum of absolute residuals, instead of the conditional mean of (y), which depends on minimizing the sum of squared residuals; this is called "median regression". Also, an iterative reweighted least squares procedure based on the absolute residuals as weights is performed here as another method to
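A minimal sketch of the reweighting idea, using synthetic data rather than the research's dataset: weighting each observation by the reciprocal of its absolute residual turns a weighted least-squares fit, iterated to convergence, into a least-absolute-deviations (median regression) fit.

```python
import numpy as np

# Minimal sketch (assumed synthetic data): least absolute deviations ("median")
# regression via iteratively reweighted least squares, with weights 1/|residual|.
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 50)
y = 2.0 + 0.8 * x + rng.standard_t(df=2, size=50)   # heavy-tailed noise

X = np.column_stack([np.ones_like(x), x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]          # ordinary least squares start

for _ in range(50):
    r = y - X @ beta
    w = 1.0 / np.maximum(np.abs(r), 1e-6)            # weights = 1 / |residual|
    sw = np.sqrt(w)
    beta_new = np.linalg.lstsq(sw[:, None] * X, sw * y, rcond=None)[0]
    if np.max(np.abs(beta_new - beta)) < 1e-8:       # stop when estimates settle
        beta = beta_new
        break
    beta = beta_new

print("median-regression estimates (intercept, slope):", beta)
```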
Survival analysis is the analysis of data in the form of times measured from a time origin until the occurrence of an end event. In medical research, the time origin is the date on which the individual or patient is registered in a study, such as a clinical trial comparing two or more types of medicine, and the endpoint is the death of the patient or the loss of the individual to follow-up. The data resulting from this process are called survival times; if the end event is not death, the resulting data are called time-to-event data. That is, survival analysis comprises the statistical steps and procedures for analysing data when the variable of interest is the time to an event. It could be d
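As a small illustration of how such time-to-event data are summarised (the times and censoring indicators below are hypothetical, and the Kaplan-Meier estimator is a standard tool rather than a method this abstract names), the survival function can be estimated as follows:

```python
import numpy as np

# Illustrative sketch (hypothetical data): Kaplan-Meier estimate of the survival
# function from right-censored survival times.
times  = np.array([5, 8, 8, 12, 15, 20, 22, 30], dtype=float)  # months to event/censoring
events = np.array([1, 1, 0, 1, 0, 1, 1, 0])                    # 1 = event, 0 = censored

order = np.argsort(times)
times, events = times[order], events[order]

surv = 1.0
for t in np.unique(times[events == 1]):          # step down only at event times
    d = np.sum((times == t) & (events == 1))     # events at time t
    n = np.sum(times >= t)                       # subjects still at risk at time t
    surv *= 1 - d / n
    print(f"t = {t:5.1f}  S(t) = {surv:.3f}")
```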
The aim of this study is to estimate the parameters and the reliability function of the Kumaraswamy distribution with two positive parameters (a, b > 0), a continuous probability distribution that shares many characteristics with the beta distribution while offering extra advantages.
The shape of the density function of this distribution and its most important characteristics are explained, and the two parameters (a, b) and the reliability function are estimated using the maximum likelihood method (MLE) and Bayes methods. Simulation experiments are conducted to examine the behaviour of the estimation methods for different sample sizes according to the mean squared error criterion; the results show that the Bayes is bet
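A minimal sketch of the maximum likelihood part (the simulated sample below is illustrative, not the study's experiment): for the Kumaraswamy density f(x; a, b) = a b x^(a-1) (1 - x^a)^(b-1) on (0, 1), the parameters are found by numerically minimising the negative log-likelihood, and the reliability function R(t) = (1 - t^a)^b is then evaluated at the estimates.

```python
import numpy as np
from scipy.optimize import minimize

# Minimal sketch (simulated data): MLE of the Kumaraswamy parameters (a, b) and
# the reliability function R(t) = (1 - t^a)^b for a sample on (0, 1).
def neg_log_lik(params, x):
    a, b = params
    if a <= 0 or b <= 0:
        return np.inf                             # keep the search in the valid region
    return -np.sum(np.log(a) + np.log(b)
                   + (a - 1) * np.log(x)
                   + (b - 1) * np.log1p(-x ** a))

rng = np.random.default_rng(1)
a_true, b_true, n = 2.0, 3.0, 200
u = rng.uniform(size=n)
x = (1 - (1 - u) ** (1 / b_true)) ** (1 / a_true) # inverse-CDF sampling

res = minimize(neg_log_lik, x0=[1.0, 1.0], args=(x,), method="Nelder-Mead")
a_hat, b_hat = res.x
print("MLE of (a, b):", a_hat, b_hat)

t = 0.5
print("estimated reliability R(0.5):", (1 - t ** a_hat) ** b_hat)
```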
Abstract: Microfluidic devices present unique advantages for the development of efficient drug assays and screening, and microfluidic platforms may offer a more rapid and cost-effective alternative. Fluids are confined in devices whose significant dimensions are on the micrometer scale. Due to this extreme confinement, the volumes used for drug assays are tiny (milliliters to femtoliters).
In this research, a microfluidic chip consisting of micro-channels carved into an acrylic (polymethyl methacrylate, PMMA) substrate was designed and fabricated using a carbon dioxide (CO2) laser machine. The CO2 laser parameters influence the width, depth, and roughness of the channels. In order to have regular
A crucial area of research in nanotechnology is the formation of environmentally benign nanoparticles. Both unicellular and multicellular organisms play an important role in synthesizing nanoparticles through the production of inorganic materials, either intracellularly or extracellularly. The agents (pigments, siderophores, cell-extracted metabolites, and reducing compounds) were used to prepare silver nanoparticles with different sizes and shapes. The color variations (dark yellow, slightly dark yellow, and golden yellow) arising from changes in the composition, size, and shape of the nanoparticles and their surrounding medium can be monitored using a UV-visible spectrophotometer. These effects are due to the phenomenon called surface plasmon resonance. The silver nanopa
In this research, the natural frequency of a cracked simply supported beam (with cracks at several positions and different depths) is investigated analytically, experimentally, and numerically with the ANSYS program, and the results are compared. The beam is made of iron with dimensions L×W×H = 0.84×0.02×0.02 m, density = 7680 kg/m3, and E = 200 GPa. A comparison between the ANSYS results and the experimental results shows a largest error of about 7.2% at a crack position of 42 cm and a depth of 6 mm; between the Rayleigh method and the experimental results, the largest error is about 6.4% for the same crack position and depth. From these error percentages it could be concluded that the Rayleigh method gives
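For reference, the uncracked baseline against which crack-induced frequency shifts are usually measured can be computed from Euler-Bernoulli theory with the stated beam properties; the short sketch below (not the paper's analysis) evaluates ω_n = (nπ/L)² √(EI/(ρA)) for the first few modes.

```python
import numpy as np

# Baseline sketch (uncracked simply supported beam, Euler-Bernoulli theory) using
# the beam properties quoted above; the cracked-beam results would be compared
# against these frequencies.
L, b, h = 0.84, 0.02, 0.02          # length, width, height in m
E, rho = 200e9, 7680.0              # elastic modulus (Pa), density (kg/m^3)

I = b * h ** 3 / 12                 # second moment of area, m^4
A = b * h                           # cross-sectional area, m^2

for n in range(1, 4):
    omega_n = (n * np.pi / L) ** 2 * np.sqrt(E * I / (rho * A))   # rad/s
    print(f"mode {n}: f = {omega_n / (2 * np.pi):.1f} Hz")
```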
In practical engineering problems, uncertainty exists not only in external excitations but also in structural parameters. This study investigates the influence of uncertainty in structural geometry, elastic modulus, mass density, and section dimensions on the stochastic earthquake response of portal frames subjected to random ground motions. The North-South component of the 1940 El Centro earthquake in California is selected as the ground excitation. Using the power spectral density function, the two-dimensional finite element model of the portal frame's base motion is modified to account for random ground motions. A probabilistic study of the portal frame structure using stochastic finite elements utilizing Monte Carlo simulation
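The Monte Carlo idea can be sketched on a deliberately simplified model (a single-storey portal frame idealised as a lumped-mass oscillator, with assumed nominal values and coefficients of variation, not the paper's finite element model): sampling the uncertain parameters and re-evaluating a response quantity shows how parameter scatter propagates into response scatter.

```python
import numpy as np

# Simplified sketch (assumed geometry and statistics, not the paper's FE model):
# Monte Carlo propagation of uncertainty in E, density, and section size through
# a single-storey portal frame idealised as a lumped-mass lateral oscillator.
rng = np.random.default_rng(42)
n_samples = 10_000

h_col, span = 3.0, 5.0                              # column height, beam span (m)
E    = rng.normal(200e9, 10e9, n_samples)           # Pa, 5 % coefficient of variation
rho  = rng.normal(7850.0, 392.5, n_samples)         # kg/m^3, 5 % c.o.v.
bsec = rng.normal(0.20, 0.01, n_samples)            # m, square section side, 5 % c.o.v.

I = bsec ** 4 / 12                                  # second moment of area
k = 2 * 12 * E * I / h_col ** 3                     # two fixed-fixed columns in parallel
m = rho * bsec ** 2 * (2 * h_col + span)            # lumped mass of columns and beam

f = np.sqrt(k / m) / (2 * np.pi)                    # lateral natural frequency, Hz
print(f"lateral frequency: mean {f.mean():.2f} Hz, c.o.v. {f.std() / f.mean():.2%}")
```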
The bit record is a part of the daily drilling report that contains information about the type and number of the bit used to drill the well, as well as data on the applied weight on bit (WOB), revolutions per minute (RPM), rate of penetration (ROP), pump pressure, footage drilled, and bit dull grade. In general, the bit record is a rich, brief account of the bit's life in the hole. The main purpose of this research is to select the most suitable bit to drill the next oil wells, because the right bit selection spares us more than one problem, whereas the wrong bit selection causes more than one problem. Many methods are related to bit selection; this research is familiar with four of thos
have suffered from deteriorating residential neighborhoods, poor economic, social, and urban living conditions of the population, and deteriorating infrastructure and superstructure services. These problems are by-products of these cities' rapid urbanization. Based on the principles of sustainable urban planning, and in order to achieve adequate opportunities for the lives of the population and provide them with sustainable livelihoods, upgrading policies have emerged along the lines of community participation, together with programmes to reform and develop those neighborhoods, raise their efficiency, and make them livable. Thus, the research problem was identified as "the absence of a comprehensive cognitive perception of the most prominent facto
In this paper, bi-criteria machine scheduling problems (BMSP) are solved, where the discussed problem considers the sum of completion times and the sum of late work simultaneously. In order to solve the suggested BMSP, some metaheuristic methods that produce good results are suggested. The suggested local search methods are simulated annealing and the bees algorithm. The results of the new metaheuristic methods are compared with the complete enumeration method, which is considered an exact method, and the results of the heuristics are then compared with each other to obtain the most efficient method.
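A minimal sketch of the simulated annealing component (the job data, equal criterion weights, and cooling schedule below are illustrative assumptions, not the paper's instances or tuning): a single-machine sequence is perturbed by swapping two jobs and evaluated on the sum of completion times plus the sum of late work, i.e. the part of each job processed after its due date.

```python
import math
import random

# Illustrative sketch (hypothetical jobs): simulated annealing over single-machine
# sequences, minimising sum of completion times + sum of late work with equal weights.
jobs = [(4, 10), (3, 6), (7, 15), (2, 5), (5, 12), (6, 20)]   # (processing time, due date)

def cost(seq):
    t = total_completion = total_late_work = 0
    for j in seq:
        p, d = jobs[j]
        t += p                                     # completion time of job j
        total_completion += t
        total_late_work += min(p, max(0, t - d))   # late work of job j
    return total_completion + total_late_work

random.seed(0)
current = list(range(len(jobs)))
best, best_cost = current[:], cost(current)
temp = 50.0
while temp > 0.01:
    i, k = random.sample(range(len(jobs)), 2)      # neighbour: swap two positions
    neighbour = current[:]
    neighbour[i], neighbour[k] = neighbour[k], neighbour[i]
    delta = cost(neighbour) - cost(current)
    if delta < 0 or random.random() < math.exp(-delta / temp):
        current = neighbour
        if cost(current) < best_cost:
            best, best_cost = current[:], cost(current)
    temp *= 0.995                                  # geometric cooling

print("best sequence:", best, "objective value:", best_cost)
```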