Automatic speaker recognition can achieve remarkable performance when training and test conditions match; conversely, accuracy drops significantly under mismatched, noisy conditions. Feature extraction also strongly affects performance, and Mel-frequency cepstral coefficients (MFCCs) are the features most commonly used in this field. The literature reports that recognition performance depends heavily on how closely the training and testing conditions correspond. Taken together, these observations support using MFCC features when the training and test environments are similar. However, in the presence of noise and reverberation, MFCC performance is unreliable. To address this, we propose a new feature, 'entrocy', for accurate and robust speaker recognition, employed mainly to support MFCC coefficients in noisy environments. Entrocy is the Fourier transform of the entropy, a measure of the fluctuation of the information in sound segments over time. Entrocy features are combined with MFCCs to form a composite feature set, which is evaluated with the Gaussian mixture model (GMM) speaker recognition method. The proposed method shows improved recognition accuracy over a range of signal-to-noise ratios.
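The abstract defines entrocy as the Fourier transform of the entropy of sound segments over time. A minimal sketch of that computation is given below; the segment length, the histogram-based entropy estimate, and the bin count are illustrative assumptions, since the abstract does not specify them.

```python
import numpy as np

def entrocy(signal, seg_len=512, n_bins=32):
    """Sketch of the 'entrocy' feature: the Fourier transform of the
    frame-wise entropy sequence of a signal. Segment length, entropy
    estimator, and bin count are assumptions, not the paper's values."""
    n_seg = len(signal) // seg_len
    entropies = []
    for i in range(n_seg):
        seg = signal[i * seg_len:(i + 1) * seg_len]
        # Histogram estimate of the amplitude distribution in this segment
        hist, _ = np.histogram(seg, bins=n_bins)
        p = hist / hist.sum()
        p = p[p > 0]
        # Shannon entropy (bits) of the segment
        entropies.append(-np.sum(p * np.log2(p)))
    # Entrocy: magnitude spectrum of the entropy-over-time sequence
    return np.abs(np.fft.rfft(np.asarray(entropies)))
```

The resulting vector could then be concatenated with the MFCCs of the same utterance before GMM modeling.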
Signature verification involves ambiguous situations in which a signature may resemble many reference samples or may differ from them because of natural handwriting variation. By representing the features and similarity scores produced by the matching algorithm as fuzzy sets and capturing the degrees of membership, non-membership, and indeterminacy, a neutrosophic engine can contribute significantly to signature verification by addressing the inherent uncertainties and ambiguities present in signatures. However, type-1 neutrosophic logic assigns these membership functions fixed values, which may not adequately capture the varying degrees of uncertainty in signature characteristics. Type-1 neutrosophic representation is also unable to adjust to various
The multiple linear regression model is one of the important regression models used in analyses across different fields of science, such as business, economics, medicine, and the social sciences. Multicollinearity in the data has undesirable effects on analysis results and is a major problem in multiple linear regression: in its simplest form, it causes the model parameters to depart from their expected scientific properties. Another important problem in regression analysis is the presence of high-leverage points in the data, which also have undesirable effects on the results of the analysis. In this research, we present some of
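A standard diagnostic for the multicollinearity problem the abstract describes is the variance inflation factor (VIF). The sketch below computes it from scratch with least squares; the rule of thumb that VIF values well above 10 signal problematic collinearity is a common convention, not a claim from the paper.

```python
import numpy as np

def vif(X):
    """Variance inflation factors for the columns of a design matrix X.
    VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from regressing column j
    on the remaining columns (with an intercept)."""
    X = np.asarray(X, dtype=float)
    n, k = X.shape
    out = []
    for j in range(k):
        y = X[:, j]
        Z = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        resid = y - Z @ beta
        r2 = 1.0 - (resid @ resid) / ((y - y.mean()) ** 2).sum()
        out.append(1.0 / (1.0 - r2))
    return np.array(out)
```

Columns that are nearly linear combinations of the others produce very large VIF values, flagging the instability in the estimated coefficients.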
This research compared two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the Downhill Simplex algorithm. Two data cases were considered: the first assumed the original (uncontaminated) data, while the second assumed data contamination. Simulation experiments were conducted for different sample sizes, initial parameter values, and levels of contamination. The Downhill Simplex algorithm was found to be the best method for estimating the parameters, the probability function, and the reliability function of the compound distribution for both natural and contaminated data.
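The Downhill Simplex algorithm named here is the derivative-free Nelder-Mead method, which suits likelihoods that are awkward to differentiate. Since the abstract does not reproduce the four-parameter compound density, the sketch below uses a plain two-parameter Weibull as a stand-in to show the estimation pattern: minimize the negative log-likelihood with `scipy.optimize.minimize(method="Nelder-Mead")`.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

# Illustrative stand-in: ML estimation of a two-parameter Weibull via the
# Downhill Simplex (Nelder-Mead) algorithm. The paper's four-parameter
# compound exponential Weibull-Poisson density would replace logpdf below.
data = weibull_min.rvs(c=1.5, scale=2.0, size=500, random_state=0)

def neg_log_lik(theta):
    shape, scale = theta
    if shape <= 0 or scale <= 0:
        return np.inf  # keep the simplex inside the valid parameter region
    return -weibull_min.logpdf(data, c=shape, scale=scale).sum()

res = minimize(neg_log_lik, x0=[1.0, 1.0], method="Nelder-Mead")
print(res.x)  # estimates close to the true (1.5, 2.0)
```

The same loop structure supports the contamination study: refit on data with a fraction of outliers injected and compare the recovered parameters.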
Many researchers have addressed the shear behavior of Reinforced Concrete (RC) beams using different kinds of strengthening in the shear regions and steel fibers. In the current paper, the effect of multiple parameters, such as using one percentage of Steel Fibers (SF) with and without stirrups, and using neither stirrups nor steel fibers, on the shear behavior of RC beams has been studied and compared using Finite Element (FE) analysis. Three-dimensional (3D) models of RC beams are developed and analyzed using the ABAQUS commercial software. The models were validated by comparing their results with experimental tests; four beams in total were modeled for validation purposes. Extensive pa
This research represents a practical attempt to calibrate and verify a hydraulic model for the Blue Nile River. The calibration procedures are performed using observed data for a previous period and comparing them with the calibration results, while the verification requirements are met by applying observed data for a later period and comparing them with the verification results. The study objective covered the relationship of the river terrain with the distance between the assumed points of dam failure along the river length. The computed model values and the observed data should conform to the theoretical analysis and the overall verification performance of the model by comparing it with anothe
This paper presents a robust control method for trajectory control of a robotic manipulator. Standard Computed Torque Control (CTC) is an important method in robotic control systems, but it is not robust to system uncertainty and external disturbance. The proposed method overcomes these problems by adding a robustification term to the standard CTC. The stability of the proposed control method is proven by the Lyapunov stability theorem. The performance of the presented controller is tested in the MATLAB-Simulink environment and compared with different control methods to illustrate its robustness and performance.
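The idea of adding a robustification term to CTC can be sketched on a single-link pendulum: the controller cancels the nominal dynamics, adds PD feedback, and injects an extra smooth switching term to absorb model mismatch. The plant, gains, uncertainty level, and `tanh` boundary layer below are all illustrative assumptions, not the paper's design.

```python
import numpy as np

# Sketch: computed torque control plus a robustification term, on a
# single-link pendulum (illustrative plant, not the paper's manipulator).
m, l, g, b = 1.0, 1.0, 9.81, 0.1   # nominal model parameters
m_true = 1.3                        # actual plant mass (+30% uncertainty)
Kp, Kd, k_rob = 25.0, 10.0, 3.0     # PD gains and robust-term gain

def simulate(dt=1e-3, T=5.0):
    q, qd = 0.0, 0.0
    for i in range(int(T / dt)):
        t = i * dt
        qr, qrd, qrdd = np.sin(t), np.cos(t), -np.sin(t)  # reference
        e, ed = qr - q, qrd - qd
        s = ed + (Kp / Kd) * e           # sliding-like error variable
        v = k_rob * np.tanh(s / 0.05)    # robust term (smooth sign())
        # CTC on the nominal model, with the robustification term v added
        tau = m * l**2 * (qrdd + Kd * ed + Kp * e + v) \
              + m * g * l * np.sin(q) + b * qd
        # true plant dynamics (mismatched mass)
        qdd = (tau - m_true * g * l * np.sin(q) - b * qd) / (m_true * l**2)
        qd += qdd * dt
        q += qd * dt
    return abs(np.sin(T) - q)            # final tracking error

print(simulate())
```

With `k_rob = 0` the same loop reduces to plain CTC, which is a quick way to see the tracking error the robust term removes under the mass mismatch.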
Biomarkers to detect Alzheimer's disease (AD) would enable patients to gain access to appropriate services and may facilitate the development of new therapies. Given the large number of people affected by AD, there is a need for a low-cost, easy-to-use method to detect AD patients. Potentially, the electroencephalogram (EEG) can play a valuable role in this, but at present no single EEG biomarker is robust enough for use in practice. This study aims to provide a methodological framework for the development of robust EEG biomarkers to detect AD with clinically acceptable performance by exploiting the combined strengths of key biomarkers. A large number of existing and novel EEG biomarkers associated with slowing of the EEG, reductio
A radioimmunoassay test method was developed for two groups of patients with uterine tumors, examining benign and malignant tumor tissue labeled with a radioactive iodine isotope.