Emotion recognition has important applications in human-computer interaction. Various sources, such as facial expressions and speech, have been considered for interpreting human emotions. The aim of this paper is to develop an emotion recognition system based on facial expressions and speech, using a hybrid of machine-learning algorithms to enhance the overall performance of human-computer communication. For facial emotion recognition, a deep convolutional neural network is used for feature extraction and classification, whereas for speech emotion recognition, the zero-crossing rate, mean, standard deviation, and mel-frequency cepstral coefficient (MFCC) features are extracted and fed to a random forest classifier. In addition, a bi-modal system for recognising emotions from facial expressions and speech signals is presented. This is important because one modality may not provide sufficient information, or may be unavailable for reasons beyond operator control. To this end, decision-level fusion is performed using a novel weighting scheme based on the proportions of facial and speech impressions. The results show an average accuracy of 93.22%.
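The decision-level fusion described above can be sketched as a weighted combination of the per-class probabilities produced by the two modality-specific classifiers. The abstract does not give the paper's actual weighting scheme, so the weights and emotion labels below are illustrative placeholders, not the authors' values:

```python
# Hedged sketch of decision-level fusion for bi-modal emotion recognition.
# w_face and w_speech are illustrative, not the paper's learned proportions.

def fuse_decisions(face_probs, speech_probs, w_face=0.6, w_speech=0.4):
    """Combine per-class probabilities from the facial model and the
    speech model by a weighted sum, then return the arg-max label."""
    assert face_probs.keys() == speech_probs.keys()
    fused = {label: w_face * face_probs[label] + w_speech * speech_probs[label]
             for label in face_probs}
    return max(fused, key=fused.get), fused

# Example: the facial model favours "happy", the speech model "neutral".
face = {"happy": 0.7, "neutral": 0.2, "sad": 0.1}
speech = {"happy": 0.3, "neutral": 0.5, "sad": 0.2}
label, fused = fuse_decisions(face, speech)
# label == "happy" (0.6*0.7 + 0.4*0.3 = 0.54 beats neutral's 0.32)
```

In a real system the two probability vectors would come from the CNN's softmax output and the random forest's `predict_proba`, respectively, and the weights could reflect each modality's validation accuracy.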
For the most reliable and reproducible calibration or general testing of two immiscible liquids, such as water in engine oil, good emulsification is vital. This study explores the impact of emulsion quality on Fourier transform infrared (FT-IR) spectroscopy calibration standards for measuring water contamination in used or in-service engine oil, in an attempt to strengthen the specific guidelines of ASTM International standards for sample preparation. Using different emulsification techniques and readily available laboratory equipment, this work attempts to establish the ideal sample preparation technique for reliability, repeatability, and reproducibility in FT-IR analysis while still considering t
In the fields of image processing and computer vision, it is important to represent an image by its information. Image information comes from the image's features, extracted using feature detection/extraction techniques and described using feature descriptors. In computer vision, features define informative data. The human eye extracts information from a raw image effortlessly, but a computer cannot directly recognise image information; this is why various feature extraction techniques have been proposed and have progressed rapidly. This paper presents a general overview of the categories of image feature extraction.
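The idea that features turn raw pixels into informative data can be illustrated with one of the simplest low-level features: the gradient magnitude, on which detectors such as Harris corners, SIFT, and ORB build. This is a minimal stdlib-only sketch on a toy image, not any specific technique surveyed in the paper:

```python
# Minimal feature-extraction illustration: a tiny grayscale image (nested
# lists) reduced to per-pixel gradient magnitudes via central differences.
import math

def gradient_magnitude(img):
    """Return |grad I| for interior pixels; borders stay 0."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = (img[y][x + 1] - img[y][x - 1]) / 2.0
            gy = (img[y + 1][x] - img[y - 1][x]) / 2.0
            out[y][x] = math.hypot(gx, gy)
    return out

# A vertical edge: left half dark (0), right half bright (255).
image = [[0, 0, 255, 255] for _ in range(4)]
features = gradient_magnitude(image)
# Interior pixels on the edge get a large response (127.5); flat regions 0.
```

Real pipelines compute such responses with optimised libraries (e.g. OpenCV) and then build descriptors on top of them, but the principle is the same: replace raw intensities with values a classifier can reason about.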
Sphingolipids are key components of eukaryotic membranes, particularly the plasma membrane. The biosynthetic pathway for the formation of these lipid species is largely conserved. However, in contrast to mammals, which produce sphingomyelin, organisms such as the pathogenic fungi and protozoa synthesize inositol phosphorylceramide (IPC) as the primary phosphosphingolipid. The key step involves the reaction of ceramide and phosphatidylinositol catalysed by IPC synthase, an essential enzyme with no mammalian equivalent, encoded by the AUR1 gene in yeast and by recently identified functional orthologues in the pathogenic kinetoplastid protozoa. As such, this enzyme represents a promising target for novel anti-fungal and anti-protozoal drugs. Given
Wireless sensor applications are subject to energy constraints, and most of the energy is consumed in communication between wireless nodes. Clustering and data aggregation are the two most widely used strategies for reducing energy usage and increasing the lifetime of wireless sensor networks. In target tracking applications, a large amount of redundant data is produced regularly. Hence, the deployment of effective data aggregation schemes is vital to eliminate data redundancy. This work conducts a comparative study of research approaches that employ clustering techniques for efficiently aggregating data in target tracking applications, as the selection of an appropriate clustering algorithm may yield positive results in the data aggregati
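The energy saving from combining clustering with aggregation can be sketched in a few lines: cluster heads summarise their members' readings before forwarding, so n member messages collapse into one per cluster. The cluster assignment below is given by hand; a real protocol (e.g. LEACH or one of the surveyed variants) would compute and rotate it dynamically:

```python
# Hedged sketch of in-network data aggregation at cluster heads.
# Node ids, readings, and the cluster assignment are illustrative.

def aggregate_at_heads(readings, clusters):
    """readings: node_id -> sensed value; clusters: head_id -> member ids.
    Each head forwards only the mean of its members' readings."""
    return {head: sum(readings[n] for n in members) / len(members)
            for head, members in clusters.items()}

readings = {1: 20.0, 2: 22.0, 3: 21.0, 4: 30.0, 5: 32.0}
clusters = {1: [1, 2, 3], 4: [4, 5]}
summary = aggregate_at_heads(readings, clusters)
# Only 2 messages reach the sink instead of 5.
```

Averaging is the simplest aggregation function; min, max, or a tracked target's estimated position would slot into the same structure.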
Big data of different types, such as text and images, are rapidly generated by the internet and other applications. Dealing with these data using traditional methods is impractical, since they come in various sizes and types and carry different processing-speed requirements. Data analytics has therefore become an important tool, because it analyses and extracts only meaningful information, which makes it essential for big data applications. This paper presents several innovative methods that use data analytics techniques to improve the analysis process and data management. Furthermore, this paper discusses how the revolution of data analytics based on artificial intelligence algorithms might provide
In this work, hybrid composite materials were prepared, consisting of a polymer matrix (polyethylene, PE) reinforced by different reinforcing materials (alumina powder, carbon black powder (CB), and silica powder). The hybrid composite materials prepared were:
• H1 = PE + Al2O3 + CB
• H2 = PE + CB + SiO2
• H3 = PE + Al2O3 + CB + SiO2
All samples for the electrical tests were prepared by injection moulding. Mechanical tests included compression at different temperatures and in different chemical solutions at different immersion times. The mechanical test results favoured the H3 samples, with an obvious weakness of the H1 samples and a decrease in these properties with a rise in temperature and the increasing
In recent years, Bitcoin has become the most widely used blockchain platform in business and finance. The goal of this work is to find a viable prediction model that incorporates, and perhaps improves on, a combination of available models. Among the techniques utilized in this paper are exponential smoothing, ARIMA, artificial neural network (ANN) models, and prediction combination models. The study's most obvious finding is that artificial intelligence models improve the results of compound prediction models. The second key finding is that a strong combination forecasting model, one that responds to the multiple fluctuations in the Bitcoin time series and improves the error, should be used. Based on the results, the prediction a
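Two of the named techniques, simple exponential smoothing and forecast combination, can be sketched briefly. The paper's actual models (ARIMA, ANNs) and its combination weights are not reproduced here; the smoothing constant and the toy price series are illustrative only:

```python
# Hedged sketch: simple exponential smoothing plus an equal-weight
# combination forecast. alpha and the series are illustrative values.

def exp_smooth_forecast(series, alpha=0.5):
    """One-step-ahead forecast via simple exponential smoothing:
    level_t = alpha * x_t + (1 - alpha) * level_{t-1}."""
    level = series[0]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level
    return level

def combine(forecasts):
    """Equal-weight combination of individual model forecasts."""
    return sum(forecasts) / len(forecasts)

prices = [100.0, 104.0, 102.0, 108.0]
f1 = exp_smooth_forecast(prices, alpha=0.5)   # 105.0
f2 = prices[-1]                               # naive last-value model
combined = combine([f1, f2])                  # 106.5
```

A combination scheme that weights each model by its recent error, rather than equally, is one common way to make the ensemble responsive to the fluctuations the abstract mentions.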