Biosignal analysis is one of the most important topics that researchers have worked to develop over the last century in order to understand numerous human diseases. The electroencephalogram (EEG) is one technique that provides an electrical representation of biosignals reflecting changes in the activity of the human brain. Monitoring the level of anesthesia is a particularly important application, proposed to avoid both patient awareness caused by inadequate dosing of anesthetic drugs and the excessive use of anesthetic during surgery. This article reviews the bases of these techniques and their development over the last few decades, and provides a synopsis of the relevant methodologies and algorithms used to analyze EEG signals. In addition, it presents some of the physiological background of the EEG signal, developments in EEG signal processing, and effective methods for removing various types of noise. This review will hopefully encourage efforts to develop methods that use EEG signals for determining and classifying the depth of anesthesia at a high data rate, so as to produce a flexible and reliable detection device.
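Since the review focuses on EEG pre-processing and noise removal, the following minimal sketch illustrates one common first step: band-pass filtering plus mains-notch filtering of a raw EEG segment with SciPy. The sampling rate, band edges, notch frequency, and the synthetic signal are illustrative assumptions, not values taken from the review.

```python
# Minimal sketch: band-pass and mains-notch filtering of a raw EEG trace,
# a common pre-processing step before depth-of-anesthesia features are
# computed. All numeric settings and the synthetic signal are assumptions.
import numpy as np
from scipy.signal import butter, iirnotch, filtfilt

fs = 250.0                      # assumed sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
eeg = np.random.randn(t.size)   # stand-in for a recorded EEG segment

# Band-pass 0.5-47 Hz to keep the clinically relevant EEG bands
b_bp, a_bp = butter(4, [0.5, 47.0], btype="bandpass", fs=fs)
filtered = filtfilt(b_bp, a_bp, eeg)

# Notch out 50 Hz mains interference (use 60 Hz where applicable)
b_n, a_n = iirnotch(w0=50.0, Q=30.0, fs=fs)
filtered = filtfilt(b_n, a_n, filtered)
```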
This research was conducted to analyze the performance of a fiber-optic sensor used to measure and sense the intensity of an electric field. The results showed that using a longer sensor gives a higher response to the electric field strength and a higher sensor sensitivity, but at the expense of reducing the intensity of the electric field that is detected.
Researchers often equate database accounting models in general, and the Resources-Events-Agents (REA) accounting model in particular, with events accounting as proposed by Sorter (1969). In fact, REA accounting, database accounting, and events accounting are very different. Because REA accounting has become a popular topic in AIS research, it is important to agree on exactly what is meant by certain ideas, both in concept and in historical origin. This article clarifies the analytical framework of the REA accounting model and highlights the differences between the terms events accounting, database accounting, semantically modeled accounting, and REA accounting. It als
Designing machines and equipment for post-harvest operations on agricultural products requires information about their physical properties. The aim of this work was to evaluate a new approach to predicting the moisture content of bean and corn seeds from their dimensions, measured by image analysis, using artificial neural networks (ANNs). Experimental tests were carried out at three levels of wet-basis seed moisture content: 9, 13 and 17%. The analysis of the results showed a direct relationship between the wet-basis moisture content and the main dimensions of the seeds. Based on the statistical analysis of the seed material, it was shown that the characteristics
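As a rough illustration of the modelling step described above, the sketch below trains a small feed-forward ANN to map seed dimensions to moisture content with scikit-learn. The feature set, network size, and the synthetic data are assumptions for illustration, not the paper's measurements.

```python
# Minimal sketch: predict seed moisture content (% wet basis) from measured
# dimensions with a small feed-forward ANN. Synthetic, illustrative data only.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
# Columns: length, width, thickness (mm) as if measured from images
X = rng.uniform([8, 5, 3], [16, 9, 7], size=(300, 3))
y = 9 + 0.5 * X[:, 0] + rng.normal(0, 0.5, 300)   # stand-in moisture values

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(10, 5),
                                   max_iter=5000, random_state=0))
model.fit(X_tr, y_tr)
print("R^2 on held-out seeds:", round(model.score(X_te, y_te), 3))
```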
News headlines are key elements in spreading news. They are unique texts written in a special language that enables readers to understand the overall nature and importance of the topic. However, this special language can make headlines difficult for readers to understand. To address this difficulty, it is argued that a pragmatic analysis from a speech act theory perspective is a plausible tool for headline analysis. The main objective of the study is to pragmatically analyze the most frequently employed types of speech acts in the news headlines covering COVID-19 on the Aljazeera English website. To this end, Bach and Harnish's (1979) Taxonomy of Speech Acts has been adopted to analyze the data. Thirty headlines have been collected f
In this work, local sunflower husk (SFH) was used as a natural surface for removing Basic Green-4 (BG4) dye, a water-soluble pollutant. The effects of initial dye concentration, contact time, the mass of the SFH surface, and the medium temperature were studied. Applying the Langmuir and Freundlich isotherms to the collected adsorption data showed that the data conform to the Freundlich equation better than to the Langmuir equation. The adsorbed mass of BG4 dye increased directly with increasing SFH mass, and equilibrium was achieved within a 60 min window. The interaction of BG4 with the SFH surface was spontaneous and exothermic. The empirical kinetic results at ambient temperature were applied to pseudo 1st a
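For readers who want to reproduce the isotherm comparison described above, the following sketch fits both the Langmuir and Freundlich models to equilibrium adsorption data (q_e versus C_e) with SciPy and compares their R² values. The data points and initial parameter guesses are placeholders, not the study's measurements.

```python
# Minimal sketch: fit Langmuir and Freundlich isotherms to equilibrium
# adsorption data and compare goodness of fit. Placeholder data only.
import numpy as np
from scipy.optimize import curve_fit

Ce = np.array([2.0, 5.0, 10.0, 20.0, 40.0])      # equilibrium conc. (mg/L), assumed
qe = np.array([6.1, 10.3, 15.2, 21.8, 30.5])     # adsorbed amount (mg/g), assumed

def langmuir(Ce, qmax, KL):
    return qmax * KL * Ce / (1.0 + KL * Ce)

def freundlich(Ce, KF, n):
    return KF * Ce ** (1.0 / n)

for name, f, p0 in [("Langmuir", langmuir, (40.0, 0.1)),
                    ("Freundlich", freundlich, (5.0, 2.0))]:
    popt, _ = curve_fit(f, Ce, qe, p0=p0)
    ss_res = np.sum((qe - f(Ce, *popt)) ** 2)
    ss_tot = np.sum((qe - qe.mean()) ** 2)
    print(name, "params:", np.round(popt, 3), "R^2:", round(1 - ss_res / ss_tot, 4))
```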
Survival analysis is widely applied to data described by the length of time until the occurrence of an event of interest, such as death or other important events. The purpose of this paper is to use the dynamic methodology, which provides a flexible method especially for the analysis of discrete survival time, to estimate the effect of covariate variables over time in the survival analysis of dialysis patients with kidney failure until death occurs. The estimation process is entirely based on the Bayesian approach, using two estimation methods: maximum a posteriori (MAP) estimation involving Iteratively Weighted Kalman Filter Smoothing (IWKFS), in combination with the Expectation-Maximization (EM) algorithm. While the other
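To make the discrete survival-time setting concrete, the sketch below fits a simple person-period logistic model for the per-interval hazard of death. This is only a minimal stand-in for the setting, not the authors' MAP/IWKFS/EM estimator, and the patient data are synthetic.

```python
# Not the paper's estimator: a minimal person-period logistic model for the
# discrete-time hazard of death on dialysis. Synthetic patients only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
rows = []
for _ in range(200):                       # 200 hypothetical patients
    age = rng.uniform(30, 80)
    for t in range(1, 25):                 # monthly intervals, up to 24
        hazard = 1 / (1 + np.exp(-(-5.0 + 0.04 * age + 0.05 * t)))
        died = rng.random() < hazard
        rows.append((t, age, int(died)))
        if died:
            break                          # stop follow-up at the event

data = np.array(rows)
X, y = data[:, :2], data[:, 2]             # covariates: interval index, age
model = LogisticRegression(max_iter=1000).fit(X, y)
print("per-interval hazard coefficients:", np.round(model.coef_, 3))
```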
Rock type identification is a very important task in reservoir characterization for constructing robust reservoir models. Several approaches have been introduced to define rock types in reservoirs, and each approach should relate geological and petrophysical properties such that each rock type corresponds to a unique hydraulic flow unit. A hydraulic flow unit is a reservoir zone that has laterally and vertically similar flow and bedding characteristics. Given the effect of rock type on reservoir performance, many empirical and statistical approaches have been introduced. In this paper, a cluster analysis technique is used to identify the rock groups in the tertiary reservoir of the Khabaz oil field by analyzing the variation o
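As a generic illustration of clustering core data into hydraulic flow units, the sketch below groups synthetic porosity-permeability samples by their flow zone indicator (FZI) with K-means. The feature choice, number of clusters, and data are assumptions; they do not reproduce the Khabaz field workflow.

```python
# Minimal sketch: group core samples into hydraulic flow units by clustering
# the flow zone indicator (FZI). Synthetic porosity/permeability data only.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
phi = rng.uniform(0.08, 0.30, 150)                 # porosity (fraction)
k = 10 ** rng.normal(1.5, 1.0, 150)                # permeability (mD)

rqi = 0.0314 * np.sqrt(k / phi)                    # reservoir quality index
phi_z = phi / (1.0 - phi)                          # normalized porosity
fzi = rqi / phi_z                                  # flow zone indicator

labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(
    np.log10(fzi).reshape(-1, 1))
for g in range(4):
    print(f"rock type {g}: {np.sum(labels == g)} samples, "
          f"mean FZI = {fzi[labels == g].mean():.2f}")
```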
The wavelet shrinkage estimator is an attractive technique for estimating nonparametric regression functions, but it is very sensitive to correlation in the errors. In this research, a low-degree polynomial model was used to address the boundary problem in wavelet shrinkage, in addition to using flexible threshold values in the case of correlated errors, since these treat the coefficients at each level separately, unlike global threshold values that deal with all levels simultaneously, such as the VisuShrink, False Discovery Rate (FDR), Improvement Thresholding, and SureShrink methods. The study was conducted on real monthly data representing the rates of theft crimes f
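The sketch below shows level-dependent soft thresholding of wavelet detail coefficients with PyWavelets, i.e., a separate threshold estimated at each decomposition level, which is the kind of flexible thresholding contrasted above with a single global threshold. The wavelet, decomposition depth, and the noisy series are assumptions, not the monthly crime data.

```python
# Minimal sketch: level-dependent soft thresholding with PyWavelets.
# Each decomposition level gets its own threshold. Synthetic series only.
import numpy as np
import pywt

rng = np.random.default_rng(3)
t = np.linspace(0, 1, 256)
signal = np.sin(2 * np.pi * 4 * t)
noisy = signal + 0.3 * rng.standard_normal(t.size)

coeffs = pywt.wavedec(noisy, "db4", level=4)
denoised = [coeffs[0]]                              # keep approximation coefficients
for detail in coeffs[1:]:
    sigma = np.median(np.abs(detail)) / 0.6745      # noise scale estimated per level
    thresh = sigma * np.sqrt(2 * np.log(detail.size))
    denoised.append(pywt.threshold(detail, thresh, mode="soft"))

estimate = pywt.waverec(denoised, "db4")            # reconstructed regression estimate
```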
