Storing, transferring, and processing high-dimensional electroencephalogram (EEG) signals is a critical challenge. The goal of EEG compression is to remove redundant data from EEG signals, while medical signals such as EEG must retain high quality for diagnosis. This paper uses a compression system with near-zero Mean Squared Error (MSE) based on the Discrete Cosine Transform (DCT) and double shift coding for fast and efficient EEG data compression. The paper investigates and compares the use or non-use of delta modulation, which is applied to the transformed and quantized input signal. As a final step, double shift coding is applied after mapping the output to positive values. The system performance is tested using EEG data files from the CHB-MIT Scalp EEG Database, and the Compression Ratio (CR) is used to evaluate the compression system performance. The results are encouraging when compared with previous works on the same data samples.
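A minimal sketch of such a pipeline is given below, assuming a uniform quantizer and a simple even/odd mapping to non-negative values; the double shift coding stage and the exact quantizer of the paper are not reproduced, and the names `q_step` and `use_delta` are illustrative only.

```python
import numpy as np
from scipy.fft import dct, idct

def compress_stage(signal, q_step=2.0, use_delta=True):
    """Transform, quantize, optionally delta-modulate, and map to non-negative ints."""
    coeffs = dct(signal, norm='ortho')                  # DCT decorrelates the EEG samples
    quantized = np.round(coeffs / q_step).astype(int)   # uniform quantization (lossy step)
    if use_delta:
        # delta modulation: encode differences between consecutive quantized values
        quantized = np.diff(quantized, prepend=0)
    # map signed values to non-negative integers before the entropy-coding stage
    return np.where(quantized >= 0, 2 * quantized, -2 * quantized - 1)

def reconstruct(mapped, q_step=2.0, use_delta=True):
    """Invert the mapping, the delta modulation, and the quantized DCT."""
    signed = np.where(mapped % 2 == 0, mapped // 2, -(mapped + 1) // 2)
    if use_delta:
        signed = np.cumsum(signed)
    return idct(signed * q_step, norm='ortho')

# toy usage: MSE here reflects only the quantization error of this sketch
x = np.random.randn(256)
mse = np.mean((x - reconstruct(compress_stage(x))) ** 2)
```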
Air pollution is one of the important problems facing Iraq. Air pollution is the result of uncontrolled emissions from factories, car exhaust, electric generators, and oil refineries, and often reaches unacceptable limits by international standards. These pollutants can greatly affect human health and regular population activities. For this reason, there is an urgent need for effective devices to monitor the molecular concentration of air pollutants in cities and urban areas. In this research, an optical system has been built consisting of a Helium-Neon laser (5 mW, 632.8 nm), a glass cell with a defined size, and a power meter (Gentec-E, model: UNO), in which scattering of the laser beam occurs due to air pollution. Two pollutants were examined
In this paper, we investigate some methods to solve one of the multi-criteria machine scheduling problems. The discussed problem involves the total completion time and the total earliness of jobs. To solve this problem, some heuristic methods are proposed which provided good results. The Branch and Bound (BAB) method is applied with newly suggested upper and lower bounds, which produced exact results for the discussed problem in a reasonable time.
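As an illustration only, assuming a single-machine setting with due dates, the two criteria could be written as below; the exact combination studied in the paper (simultaneous, hierarchical, or weighted) may differ.

```latex
% Illustrative single-machine formulation (assumed, not taken from the paper):
% C_j(\sigma) = completion time of job j under sequence \sigma, d_j = due date,
% E_j(\sigma) = \max\{0,\, d_j - C_j(\sigma)\} = earliness of job j.
\min_{\sigma}\;\Bigl(\sum_{j=1}^{n} C_j(\sigma),\; \sum_{j=1}^{n} E_j(\sigma)\Bigr)
```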
The key objective of the study is to understand the best processes that are currently used in managing talent in Australian higher education (AHE) and design a quantitative measurement of talent management processes (TMPs) for the higher education (HE) sector.
Three qualitative methods commonly used in empirical multi-method studies, namely brainstorming, focus group discussions, and semi-structured individual interviews, were considered. Twenty
This study examined some of the criteria used to determine the form of river basins and exposed the need to modify some of their limitations. In particular, the elongation and circularity (roundness) ratio criterion, whose coefficient is set in a range between 0 and 1, was generalized: this range goes beyond merely characterizing the basin as elongated or rounded, and the ratio has been modified to make it more detailed and accurate in assigning the basin a specific form rather than only a general characteristic. We thus reached a standard for each of the basin forms based on the results of the elongation and circularity ratios: circular is (1-0.8), square is (between 0.8-0.6), the blade or oval form is (0.6-0
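For reference, the conventional morphometric definitions behind these ratios (assuming the study follows the standard forms, with A the basin area, P its perimeter, and L_b its maximum length) are:

```latex
% Conventional definitions of the two shape ratios:
R_e = \frac{2\sqrt{A/\pi}}{L_b} \quad\text{(elongation ratio)}, \qquad
R_c = \frac{4\pi A}{P^2} \quad\text{(circularity ratio)}
```

Both ratios lie between 0 and 1, with values near 1 indicating a nearly circular basin.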
In this paper, various aspects of smart grids are described. These aspects include the components of smart grids, the detailed functions of the smart energy meters within the smart grids and their effects on increasing awareness, the advantages and disadvantages of smart grids, and the requirements for utilizing smart grids. To shed some light on the difference between smart grids and traditional utility grids, some aspects of the traditional utility grids are covered in this paper as well.
Multi-agent systems are a subfield of Artificial Intelligence that has experienced rapid growth because of its flexibility and intelligence in solving distributed problems. Multi-agent systems (MAS) have attracted interest from researchers in different disciplines for solving sophisticated problems by dividing them into smaller tasks. These tasks can be assigned to agents as autonomous entities with their own private databases, which act on their environment and perceive, process, retain, and recall by using multiple inputs. MAS can be defined as a network of individual agents that share knowledge and communicate with each other in order to solve a problem that is beyond the scope of a single agent. It is imperative to understand the chara
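A toy sketch of this idea is shown below; the `Agent` class, its fields, and the two-agent exchange are purely illustrative and are not taken from any particular MAS framework.

```python
from dataclasses import dataclass, field

@dataclass
class Agent:
    """Illustrative autonomous agent with a private knowledge base (names are hypothetical)."""
    name: str
    knowledge: dict = field(default_factory=dict)   # private database
    inbox: list = field(default_factory=list)

    def perceive(self, observation: dict) -> None:
        self.knowledge.update(observation)          # retain what was perceived

    def send(self, other: "Agent", message: dict) -> None:
        other.inbox.append((self.name, message))    # share knowledge with a peer

    def act(self) -> dict:
        # process incoming messages together with local knowledge, then act
        for _sender, message in self.inbox:
            self.knowledge.update(message)
        self.inbox.clear()
        return {"agent": self.name, "known_items": len(self.knowledge)}

# two agents cooperating on a task neither could solve alone
a, b = Agent("a"), Agent("b")
a.perceive({"sensor_1": 0.7})
a.send(b, {"sensor_1": 0.7})
print(b.act())
```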
We used to think of grammar as the bones of the language and vocabulary as the flesh to be added, given that language consists largely of life-generated chunks of lexis. This "skeleton image" has been proverbially used to refer to that central feature of lexis named collocation - an idea that for the first 15 years of language study and analysis gave a moment's thought to English classroom material and methodology.
The work of John Sinclair, Dave Willis, Ron Carter, Michael McCarthy, Michael Lewis, and many others has contributed to the way teachers today approach the area of lexis and what it means in the teaching/learning process of the language. This also seems to have incorporated lexical ideas into the teaching mechanis
Praise be to God, Lord of the worlds, and may the best prayers and most complete peace be upon our master Muhammad and upon all his family and companions:
And after:
Surat Al-Qariah is one of the Meccan surahs, located in the thirtieth part of the Noble Qur'an. The commentators differed on the number of its verses and on the reason for its naming. This surah speaks of the conditions of people on the Day of Resurrection and of how people are raised from their graves, all of which I discussed in the preamble.
The research is divided into six topics: the first topic dealt with strange (unfamiliar) terms, while the second topic dealt with the most important phonetic issues mentioned in the surah, namely (slurring, advertising, substitution, and silenc
Genetic Algorithms (GA) are a population-based approach. They belong to the class of metaheuristic procedures that use population characteristics to guide the search, maintaining and improving multiple solutions, which may produce a high-quality solution to an optimization problem. This study presents a comprehensive survey of the GA. We provide and discuss genetic algorithms for new researchers, illustrate which components build up a GA, and review the main results on time complexity.
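The basic loop of selection, crossover, and mutation can be sketched as follows; the toy objective, the blend crossover, and the parameter values are illustrative choices, not ones prescribed by the survey.

```python
import random

# Minimal GA sketch: maximize f(x) = -(x - 3)^2 over real-valued x using
# binary tournament selection, blend crossover, and Gaussian mutation.
def fitness(x: float) -> float:
    return -(x - 3.0) ** 2

def tournament(pop):
    a, b = random.sample(pop, 2)
    return a if fitness(a) > fitness(b) else b       # keep the fitter of two candidates

def crossover(p1, p2):
    alpha = random.random()
    return alpha * p1 + (1 - alpha) * p2              # blend crossover for a real-valued gene

def mutate(x, rate=0.2, sigma=0.5):
    return x + random.gauss(0, sigma) if random.random() < rate else x

def genetic_algorithm(pop_size=30, generations=50):
    population = [random.uniform(-10, 10) for _ in range(pop_size)]
    for _ in range(generations):
        population = [mutate(crossover(tournament(population), tournament(population)))
                      for _ in range(pop_size)]
    return max(population, key=fitness)

print(genetic_algorithm())   # should approach x = 3
```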
A new distribution, the Epsilon Skew Gamma (ESΓ) distribution, which was first introduced by Abdulah [1], is applied to near-Gamma data. We first redefine the ESΓ distribution, its properties, and characteristics, and then we estimate its parameters using the maximum likelihood and moment estimators. Finally, we use these estimators to fit the data with the ESΓ distribution.
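A generic numerical maximum-likelihood workflow for such a fit is sketched below; the placeholder density is a standard Gamma, not the ESΓ density of Abdulah [1], which would have to be substituted in `log_pdf`, and the synthetic sample stands in for the "near Gamma" data.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import gamma

def log_pdf(x, params):
    # Placeholder density: standard Gamma log-pdf; replace with the ESΓ log-density.
    shape, scale = params
    return gamma.logpdf(x, a=shape, scale=scale)

def negative_log_likelihood(params, data):
    if min(params) <= 0:                      # keep the optimizer inside the valid region
        return np.inf
    return -np.sum(log_pdf(data, params))

# synthetic "near Gamma" sample, then numerical MLE via Nelder-Mead
data = gamma.rvs(a=2.0, scale=1.5, size=500, random_state=0)
result = minimize(negative_log_likelihood, x0=[1.0, 1.0], args=(data,), method='Nelder-Mead')
print(result.x)   # maximum likelihood estimates of (shape, scale)
```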