This work implements an Electroencephalogram (EEG) signal classifier. The implemented method uses Orthogonal Polynomials (OP) to convert the EEG signal samples into moments. A Sparse Filter (SF) reduces the number of converted moments to increase the classification accuracy, and a Support Vector Machine (SVM) classifies the reduced moments into two classes. The proposed method's performance is tested and compared with two other methods on two datasets. Each dataset is divided into 80% for training and 20% for testing, with 5-fold cross-validation. The results show that the proposed method achieves higher accuracy than the other methods, with best accuracies of 95.6% and 99.5% on the two datasets, respectively. Finally, the results indicate that the number of moments selected by the SF should exceed 30% of the overall EEG samples for the accuracy to exceed 90%.
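As a rough illustration of the pipeline described above, the following Python sketch chains a feature-reduction step and an SVM with an 80/20 split and 5-fold cross-validation. The orthogonal-polynomial moment transform and the paper's sparse filter are not reproduced here; SelectKBest is only a generic stand-in for the moment-selection step, and X, y are placeholder arrays.

```python
# Minimal sketch of the described pipeline (assumed names and stand-ins):
# SelectKBest acts only as a generic stand-in for the sparse-filter step,
# and the EEG arrays X, y are random placeholders.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import train_test_split, cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 256))      # placeholder moment matrix
y = rng.integers(0, 2, size=200)         # placeholder binary labels

# 80% training / 20% testing split, as in the paper
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

# keep more than 30% of the moments, following the paper's observation
k = int(0.3 * X.shape[1]) + 1
clf = Pipeline([
    ("reduce", SelectKBest(f_classif, k=k)),  # stand-in for the sparse filter
    ("svm", SVC(kernel="rbf")),
])

# 5-fold cross-validation on the training portion, then held-out test accuracy
cv_acc = cross_val_score(clf, X_tr, y_tr, cv=5).mean()
test_acc = clf.fit(X_tr, y_tr).score(X_te, y_te)
print(f"CV accuracy: {cv_acc:.3f}, test accuracy: {test_acc:.3f}")
```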
Orthogonal polynomials and their moments play a significant role in image processing and computer vision. One such family is the discrete Hahn polynomials (DHaPs), which are used for compression and feature extraction; however, they suffer from numerical instability when the moment order becomes high. This paper proposes a fast approach for computing high-order DHaPs. The work uses multithreading to calculate the Hahn polynomial coefficients: independent calculations are divided among threads to exploit the available processing capabilities, and a distribution method is provided to achieve a more balanced processing burden among the threads. The proposed methods are tested for va…
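The distribution idea can be sketched as follows: the rows of the polynomial matrix are independent, so they can be assigned to worker threads, and interleaving the orders round-robin is one plausible way to balance the load, since higher orders are more expensive. The compute_row function below is a placeholder, not the paper's DHaP recurrence, and the round-robin scheme is an assumption rather than the paper's exact distribution method.

```python
# Sketch of the thread-distribution idea (assumed scheme): orders are
# interleaved round-robin across workers so that expensive high orders are
# not all assigned to the same thread. compute_row is a placeholder basis,
# not the actual DHaP recurrence from the paper.
from concurrent.futures import ThreadPoolExecutor
import numpy as np

N, MAX_ORDER, WORKERS = 512, 128, 4

def compute_row(n: int) -> np.ndarray:
    # placeholder for the order-n polynomial values over x = 0..N-1
    x = np.arange(N)
    return np.cos(np.pi * n * (x + 0.5) / N)   # stand-in basis, not DHaP

def worker(orders):
    # each worker computes its interleaved share of the orders
    return {n: compute_row(n) for n in orders}

# round-robin split: worker i gets orders i, i+WORKERS, i+2*WORKERS, ...
shares = [range(i, MAX_ORDER, WORKERS) for i in range(WORKERS)]
with ThreadPoolExecutor(max_workers=WORKERS) as pool:
    results = pool.map(worker, shares)

rows = {}
for part in results:
    rows.update(part)
basis = np.vstack([rows[n] for n in range(MAX_ORDER)])
print(basis.shape)   # (MAX_ORDER, N)
```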
…Segmented Optical Telescope (NGST) with a hexagonally segmented spherical primary mirror can provide a 3 arc-minute field of view. Extremely Large Telescopes (ELTs) of the 100 m class would have such unprecedented scientific effectiveness that their construction would constitute a milestone comparable to the invention of the telescope itself and would provide a truly revolutionary insight into the universe. Our interest is the scientific case and the conceptual feasibility of giant filled-aperture telescopes, and we investigate what these requirements imply for the possible technical options in the case of a 100 m telescope. For this telescope, a point of considerable interest is the correction of the optical aberrations of the incoming wavefront, th…
…order to increase the level of security, as this system encrypts the secret image (using the Blowfish method) before sending it through the internet to the recipient. The Blowfish method is known for its strong security; nevertheless, its encryption time is long. In this research we apply a smoothing filter to the secret image, which reduces its size and consequently decreases the encryption and decryption times. After encryption, the secret image is hidden inside another image, called the cover image, using one of two methods: "Two-LSB" or "Hiding most bits in blue pixels". Finally, we compare the results of the two methods to determine which one is better according to the PSNR measure.
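The Two-LSB embedding mentioned above can be illustrated with a small sketch that writes two secret bits into the two least significant bits of each cover byte. The Blowfish encryption and smoothing steps are omitted, and the exact bit layout used in the paper may differ; this is only an assumed arrangement.

```python
# Illustrative sketch of the "Two-LSB" embedding idea (assumed layout):
# two secret bits go into the two least significant bits of each cover byte.
# The Blowfish and smoothing steps from the abstract are omitted;
# secret_bits is assumed to be the (already encrypted) payload.
import numpy as np

def embed_two_lsb(cover: np.ndarray, secret_bits: np.ndarray) -> np.ndarray:
    """Hide a bit stream in the two LSBs of a flattened cover image."""
    flat = cover.flatten().astype(np.uint8)
    pairs = secret_bits.reshape(-1, 2)            # two bits per cover byte
    if len(pairs) > len(flat):
        raise ValueError("secret too large for this cover image")
    values = pairs[:, 0] * 2 + pairs[:, 1]        # combine bits -> 0..3
    flat[: len(values)] = (flat[: len(values)] & 0b11111100) | values
    return flat.reshape(cover.shape)

def extract_two_lsb(stego: np.ndarray, n_bits: int) -> np.ndarray:
    values = stego.flatten()[: n_bits // 2] & 0b11
    return np.stack([(values >> 1) & 1, values & 1], axis=1).ravel()

cover = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)
secret_bits = np.random.randint(0, 2, size=1024, dtype=np.uint8)
stego = embed_two_lsb(cover, secret_bits)
assert np.array_equal(extract_two_lsb(stego, 1024), secret_bits)
```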
The purpose of this work is to study the classification and construction of (k,3)-arcs in the projective plane PG(2,7). We found that there are two (5,3)-arcs, four (6,3)-arcs, six (7,3)-arcs, six (8,3)-arcs, seven (9,3)-arcs, six (10,3)-arcs, and six (11,3)-arcs; all of these arcs are incomplete. The number of distinct (12,3)-arcs is six, two of which are complete. There are four distinct (13,3)-arcs, two of which are complete, and one (14,3)-arc, which is incomplete. There exists one complete (15,3)-arc.
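For reference, the terminology used above can be stated in its standard form (these are the usual definitions, not text quoted from the paper):

```latex
% Standard definitions (not quoted from the paper itself).
A $(k,3)$-arc in $\mathrm{PG}(2,7)$ is a set $K$ of $k$ points such that
\[
  |\ell \cap K| \le 3 \quad \text{for every line } \ell, \qquad
  |\ell_0 \cap K| = 3 \quad \text{for some line } \ell_0 .
\]
The arc $K$ is \emph{complete} if it is not contained in any $(k+1,3)$-arc,
i.e.\ no further point of $\mathrm{PG}(2,7)$ can be added without some line
meeting the enlarged set in more than three points.
```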
Several recent approaches have focused on developing traditional costing systems to meet new environmental requirements, including Attributes Based Costing (ABCII). This accounting method measures costs according to the attributes on the basis of which the product is designed and according to the achievement levels of each of the product's attributes. This research presents the knowledge foundations of this approach and its role in market orientation compared with Activity Based Costing, together with the steps to be followed to apply the approach. The research problem lies in the attempt to reach the most accurate approach for measuring the cost of products from th…
Mobile-based human emotion recognition is a very challenging subject. Most of the approaches suggested and built in this field utilize various contexts derived from external sensors and the smartphone, but these approaches suffer from various obstacles and challenges. The proposed system integrates the human speech signal and the heart rate into one system to improve the accuracy of human emotion recognition. The system is designed to recognize four human emotions: angry, happy, sad, and normal. In this system, the smartphone records the user's speech and sends it to a server. The smartwatch, fixed on the user's wrist, measures the user's heart rate while the user is speaking and sends it, via Bluetooth, …
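A hypothetical client-side sketch of the flow described above: the recorded speech clip and the heart-rate reading are posted to a server that returns one of the four emotion labels. The endpoint URL, field names, and response format are assumptions, not details from the original system.

```python
# Hypothetical client-side sketch: the speech recording and the smartwatch
# heart-rate reading are sent to a server that replies with an emotion label.
# The URL, field names, and response format are assumptions.
import requests

SERVER_URL = "http://example.com/emotion"      # hypothetical endpoint

def classify_emotion(speech_path: str, heart_rate_bpm: int) -> str:
    with open(speech_path, "rb") as audio:
        response = requests.post(
            SERVER_URL,
            files={"speech": audio},
            data={"heart_rate": heart_rate_bpm},
            timeout=10,
        )
    response.raise_for_status()
    # server is assumed to reply with one of: angry, happy, sad, normal
    return response.json()["emotion"]

# Example call (paths and values are illustrative):
# print(classify_emotion("utterance.wav", 92))
```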
A new blind restoration algorithm is presented and shows high-quality restoration. This is done by enforcing a Wiener filtering approach in the Fourier domains of the image and the point spread function (PSF).
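The Fourier-domain Wiener step can be sketched as follows. This is only a minimal non-blind version: the PSF is assumed known and the noise-to-signal ratio is a hand-picked constant, whereas the paper's blind PSF estimation is not reproduced here.

```python
# Minimal Wiener-deconvolution sketch in the Fourier domain (illustrative only;
# the blind PSF-estimation step is not reproduced, the PSF is assumed known,
# and the noise-to-signal ratio `nsr` is a hand-picked constant).
import numpy as np

def wiener_deconvolve(blurred: np.ndarray, psf: np.ndarray, nsr: float = 0.01) -> np.ndarray:
    """Restore an image from a known PSF using the classical Wiener filter."""
    H = np.fft.fft2(psf, s=blurred.shape)          # PSF transfer function
    G = np.fft.fft2(blurred)                       # blurred image spectrum
    W = np.conj(H) / (np.abs(H) ** 2 + nsr)        # Wiener filter
    return np.real(np.fft.ifft2(W * G))

# Illustrative usage with a small box blur as the assumed PSF:
image = np.random.rand(128, 128)
psf = np.ones((5, 5)) / 25.0
blurred = np.real(np.fft.ifft2(np.fft.fft2(image) * np.fft.fft2(psf, s=image.shape)))
restored = wiener_deconvolve(blurred, psf)
```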
The Enhanced Thematic Mapper Plus (ETM+) carried onboard the Landsat-7 satellite was launched on 15 April 1999. Four years later, the images collected by this sensor were severely affected by the failure of the system's Scan Line Corrector (SLC), a radiometric error. The median filter is one of the basic building blocks in many image processing situations. Digital images are often corrupted by impulse noise due to errors generated in the image sensor, errors that occur during analog-to-digital conversion, and errors introduced in communication channels. This noise inevitably changes the intensity of some pixels while leaving others unchanged. To remove impulse noise and improve the quality of the…
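A minimal median-filtering sketch for impulse noise is given below, using SciPy's generic median filter; the specific SLC gap-filling and filtering variant discussed in the paper is not reproduced, and the noise fraction is arbitrary.

```python
# Simple median-filter sketch for impulse (salt-and-pepper) noise removal.
# Illustrative only: the paper's specific filtering variant is not reproduced.
import numpy as np
from scipy.ndimage import median_filter

rng = np.random.default_rng(0)
image = rng.random((128, 128))

# add impulse noise: a fraction of pixels forced to 0 or 1
noisy = image.copy()
mask = rng.random(image.shape) < 0.05
noisy[mask] = rng.integers(0, 2, size=mask.sum()).astype(float)

# a 3x3 median filter replaces each pixel with the median of its neighbourhood,
# discarding isolated extreme values while leaving clean pixels largely intact
denoised = median_filter(noisy, size=3)
print(float(np.abs(denoised - image).mean()) < float(np.abs(noisy - image).mean()))
```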