Heart disease is a significant health condition that ranks as the leading cause of death in many countries. Clinical datasets are available to aid physicians in diagnosing cardiovascular disease. However, with the rise of big data and large medical datasets, accurately predicting heart disease has become increasingly challenging because of the abundance of irrelevant and redundant features, which increase computational complexity and reduce accuracy. This study therefore aims to identify the most discriminative features within high-dimensional datasets while minimizing complexity and improving accuracy through an Extra Tree-based feature selection technique. The study assesses the efficacy of several classification algorithms on four reputable datasets, using both the full feature set and the reduced feature subset selected by the proposed method. The results show that the feature selection technique achieves outstanding classification accuracy, precision, and recall, reaching 97% accuracy when combined with the Extra Tree classifier. The research demonstrates the potential of the feature selection method to improve classifier accuracy by focusing on the most informative features while simultaneously decreasing the computational burden.
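As a rough illustration of how such an Extra Tree-based selection stage might look in practice, the sketch below uses scikit-learn's ExtraTreesClassifier together with SelectFromModel on synthetic data; the dataset, the median importance threshold, and the estimator settings are illustrative assumptions rather than the paper's actual pipeline.

```python
# Minimal sketch (not the authors' exact pipeline): Extra Trees-based feature
# selection followed by re-training on the reduced feature subset.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.feature_selection import SelectFromModel
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic data standing in for a clinical dataset with redundant features.
X, y = make_classification(n_samples=1000, n_features=30, n_informative=8,
                           n_redundant=12, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Rank features by Extra Trees importance and keep the most discriminative ones.
selector = SelectFromModel(ExtraTreesClassifier(n_estimators=200, random_state=0),
                           threshold="median")
X_tr_sel = selector.fit_transform(X_tr, y_tr)
X_te_sel = selector.transform(X_te)

# Classify on the full feature set vs. the reduced subset.
full = ExtraTreesClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
reduced = ExtraTreesClassifier(n_estimators=200, random_state=0).fit(X_tr_sel, y_tr)
print("full-feature accuracy:   ", accuracy_score(y_te, full.predict(X_te)))
print("reduced-feature accuracy:", accuracy_score(y_te, reduced.predict(X_te_sel)))
```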
Key generation for data cryptography is vital to wireless communications security. The key must be generated randomly so that it cannot be regenerated by any third party other than the intended receiver. The random nature of the wireless channel is exploited to generate the encryption key. However, the randomness of wireless channels deteriorates over time due to channel aging, which causes security threats, particularly for spatially correlated channels. In this paper, the effect of channel aging on ciphering key generation is addressed. A method is developed to randomize the encryption key at each coherence time, which decreases the correlation between keys generated at consecutive coherence times. When compared to the
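The toy sketch below illustrates the underlying issue under simple assumptions (it is not the paper's scheme): channel aging is modeled as a first-order Gauss-Markov process across coherence times, and keys are obtained by one-bit quantization of the channel gains, so the residual correlation between consecutive keys can be observed directly.

```python
# Illustrative sketch (assumed model, not the paper's method): keys derived by
# quantizing channel gains, with aging as a first-order Gauss-Markov process.
import numpy as np

rng = np.random.default_rng(0)
n_taps, rho = 128, 0.9          # key length in bits; assumed aging correlation

h_prev = rng.standard_normal(n_taps)                                       # time t
h_next = rho * h_prev + np.sqrt(1 - rho**2) * rng.standard_normal(n_taps)  # time t+1

def channel_to_key(h):
    """One-bit quantization: gain above its median -> 1, else 0."""
    return (h > np.median(h)).astype(np.uint8)

k1, k2 = channel_to_key(h_prev), channel_to_key(h_next)
print("fraction of identical key bits across coherence times:", np.mean(k1 == k2))
```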
Image quality plays a vital role in improving and assessing image compression performance. Image compression represents large image data as a new image with a smaller size suitable for storage and transmission. This paper aims to evaluate the implementation of hybrid techniques based on the tensor product mixed transform. Compression and quality metrics such as compression ratio (CR), rate distortion (RD), peak signal-to-noise ratio (PSNR), and structural content (SC) are used to evaluate the hybrid techniques. The techniques are then compared according to these metrics to determine the best one. The main contribution is to improve the hybrid techniques. The proposed hybrid techniques consist of discrete wavelet
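For reference, the metrics named above can be computed as in the sketch below, using their common textbook definitions on a toy image; the hybrid transform itself is not reproduced here.

```python
# Hedged sketch: common definitions of CR, PSNR, and structural content (SC).
import numpy as np

def compression_ratio(original_bytes: int, compressed_bytes: int) -> float:
    return original_bytes / compressed_bytes

def psnr(original: np.ndarray, reconstructed: np.ndarray, peak: float = 255.0) -> float:
    mse = np.mean((original.astype(np.float64) - reconstructed.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

def structural_content(original: np.ndarray, reconstructed: np.ndarray) -> float:
    return np.sum(original.astype(np.float64) ** 2) / np.sum(reconstructed.astype(np.float64) ** 2)

# Toy example: an 8-bit "image" and a slightly distorted reconstruction.
img = np.random.default_rng(1).integers(0, 256, size=(64, 64))
rec = np.clip(img + np.random.default_rng(2).integers(-3, 4, size=img.shape), 0, 255)
print(psnr(img, rec), structural_content(img, rec), compression_ratio(4096, 1024))
```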
With the rapid development of computer and network technologies, the security of information on the internet is increasingly compromised, and many threats may affect the integrity of such information. Much research has focused on providing solutions to these threats. Machine learning and data mining are widely used in anomaly-detection schemes to decide whether or not malicious activity is taking place on a network. In this paper, a hierarchical classification scheme for an anomaly-based intrusion detection system is proposed. Two levels of feature selection and classification are used. At the first level, a global feature vector is selected for detecting the basic attack categories (DoS, U2R, R2L, and Probe). At the second level, four local feature vectors
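A schematic of such a two-level arrangement, assuming synthetic data and generic scikit-learn components rather than the paper's actual feature vectors and dataset, might look like this:

```python
# Schematic sketch: level 1 separates broad attack categories with a global
# feature vector; level 2 refines each category with its own local features.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, f_classif

X, y_coarse = make_classification(n_samples=2000, n_features=40, n_informative=10,
                                  n_classes=4, n_clusters_per_class=1, random_state=0)

# Level 1: global feature vector -> basic attack category.
global_sel = SelectKBest(f_classif, k=15).fit(X, y_coarse)
level1 = RandomForestClassifier(random_state=0).fit(global_sel.transform(X), y_coarse)

# Level 2: one local feature vector and classifier per category.
level2 = {}
for c in np.unique(y_coarse):
    mask = y_coarse == c
    sub_labels = (X[mask, 1] > np.median(X[mask, 1])).astype(int)  # toy sub-classes
    sel = SelectKBest(f_classif, k=10).fit(X[mask], sub_labels)
    clf = RandomForestClassifier(random_state=0).fit(sel.transform(X[mask]), sub_labels)
    level2[c] = (sel, clf)

def predict(x_row):
    c = level1.predict(global_sel.transform(x_row.reshape(1, -1)))[0]
    sel, clf = level2[c]
    return c, clf.predict(sel.transform(x_row.reshape(1, -1)))[0]

print(predict(X[0]))
```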
The researchers of the present study conducted a genre analysis of two political debates between American presidential nominees in the 2016 and 2020 elections. The study seeks to analyze the cognitive construction of political debates in order to evaluate the typical moves and strategies politicians use to express their communicative intentions and to reveal the linguistic manifestations of those moves and strategies. To achieve the study's aims, the researchers adopt Bhatia's (1993) framework of cognitive construction supported by van Eemeren's (2010) pragma-dialectic framework. The study demonstrates that both presidents adhere to this genre structuring to further their political agendas. For a positive and promising image
In modern technology, ownership of electronic data is key to securing privacy and protecting identity from any trace or interference. A new identity management approach called Digital Identity Management, implemented in recent years, acts as a holder of identity data in order to maintain the holder's privacy and prevent identity theft. An overwhelming number of users face two major problems: users who own their data but must let third-party applications handle it, and users who have no ownership of their data at all. Maintaining these identities is a challenge. This paper proposes a system that addresses this problem using blockchain technology for Digital Identity Management systems. Blockchain is a powerful technique
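As a purely conceptual illustration of why a hash-linked chain makes tampering with stored identity records detectable (this is not the proposed system; the field names and hashing choices are hypothetical):

```python
# Toy hash-linked chain of identity records; tampering breaks the link hashes.
import hashlib, json, time

def block_hash(block: dict) -> str:
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

chain = [{"index": 0, "timestamp": 0, "record": "genesis", "prev_hash": "0" * 64}]

def add_identity_record(record: dict) -> None:
    prev = chain[-1]
    chain.append({"index": prev["index"] + 1, "timestamp": time.time(),
                  "record": record, "prev_hash": block_hash(prev)})

def chain_is_valid() -> bool:
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

add_identity_record({"user": "alice", "attrs_hash": hashlib.sha256(b"alice-attrs").hexdigest()})
add_identity_record({"user": "bob", "attrs_hash": hashlib.sha256(b"bob-attrs").hexdigest()})
print(chain_is_valid())                    # True
chain[1]["record"]["user"] = "mallory"     # tamper with a stored identity record
print(chain_is_valid())                    # False: the link from the next block breaks
```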
Copula modeling is widely used in modern statistics. The boundary bias problem is one of the problems faced in nonparametric estimation, where kernel estimators are the most common choice. In this paper, the copula density function is estimated using the probit-transformation nonparametric method in order to avoid the boundary bias problem from which kernel estimators suffer. A simulation study of three nonparametric methods for estimating the copula density function was carried out, and a new method was proposed that outperformed the others across five types of copulas, different sample sizes, different levels of correlation between the copula variables, and different parameters of the copula function.
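A minimal sketch of the probit-transformation idea follows, assuming a toy Gaussian-copula sample and scipy's Gaussian kernel estimator rather than the estimators compared in the paper: pseudo-observations on the unit square are mapped to the real plane with the standard normal quantile, an ordinary kernel estimator is applied there (no boundaries), and the density is mapped back with the Jacobian.

```python
# Hedged sketch of probit-transformation copula density estimation.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Toy copula sample: a Gaussian copula with correlation 0.6.
z = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=2000)
u = stats.norm.cdf(z)                    # pseudo-observations in (0,1)^2

# Probit transformation back to R^2 and kernel estimation there.
s = stats.norm.ppf(u)
kde = stats.gaussian_kde(s.T)

def copula_density(u1, u2):
    x, y = stats.norm.ppf(u1), stats.norm.ppf(u2)
    # Back-transform with the Jacobian of the probit map.
    return kde([x, y])[0] / (stats.norm.pdf(x) * stats.norm.pdf(y))

print(copula_density(0.5, 0.5), copula_density(0.95, 0.95))
```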
Face identification systems have been an active research area in recent years. However, their accuracy and dependability in real-life systems are still questionable. Earlier research on face identification demonstrated that LBP-based face recognition systems are preferred over others and give adequate accuracy. LBP is robust against illumination changes and is considered a high-speed algorithm. Performance metrics for such systems are calculated from time delay and accuracy. This paper introduces an improved face recognition system built using the C++ programming language with the help of the OpenCV library. Accuracy can be increased if a filter or a combination of filters is applied to the images. The accuracy increases from 95.5% (without applying
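The paper's implementation is in C++; the Python sketch below outlines a comparable LBP pipeline using OpenCV's LBPH recognizer (requires opencv-contrib-python for the cv2.face module). The file paths and the histogram-equalization pre-filter are illustrative assumptions, not the paper's settings.

```python
# Rough sketch of an LBP-based recognition pipeline with a pre-filter step.
import cv2
import numpy as np

def preprocess(gray: np.ndarray) -> np.ndarray:
    # A simple filter step; the paper reports accuracy gains from filtering.
    return cv2.equalizeHist(gray)

# Tiny illustrative training set: grayscale face crops with integer identity labels.
train_images = [preprocess(cv2.imread(p, cv2.IMREAD_GRAYSCALE))
                for p in ["person0_a.png", "person0_b.png", "person1_a.png"]]
train_labels = np.array([0, 0, 1])

recognizer = cv2.face.LBPHFaceRecognizer_create(radius=1, neighbors=8,
                                                grid_x=8, grid_y=8)
recognizer.train(train_images, train_labels)

probe = preprocess(cv2.imread("unknown.png", cv2.IMREAD_GRAYSCALE))
label, distance = recognizer.predict(probe)   # lower distance = closer match
print(f"predicted identity: {label}, distance: {distance:.2f}")
```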
Abstract
Locally occurring natural Iraqi rocks of Bauxite and Porcelanite (after pre-calcination at 1000 °C for 1 hr) were used, with the addition of different proportions of MgO and Al2O3, to prepare refractory materials. The effects of these additives on the physical and thermal properties of the prepared refractories were investigated.
Many batches of Bauxite/MgO, Bauxite/Al2O3, Bauxite/MgO/Al2O3, and Porcelanite/MgO/Al2O3 were prepared. Each mixture was milled and classified into different size fractions: fine (less than 45 μm) 40%, middle (45-75 μm) 40%, and coarse (75-106 μm) 20%.