A database is an organized, distributed collection of data that allows users to access the stored information easily and conveniently. In the era of big data, however, traditional data-analytics methods may be unable to manage and process such large volumes of data. To develop an efficient way of handling big data, this work studies the use of the MapReduce technique to process big data distributed on the cloud. The approach was evaluated on a Hadoop server with EEG big data as a case study. The proposed approach showed a clear improvement in managing and processing the EEG big data, with an average 50% reduction in response time. The results provide EEG researchers and specialists with an easy and fast method of handling EEG big data.
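The abstract does not include implementation details, but the MapReduce pattern it builds on can be sketched in plain Python. The snippet below is illustrative only: the channel names, amplitude values, and the per-channel averaging job are assumptions standing in for the paper's actual EEG workload, and the shuffle step is simulated in memory rather than on a Hadoop cluster.

```python
from collections import defaultdict

# Toy EEG records: (channel, amplitude) pairs standing in for the distributed input.
records = [("Fp1", 4.2), ("Fp2", 3.8), ("Fp1", 5.0), ("Fp2", 4.0), ("Fp1", 4.6)]

def mapper(record):
    # Map phase: emit one (key, value) pair per record; here key = channel name.
    channel, amplitude = record
    yield channel, amplitude

def reducer(channel, amplitudes):
    # Reduce phase: aggregate all values that share a key (mean amplitude per channel).
    return channel, sum(amplitudes) / len(amplitudes)

# Shuffle phase (simulated): group mapped values by key before reducing.
groups = defaultdict(list)
for record in records:
    for key, value in mapper(record):
        groups[key].append(value)

results = dict(reducer(k, v) for k, v in groups.items())
print(results)  # per-channel mean amplitudes
```

On a real Hadoop deployment the mapper and reducer run on separate nodes over partitions of the data; the framework performs the shuffle, which is what makes the approach scale to EEG-sized datasets.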
The aim of the research is to identify both the re-engineering of management processes and the strategic decision-making process in the research community, to determine the nature of the correlation between the two variables, and to characterize the relationship between them. The researcher used a descriptive, analytical method. The research community consists of professors and staff of the College of Education affiliated with Mustansiriya University in Baghdad, numbering (45). The researcher distributed forms to all members of the sample; only (3) forms were excluded as invalid, so the number of forms approved for analysis was (42).
Tirzepatide is a revolutionary and promising medication with a high impact on the treatment of obesity and type 2 diabetes mellitus (T2DM) and their complications. Its efficacy in achieving favorable weight loss and a significant reduction in glycemic indices was demonstrated in several trials. It has also been used to treat a wide range of related co-morbidities, including fatty liver, cardiovascular disease, and dyslipidemia. Tirzepatide is well tolerated, has a good safety profile, and is highly reliable and suitable for use in a wide population.
The research aims to test the correlation between wise leadership and the business model, and to demonstrate the influence of wise leadership on the business model; accordingly, two main hypotheses were put forward. The research was applied to a sample of (87) managers and deputy managers, some of whom are members of the board of directors of the researched company. A questionnaire was adopted as the basic data-collection tool, supplemented by personal interviews, and a number of statistical methods were used to analyze the data. The results include: there is a significant correlation between wise leadership and the business model, and there is an influence of wise leadership on the business model.
An oil spill is a leakage from pipelines, vessels, oil rigs, or tankers that releases petroleum products into the marine environment or onto land, occurring naturally or through human action and resulting in severe damage and financial loss. Satellite imagery is one of the most powerful tools currently used to capture vital information about the Earth's surface, but the complexity and sheer volume of the data make it challenging and time-consuming for humans to process. With the advancement of deep-learning techniques, these processes can now be automated to extract vital information from real-time satellite images. This paper applied three deep-learning algorithms to satellite image classification.
The researcher attempts to diagnose the level of effect of the strategic thinking skills (intuition, meditation, creativity) of managers in the Iraqi Ministry of Health and some of its institutions on human resources management practices: selection, training, incentives, and performance appraisal.
Recurrent strokes can be devastating, often resulting in severe disability or death. However, nearly 90% of the causes of recurrent stroke are modifiable, which means recurrent strokes can be averted by controlling risk factors that are mainly behavioral and metabolic in nature. Previous work therefore suggests that a recurrent-stroke prediction model could help minimize the likelihood of recurrent stroke. Machine-learning approaches have shown promising results in predicting first-time stroke cases, but there is limited work on recurrent-stroke prediction using machine-learning methods. Hence, this work performs an empirical analysis and investigates machine-learning algorithms for this task.
The purpose of the current investigation is to distinguish working memory in five patients with vascular dementia, fifteen post-stroke patients with mild cognitive impairment, and fifteen healthy control individuals based on background electroencephalography (EEG) activity. The elimination of EEG artifacts using wavelet-transform (WT) pre-processing denoising is demonstrated in this study. Spectral entropy, permutation entropy, and approximate entropy were all explored. To improve classification with the k-nearest neighbors (kNN) classifier, a comparative study was conducted of fuzzy neighbourhood preserving analysis with decomposition as a dimensionality-reduction technique.
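The study's own pipeline is not reproduced here, but the core idea of the abstract, computing an entropy feature from EEG-like signals and feeding it to a nearest-neighbour classifier, can be sketched with NumPy. Everything below is an assumption for illustration: the signals are synthetic (a pure tone versus white noise), the spectral-entropy formula is the standard normalized Shannon entropy of the power spectrum, and the classifier is a hand-rolled 1-NN rather than the paper's kNN scheme with dimensionality reduction.

```python
import numpy as np

def spectral_entropy(signal, eps=1e-12):
    """Normalized Shannon entropy of the signal's power spectrum (range ~[0, 1])."""
    psd = np.abs(np.fft.rfft(signal)) ** 2
    p = psd / (psd.sum() + eps)          # normalize spectrum to a distribution
    h = -np.sum(p * np.log2(p + eps))    # Shannon entropy in bits
    return h / np.log2(len(p))           # scale by the maximum possible entropy

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 256, endpoint=False)
pure_tone = np.sin(2 * np.pi * 10 * t)   # low entropy: energy in one frequency bin
white_noise = rng.standard_normal(256)   # high entropy: energy spread across bins

# Tiny training set of (feature, label) pairs: 0 = structured signal, 1 = noise.
train = [(spectral_entropy(np.sin(2 * np.pi * f * t)), 0) for f in (5, 10, 20)]
train += [(spectral_entropy(rng.standard_normal(256)), 1) for _ in range(3)]

def knn_predict(x, train, k=1):
    # Majority vote among the k training points closest in feature space.
    nearest = sorted(train, key=lambda fv: abs(fv[0] - x))[:k]
    labels = [label for _, label in nearest]
    return max(set(labels), key=labels.count)

tone_pred = knn_predict(spectral_entropy(np.sin(2 * np.pi * 15 * t)), train)
noise_pred = knn_predict(spectral_entropy(rng.standard_normal(256)), train)
print(tone_pred, noise_pred)
```

In the paper's setting the feature vector would combine several entropy measures per EEG channel, with fuzzy neighbourhood preserving analysis reducing its dimensionality before kNN classification.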
Modern civilization increasingly relies on sustainable and eco-friendly data centers as the core hubs of intelligent computing. These data centers, while vital, face heightened vulnerability to hacking because they are the convergence points of numerous network connection nodes, and recognizing and addressing this vulnerability, particularly within green data centers, is a pressing concern. This paper proposes a novel approach to mitigate this threat by leveraging swarm-intelligence techniques to detect prospective and hidden compromised devices within the data-center environment. The core objective is to ensure sustainable intelligent computing through a colony strategy.
In data mining, classification is a form of data analysis used to extract models that describe important data classes. Two well-known classification algorithms in data mining are the Backpropagation Neural Network (BNN) and Naïve Bayes (NB). This paper investigates the performance of these two classification methods on the Car Evaluation dataset. A model was built for each algorithm and the results were compared. Our experimental results indicate that the BNN classifier yields higher accuracy than the NB classifier but is less efficient, as it is time-consuming and difficult to analyze because of its black-box nature.
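A comparison of this shape can be set up in a few lines with scikit-learn. This is a sketch under stated assumptions, not the paper's experiment: synthetic blobs stand in for the Car Evaluation dataset (which is categorical), `GaussianNB` stands in for the NB model, and the `MLPClassifier` hyperparameters are illustrative choices.

```python
from sklearn.datasets import make_blobs
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

# Synthetic stand-in for a labelled evaluation dataset (e.g. Car Evaluation).
X, y = make_blobs(n_samples=400, centers=3, cluster_std=1.5, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Naive Bayes: closed-form fit, fast to train and easy to interpret.
nb = GaussianNB().fit(X_tr, y_tr)
nb_acc = accuracy_score(y_te, nb.predict(X_te))

# Backpropagation neural network: iterative training, typically slower and opaque.
bnn = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                    random_state=0).fit(X_tr, y_tr)
bnn_acc = accuracy_score(y_te, bnn.predict(X_te))

print(f"NB accuracy:  {nb_acc:.3f}")
print(f"BNN accuracy: {bnn_acc:.3f}")
```

The trade-off the abstract reports falls out of this structure: the NB fit is a single pass over the data, while the network trains by repeated backpropagation and offers no direct view of what it learned.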
It has increasingly been recognised that future developments in geospatial data handling will centre on geospatial data on the web: Volunteered Geographic Information (VGI). The evaluation of VGI data quality, including positional accuracy and shape similarity, has become a recurrent subject in the scientific literature over the last ten years. The OpenStreetMap (OSM) project is the most popular of the leading VGI platforms: an online geospatial database that produces and supplies freely editable geospatial datasets for worldwide use. The goal of this paper is to present a comprehensive overview of the quality assurance of OSM data. In addition, the credibility of open-source geospatial data is discussed.