In this paper, we used four classification methods to classify objects and compared among them: K-Nearest Neighbors (KNN), Stochastic Gradient Descent learning (SGD), Logistic Regression (LR), and Multi-Layer Perceptron (MLP). We used the MS COCO dataset for object classification and detection; its images were randomly divided into training and testing sets at a ratio of 7:3. The randomly selected training and testing images were converted from color to gray level, the gray images were then enhanced using histogram equalization, and each image was resized to 20 x 20. Principal Component Analysis (PCA) was used for feature extraction, and finally the four classification methods were applied. The results indicate that MLP was better than the others, with a precision of 81%, although it took the maximum execution time for processing the datasets.
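The pipeline described in this abstract (grayscale conversion, histogram equalization, 20 x 20 resize, PCA, then a classifier) can be sketched in Python. The sketch below is illustrative only: it uses synthetic feature vectors standing in for 20 x 20 gray images, a minimal SVD-based PCA, and a bare KNN classifier, not the paper's actual data or tuned models; the 7:3 split and 400-dimensional inputs are the only details taken from the abstract.

```python
import numpy as np

def pca_fit(X, n_components):
    """Fit PCA: center the data and keep the top principal directions."""
    mean = X.mean(axis=0)
    # SVD of the centered data yields the principal axes as rows of Vt.
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, Vt[:n_components]

def pca_transform(X, mean, components):
    return (X - mean) @ components.T

def knn_predict(train_X, train_y, test_X, k=3):
    """Minimal K-Nearest Neighbors: majority vote over the k closest points."""
    preds = []
    for x in test_X:
        d = np.linalg.norm(train_X - x, axis=1)
        nearest = train_y[np.argsort(d)[:k]]
        preds.append(np.bincount(nearest).argmax())
    return np.array(preds)

# Synthetic stand-in for 20x20 gray images (400 features), two classes.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, size=(100, 400)),
               rng.normal(1.0, 1.0, size=(100, 400))])
y = np.array([0] * 100 + [1] * 100)

# 7:3 train/test split, as in the abstract.
idx = rng.permutation(len(X))
cut = int(0.7 * len(X))
tr, te = idx[:cut], idx[cut:]

mean, comps = pca_fit(X[tr], n_components=20)
pred = knn_predict(pca_transform(X[tr], mean, comps), y[tr],
                   pca_transform(X[te], mean, comps))
accuracy = float((pred == y[te]).mean())
```

The same projected features could be fed to SGD, LR, or MLP classifiers for the comparison the abstract describes; only the final classifier changes.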
Skull image separation is one of the initial procedures used to detect brain abnormalities. In an MRI image of the brain, this process involves distinguishing the tissue that makes up the brain from the tissue that does not. Even for experienced radiologists, separating the brain from the skull is a difficult task, and the accuracy of the results can vary considerably from one individual to the next. Skull stripping in brain magnetic resonance volumes has therefore become increasingly popular due to the requirement for a dependable, accurate, and thorough method for processing brain datasets. Furthermore, skull stripping must be performed accurately for neuroimaging diagnostic systems since neither no…
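Skull-stripping pipelines typically begin with a coarse intensity-based separation of tissue from background before any anatomical refinement. The sketch below is not the paper's method; it only illustrates that crudest first step, Otsu thresholding, on a synthetic "slice" (a bright disc on a dark background standing in for brain tissue).

```python
import numpy as np

def otsu_threshold(image, bins=256):
    """Pick the threshold that maximizes between-class variance (Otsu's method)."""
    hist, edges = np.histogram(image, bins=bins)
    hist = hist.astype(float) / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    w0 = np.cumsum(hist)               # cumulative weight of the "background" class
    mu = np.cumsum(hist * centers)     # cumulative mean
    mu_total = mu[-1]
    best_t, best_var = centers[0], -1.0
    for i in range(1, bins - 1):
        w1 = 1.0 - w0[i]
        if w0[i] == 0 or w1 == 0:
            continue
        mu0 = mu[i] / w0[i]
        mu1 = (mu_total - mu[i]) / w1
        var_between = w0[i] * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, centers[i]
    return best_t

# Synthetic "slice": dark background plus a brighter disc of tissue.
rng = np.random.default_rng(1)
img = rng.normal(20, 5, size=(64, 64))
yy, xx = np.ogrid[:64, :64]
disc = (yy - 32) ** 2 + (xx - 32) ** 2 < 20 ** 2
img[disc] += 100

mask = img > otsu_threshold(img)   # rough tissue/background mask
```

Real skull stripping then refines such a mask with morphological operations and anatomical priors, which is where the accuracy differences the abstract discusses arise.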
Regarding computer system security, intrusion detection systems are fundamental components for discriminating attacks at an early stage. They monitor and analyze network traffic, looking for abnormal behaviors or attack signatures to detect intrusions early. However, many challenges arise in developing a flexible and efficient network intrusion detection system (NIDS) that handles unforeseen attacks with a high detection rate. In this paper, a deep neural network (DNN) approach is proposed for an anomaly-detection NIDS. Dropout is the regularization technique used with the DNN model to reduce overfitting. The experiments were conducted on the NSL_KDD dataset. A SoftMax output layer has been used with a cross-entropy loss funct…
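The regularization step this abstract names, dropout, is simple enough to show directly. The sketch below implements the standard inverted-dropout forward pass in plain numpy on a synthetic activation batch; it is a generic illustration of the technique, not the paper's DNN architecture or its NSL_KDD training setup.

```python
import numpy as np

def dropout(activations, rate, rng, training=True):
    """Inverted dropout: zero a fraction `rate` of units during training and
    rescale the survivors by 1/(1-rate) so expected activations are unchanged,
    meaning no rescaling is needed at test time."""
    if not training or rate == 0.0:
        return activations
    keep = 1.0 - rate
    mask = rng.random(activations.shape) < keep
    return activations * mask / keep

rng = np.random.default_rng(42)
h = np.ones((1000, 64))                    # a hidden-layer activation batch
dropped = dropout(h, rate=0.5, rng=rng)    # ~half zeroed, survivors scaled to 2.0
kept_fraction = float((dropped != 0).mean())
```

At inference time the same function is called with `training=False`, which returns the activations untouched; that asymmetry is the whole point of the inverted formulation.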
Text clustering consists of grouping objects of similar categories. The initial centroids influence the operation of the system, with the potential to become trapped in local optima. The second issue is the impact of a huge number of features on the determination of optimal initial centroids. The problem of dimensionality may be reduced by feature selection; therefore, Wind Driven Optimization (WDO) was employed as feature selection to remove unimportant words from the text. In addition, the current study integrated a novel clustering optimization technique called WDO (Wasp Swarm Optimization) to effectively determine the most suitable initial centroids. The results showed that the new meta-heuristic, WDO, employed as t…
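The abstract's central claim is that bad initial centroids trap clustering in local optima, which is what motivates optimizing the initialization. The sketch below demonstrates that sensitivity with a plain Lloyd's k-means on synthetic 2-D data; the WDO/WSO optimizer itself is not implemented here, only the problem it is meant to solve.

```python
import numpy as np

def kmeans(X, centroids, iters=20):
    """Plain Lloyd's k-means starting from the given initial centroids."""
    c = centroids.astype(float).copy()
    for _ in range(iters):
        # Assign each point to its nearest centroid.
        d = np.linalg.norm(X[:, None, :] - c[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Move each centroid to the mean of its assigned points.
        for k in range(len(c)):
            if np.any(labels == k):
                c[k] = X[labels == k].mean(axis=0)
    inertia = float(((X - c[labels]) ** 2).sum())
    return c, labels, inertia

# Three well-separated clusters in 2-D.
rng = np.random.default_rng(3)
X = np.vstack([rng.normal(m, 0.3, size=(50, 2))
               for m in ([0, 0], [5, 0], [0, 5])])

good_init = np.array([[0.1, 0.1], [4.9, 0.1], [0.1, 4.9]])
bad_init = np.array([[0.0, 0.0], [0.1, 0.1], [0.2, 0.2]])  # all in one cluster

_, _, inertia_good = kmeans(X, good_init)
_, _, inertia_bad = kmeans(X, bad_init)
```

With the bad initialization, two centroids end up splitting a single cluster while one centroid is stranded between the other two, and the final inertia is far worse; a meta-heuristic initializer searches for starting centroids that avoid exactly this outcome.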
A mixture model is used to model data that come from more than one component. In recent years, it has become an effective tool for drawing inferences about the complex data we encounter in real life. Moreover, it can serve as a powerful confirmatory tool for classifying observations based on the similarities among them. In this paper, several mixture regression-based methods were conducted under the assumption that the data come from a finite number of components. These methods were compared according to their results in estimating the component parameters. Observation membership was also inferred and assessed for each method. The results showed that the flexible mixture model outperformed the others…
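Both component-parameter estimation and observation membership, the two quantities this abstract evaluates, fall out of the EM algorithm for mixtures. The sketch below runs EM on a two-component 1-D Gaussian mixture with synthetic data; it illustrates the mechanics only, not the mixture-of-regressions variants the paper actually compares.

```python
import numpy as np

def em_gmm_1d(x, iters=100):
    """EM for a two-component 1-D Gaussian mixture: returns means, stds, weights."""
    mu = np.array([x.min(), x.max()])      # crude but distinct initial means
    sigma = np.array([x.std(), x.std()])
    w = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each point.
        dens = (w * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2)
                / (sigma * np.sqrt(2 * np.pi)))
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate parameters from the responsibilities.
        n = resp.sum(axis=0)
        w = n / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / n
        sigma = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / n)
    return mu, sigma, w

# Synthetic data from two components centered at 0 and 5.
rng = np.random.default_rng(7)
x = np.concatenate([rng.normal(0.0, 1.0, 300), rng.normal(5.0, 1.0, 300)])
mu, sigma, w = em_gmm_1d(x)
```

The responsibilities computed in the E-step are exactly the inferred observation memberships; assigning each point to its highest-responsibility component gives the classification the abstract assesses.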
Cytokines are signaling molecules between inflammatory cells that play a significant role in the pathogenesis of disease. Among these cytokines are interleukins (ILs) 17A and 33; accordingly, the current case-control study sought to investigate the role of each of the two cytokines in the risk of developing multiple sclerosis (MS). Sixty-eight Iraqi patients with relapsing-remitting MS (RRMS) and twenty healthy individuals (control group) were enrolled. Enzyme-linked immunosorbent assay (ELISA) kits were used to determine serum levels of IL-17A and IL-33. Results revealed that IL-17A and IL-33 levels were significantly higher in MS patients than in controls (14.1 ± 4.5 vs. 7.5 ± 3.8 pg/mL; p < 0.001 and 65.3 ± 16…
A Strength Pareto Evolutionary Algorithm 2 (SPEA2) approach for solving the multi-objective Environmental/Economic Power Dispatch (EEPD) problem is presented in this paper. In the past, fuel cost minimization was the single objective of the economic power dispatch problem. Since the Clean Air Act amendments were applied to reduce SO2 and NOx emissions from power plants, utilities have changed their strategies to reduce pollution and atmospheric emissions as well; adding emission minimization as another objective function made economic power dispatch (EPD) a multi-objective problem with conflicting objectives. SPEA2 is the improved version of SPEA with better fitness assignment, density estimation, an…
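SPEA2's fitness assignment is built on Pareto dominance between candidate solutions. The sketch below implements only that dominance test on toy (fuel cost, emission) pairs, both to be minimized; it is a minimal illustration of the concept, not SPEA2 itself, and the numbers are invented.

```python
def pareto_front(points):
    """Return the non-dominated points, minimizing every objective.
    q dominates p if q is <= p in all objectives and differs from p
    (for distinct tuples this implies strictly better in at least one)."""
    front = []
    for i, p in enumerate(points):
        dominated = any(
            all(q[k] <= p[k] for k in range(len(p))) and q != p
            for j, q in enumerate(points) if j != i
        )
        if not dominated:
            front.append(p)
    return front

# Toy dispatch candidates: (fuel cost, emission), both to be minimized.
points = [(10, 5), (8, 7), (9, 6), (12, 4), (11, 6), (8, 8)]
front = pareto_front(points)
```

Here (11, 6) is dominated by (10, 5) and (8, 8) by (8, 7), so the front keeps the four genuine trade-offs. SPEA2 layers strength values and density estimation on top of this test to rank a whole population.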
Cloud computing is the new technological trend for future generations. It represents a new way to use IT resources more efficiently and is one of the most important technological models for developing and exploiting infrastructure resources in the world. Under the cloud, the user no longer needs to seek major financing to purchase infrastructure equipment, as companies, especially small and medium-sized ones, can obtain equipment as a service rather than buying it as a product. The idea of cloud computing dates back to the sixties of the last century, but it did not come into actual application until the beginning of the third millennium, at the hands of technology companies such as Apple, HP, and IBM, which had…
Computational Thinking (CT) is very useful in the process of solving everyday problems for undergraduates. In terms of content, computational thinking involves solving problems, studying data patterns, deconstructing problems using algorithms and procedures, performing simulations, computer modeling, and reasoning about abstractions. However, there is a lack of studies dealing with CT and the skills that can be developed and utilized in the field of information technology used in learning and teaching. The descriptive research method was used, and a test instrument was prepared to measure the level of CT, consisting of 24 multiple-choice items. The research study group consists of…
Recommendation systems are now being used to address the problem of information overload in several sectors such as entertainment, social networking, and e-commerce. Although conventional approaches to recommendation systems have achieved significant success in providing item suggestions, they still face many challenges, including the cold-start problem and data sparsity. Numerous recommendation models have been created to address these difficulties. Nevertheless, including user- or item-specific information has the potential to enhance the performance of recommendations. The ConvFM model is a novel convolutional neural network architecture that combines the capabilities of deep learning for feature extraction with the effectiveness o…
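The "FM" half of ConvFM is a factorization machine, whose second-order term models all pairwise feature interactions through latent vectors. The sketch below shows the standard O(k·n) identity for that term against a brute-force O(n²) reference, on random data; the convolutional feature-extraction half of ConvFM is not reproduced here.

```python
import numpy as np

def fm_pairwise(x, V):
    """Second-order factorization-machine term via the efficient identity:
    sum_{i<j} <v_i, v_j> x_i x_j = 0.5 * sum_f [ (Vx)_f^2 - (V^2 x^2)_f ]."""
    s = V.T @ x                    # shape (k,): per-factor weighted sums
    s2 = (V ** 2).T @ (x ** 2)     # shape (k,): per-factor sums of squares
    return 0.5 * float(np.sum(s ** 2 - s2))

def fm_pairwise_naive(x, V):
    """Brute-force O(n^2) reference computing the same pairwise sum."""
    total = 0.0
    n = len(x)
    for i in range(n):
        for j in range(i + 1, n):
            total += float(V[i] @ V[j]) * x[i] * x[j]
    return total

rng = np.random.default_rng(5)
x = rng.normal(size=6)             # one feature vector
V = rng.normal(size=(6, 3))        # one k=3 latent vector per feature
fast = fm_pairwise(x, V)
slow = fm_pairwise_naive(x, V)
```

The identity is what makes factorization machines practical on the sparse, high-dimensional inputs typical of recommendation: the interaction term costs linear rather than quadratic time in the number of features.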