Skull stripping is one of the initial procedures used to detect brain abnormalities. In an MRI image of the brain, this process distinguishes the tissue that makes up the brain from the tissue that does not. Even for experienced radiologists, separating the brain from the skull is a difficult task, and the accuracy of the results can vary considerably from one individual to the next. Skull stripping of brain magnetic resonance volumes has therefore attracted growing attention, driven by the need for a dependable, accurate, and thorough method for processing brain datasets. Moreover, skull stripping must be performed accurately in neuroimaging diagnostic systems, since residual non-brain tissue or the removal of brain regions cannot be corrected in subsequent steps, producing an unrecoverable error in further analysis. This paper proposes a system based on deep learning and image processing, an innovative method for converting one pre-trained model into another using pre-processing operations with the CLAHE filter as a critical phase. The public IBSR dataset was used for training and testing. To assess the system's efficacy, experiments were performed on three-dimensional volumes across the three anatomical sections of MR images as well as on two-dimensional images, and the results reached 99.9% accuracy.
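The CLAHE pre-processing step mentioned above clips the intensity histogram before equalization so that local contrast is enhanced without amplifying noise. A minimal single-tile sketch in Python (full CLAHE additionally tiles the image and interpolates between per-tile mappings; the function name and parameter values here are illustrative, not the paper's implementation):

```python
import numpy as np

def clipped_hist_equalize(tile, clip_limit=4.0, nbins=256):
    """Contrast-limited histogram equalization on one tile (the core of CLAHE)."""
    hist, _ = np.histogram(tile, bins=nbins, range=(0, 256))
    limit = clip_limit * hist.mean()                 # per-bin ceiling
    excess = np.maximum(hist - limit, 0).sum()
    hist = np.minimum(hist, limit) + excess / nbins  # clip, redistribute excess
    cdf = hist.cumsum()
    cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min())  # normalise to [0, 1]
    lut = np.round(cdf * 255).astype(np.uint8)         # grey-level lookup table
    return lut[tile]
```

The clip limit is what separates CLAHE from plain histogram equalization: it bounds how steep the mapping can become in any intensity range.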
This work analyzes a three-dimensional discrete-time biological system: a prey-predator model with a constant harvesting amount, in which the stage structure is imposed on the predator species. The analysis proceeds by finding all possible equilibria and investigating their stability. To obtain an optimal harvesting strategy, we then allow the harvesting rate to be non-constant. Finally, numerical simulations are given to confirm the outcome of the mathematical analysis.
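Since the paper's exact equations are not reproduced here, the following Python sketch only shows the general shape of such an analysis for a hypothetical stage-structured discrete model (prey x, juvenile predators yj, adult predators ya, constant prey harvest H; all equations and parameter values are made up for illustration):

```python
def step(x, yj, ya, r=1.2, K=10.0, a=0.3, b=0.25, m=0.4, d=0.1, H=0.2):
    """One iteration of a hypothetical 3D discrete prey-predator map
    with predator stage structure and constant harvesting H on the prey."""
    x_next  = max(x + r * x * (1 - x / K) - a * x * ya - H, 0.0)  # logistic prey, predation, harvest
    yj_next = max(b * x * ya + (1 - m) * yj, 0.0)                 # recruitment minus maturation
    ya_next = max(m * yj + (1 - d) * ya, 0.0)                     # matured juveniles, adult survival
    return x_next, yj_next, ya_next

def iterate(state, n=500):
    """Iterate the map; equilibria appear as fixed points of `step`."""
    for _ in range(n):
        state = step(*state)
    return state
```

Stability of an equilibrium is then checked numerically by whether nearby orbits converge to it, or analytically via the eigenvalues of the Jacobian at the fixed point.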
Simulation experiments are a problem-solving tool in many fields: the process of designing a model of a real system in order to follow it and identify its behavior through models and formulas, written in a repetitive software style over a number of iterations. The aim of this study is to build a model for behavior exhibiting heteroskedasticity by studying the APGARCH and NAGARCH models with Gaussian and non-Gaussian distributions for different sample sizes (500, 1000, 1500, 2000) through the stages of time series analysis (identification, estimation, diagnostic checking, and prediction). The data was generated using the estimates of the parameters resulting f
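As one illustration of such a generation step, the NAGARCH(1,1) recursion sigma2_t = omega + alpha*(eps_{t-1} - theta*sigma_{t-1})**2 + beta*sigma2_{t-1} with Gaussian innovations can be simulated as follows (parameter values are illustrative, not the paper's estimates):

```python
import numpy as np

def simulate_nagarch(n, omega=0.05, alpha=0.1, theta=0.5, beta=0.85, seed=0):
    """Simulate n observations of a NAGARCH(1,1) process with Gaussian innovations.
    Persistence alpha*(1 + theta**2) + beta must be < 1 for a finite variance."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n)
    sigma2 = np.empty(n)
    eps = np.empty(n)
    sigma2[0] = omega / (1 - alpha * (1 + theta**2) - beta)  # unconditional variance
    eps[0] = np.sqrt(sigma2[0]) * z[0]
    for t in range(1, n):
        sigma2[t] = (omega
                     + alpha * (eps[t-1] - theta * np.sqrt(sigma2[t-1]))**2
                     + beta * sigma2[t-1])
        eps[t] = np.sqrt(sigma2[t]) * z[t]
    return eps, sigma2
```

Swapping `rng.standard_normal` for a heavier-tailed draw gives the non-Gaussian variants studied in the paper.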
The artificial neural network methodology is an important and relatively new approach to building models for analysis, data evaluation, forecasting, and control without depending on an older model or a classical statistical method to describe the behavior of a statistical phenomenon; the methodology works by simulating the data to reach a robust optimal model that represents the phenomenon, and the model can then be used at any time and in any state. We used the Box-Jenkins (ARMAX) approach for comparison. This paper relies on the received power to build a robust model for forecasting, analyzing, and controlling the sod power; the received power comes from
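For comparison baselines of the Box-Jenkins type, a plain ARX model (the ARMAX family without the moving-average term) can be fit by ordinary least squares; a hedged sketch, where the function name and the order p=2 are illustrative rather than the paper's choices:

```python
import numpy as np

def fit_arx(y, u, p=2):
    """Least-squares fit of y_t = a1*y_{t-1} + ... + ap*y_{t-p} + b*u_{t-1},
    a simple ARX baseline of the Box-Jenkins family."""
    rows = [np.concatenate([y[t - p:t][::-1], [u[t - 1]]]) for t in range(p, len(y))]
    theta, *_ = np.linalg.lstsq(np.array(rows), y[p:], rcond=None)
    return theta  # [a1, ..., ap, b]
```

Residual diagnostics on the fitted model then play the same role as the diagnostic-checking stage of the Box-Jenkins cycle.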
Dust is a common cause of health risks and a driver of climate change, one of the most threatening problems facing humans. In the recent decade, climate change in Iraq, typified by increased droughts and desertification, has generated numerous environmental issues. This study forecasts dust in five central Iraqi districts using machine learning, within a supervised learning framework built on five regression algorithms. It was assessed on a dataset from the Iraqi Meteorological Organization and Seismology (IMOS). Simulation results show that the gradient boosting regressor (GBR) has a mean square error of 8.345 and a total accuracy ratio of 91.65%. Moreover, the results show that the decision tree (DT), where the mean square error is 8.965, c
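A gradient boosting regressor of the kind reported above fits a sequence of small trees to the residuals of the current prediction. A from-scratch toy with depth-1 trees (stumps) shows the idea; this is not the paper's implementation, which presumably used a standard library:

```python
import numpy as np

class BoostedStumps:
    """Minimal gradient boosting for squared loss using depth-1 trees (stumps)."""
    def __init__(self, n_estimators=100, learning_rate=0.1):
        self.n_estimators, self.lr = n_estimators, learning_rate

    def fit(self, X, y):
        self.base = y.mean()          # initial constant prediction
        self.stumps = []
        resid = y - self.base         # residuals = negative gradient of squared loss
        for _ in range(self.n_estimators):
            j, thr, lv, rv = self._best_stump(X, resid)
            self.stumps.append((j, thr, lv, rv))
            resid -= self.lr * np.where(X[:, j] <= thr, lv, rv)
        return self

    def _best_stump(self, X, r):
        best = (np.inf, 0, 0.0, 0.0, 0.0)
        for j in range(X.shape[1]):                       # try every feature
            for thr in np.unique(X[:, j])[:-1]:           # and every split point
                mask = X[:, j] <= thr
                lv, rv = r[mask].mean(), r[~mask].mean()  # leaf values
                err = ((np.where(mask, lv, rv) - r) ** 2).sum()
                if err < best[0]:
                    best = (err, j, thr, lv, rv)
        return best[1:]

    def predict(self, X):
        out = np.full(len(X), self.base)
        for j, thr, lv, rv in self.stumps:
            out += self.lr * np.where(X[:, j] <= thr, lv, rv)
        return out
```

The learning rate shrinks each stump's contribution, which is what lets many weak trees combine into a low-error ensemble.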
It takes a lot of time to classify banana slices by sweetness level using traditional methods. When assessing fruit quality, particular attention is paid to sweetness as well as color, since both affect taste. Sorting banana slices by sweetness makes it possible to estimate the ripeness of bananas from the sweetness and color values of the slices. Such a classification system helps establish the degree of ripeness needed for processing and consumption. The purpose of this article is to compare the efficiency of SVM-linear, SVM-polynomial, and LDA classification of the sweetness of banana slices by their LRV level. The experiment showed that the highest accuracy of 96.66% was achieved by the
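The LDA classifier compared above can be written compactly for two classes; a minimal Fisher-discriminant sketch on synthetic two-dimensional features (the banana dataset itself is not available here, so the data and function names are illustrative):

```python
import numpy as np

def fisher_lda(X0, X1):
    """Two-class Fisher LDA: project onto w = Sw^{-1} (mu1 - mu0) and
    threshold at the projected midpoint of the class means."""
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    S0 = (X0 - mu0).T @ (X0 - mu0)      # within-class scatter, class 0
    S1 = (X1 - mu1).T @ (X1 - mu1)      # within-class scatter, class 1
    w = np.linalg.solve(S0 + S1, mu1 - mu0)
    c = w @ (mu0 + mu1) / 2             # decision threshold
    return w, c

def lda_predict(X, w, c):
    return (X @ w > c).astype(int)      # 1 = class 1, 0 = class 0
```

The SVM variants in the comparison differ mainly in the decision boundary they allow: linear for SVM-linear, curved for the polynomial kernel.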
The advancement of cement alternatives in the construction materials industry is fundamental to sustainable development. Geopolymer is an optimal substitute for ordinary Portland cement, producing 80% less CO2 emissions. Metakaolin was used as one of the raw materials in the geopolymerization process. This research examines the influence of three different sulfate percentages in sand (0.00038%, 1.532%, and 16.24%) per molarity of NaOH on the compressive strength of metakaolin-based geopolymer mortar (MK-GPM). Samples were prepared with two different molarities (8M and 12M) and cured at room temperature. The best compressive strength value (56.98 MPa) was recorded with 12M w
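For context on the 8M and 12M activator solutions: the mass of NaOH per litre of solution follows directly from its molar mass of roughly 40 g/mol. A quick check (the helper name is ours, not from the paper):

```python
NAOH_MOLAR_MASS = 40.0  # g/mol, approximate molar mass of NaOH

def naoh_grams_per_litre(molarity):
    """Mass of NaOH needed per litre of solution at the given molarity."""
    return molarity * NAOH_MOLAR_MASS

# 8M -> 320 g/L, 12M -> 480 g/L
```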
A substantial concern in exchanging confidential messages over the internet is the safe transmission of information. For example, consumers and producers of digital products are keen to know that those products are genuine and can be distinguished from worthless ones. The science of encryption can be defined as the technique of embedding data in image, audio, or video files in a way that meets the safety requirements. Steganography is a branch of data-concealment science that aims to reach a desired security level in the exchange of private, undisclosed commercial and military data. This research offers a novel steganography technique based on hiding data inside the clusters produced by fuzzy clustering. T
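The embedding step in such schemes typically flips the least significant bits of the cover pixels selected for hiding; a toy stand-in in which the fuzzy-cluster pixel selection is omitted and all names are illustrative:

```python
import numpy as np

def embed_lsb(pixels, bits):
    """Hide a bit array in the least significant bits of the first len(bits)
    pixels (in the paper, pixel selection is guided by fuzzy clustering)."""
    out = pixels.copy()
    out[:len(bits)] = (out[:len(bits)] & 0xFE) | bits  # clear LSB, write bit
    return out

def extract_lsb(pixels, n):
    """Recover the first n hidden bits."""
    return pixels[:n] & 1
```

Because each pixel changes by at most one grey level, the stego image is visually indistinguishable from the cover.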
Grabisch and Labreuche have proposed a generalization of capacities, called bi-capacities. Recently, the author introduced a new approach for studying bi-capacities through a notion of ternary-element sets. In this paper, we establish several results based on that approach, including the bipolar Möbius transform, the importance index, and the interaction index of bi-capacities.
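For reference, the classical Möbius transform that the bipolar version generalizes: for an ordinary capacity $\mu$ on a finite set $N$,

```latex
m(A) \;=\; \sum_{B \subseteq A} (-1)^{|A \setminus B|}\, \mu(B),
\qquad
\mu(A) \;=\; \sum_{B \subseteq A} m(B).
```

The bipolar transform developed in the paper plays the same role for bi-capacities, which are defined on pairs of disjoint subsets of $N$ rather than on single subsets.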
This study discusses industries dependent on the petrochemical industry in Iraq (plastics as a model) during the period 2005–2020. The study concluded that the plastics industries contribute to advancement and progress and to opportunities for dealing efficiently with the challenges posed by new variables, the most important of which are the information revolution, communications, and trade liberalization, all of which contribute to the competitiveness of these industries. The petrochemical industry in Iraq also plays an active role in establishing plastic industrial clusters and clusters of micro, small, and medium industries by providing the necessary feedstock for these industries in various fields