Today, artificial intelligence has become one of the most important fields of computer science, concerned with creating intelligent programs that simulate aspects of human reasoning. In medicine, the goal of artificial intelligence is to assist doctors and other health-care workers in diagnosing diseases and guiding clinical treatment, reducing the rate of medical error and saving lives. The main and most widely used technologies are expert systems, machine learning, and big data. This article provides a brief overview of these three techniques to help readers understand them and their importance.
Abstract
This study seeks to apply a data-mining technique, logistic regression, to the inherent risk of fraud by analyzing financial ratios and then deriving financial-fraud indicators. Major scandals at exposed companies, and the failure of the audit process to prevent them, have shocked the community and undermined confidence in the auditor; the cause is financial fraud practiced by companies and left undiscovered by the auditor. Such fraud is an intentional act, carried out by management or staff, that aims at personal gain while harming the interests of others. It can be said that all frauds occur in the presence of motives and factors that help th…
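As a minimal sketch of the technique the study names, the following fits a logistic-regression fraud classifier by gradient descent. The financial ratios, their values, and the fraud labels here are entirely hypothetical illustrations, not data from the study:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Fit logistic regression by batch gradient descent.
    X: list of feature vectors (financial ratios), y: list of 0/1 labels."""
    n, m = len(X[0]), len(X)
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        grad_w, grad_b = [0.0] * n, 0.0
        for xi, yi in zip(X, y):
            err = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b) - yi
            for j in range(n):
                grad_w[j] += err * xi[j]
            grad_b += err
        w = [wj - lr * g / m for wj, g in zip(w, grad_w)]
        b -= lr * grad_b / m
    return w, b

def predict_fraud(w, b, ratios, threshold=0.5):
    """Flag a company as a fraud risk if the model probability >= threshold."""
    return sigmoid(sum(wj * xj for wj, xj in zip(w, ratios)) + b) >= threshold

# Hypothetical ratios per company: [debt-to-equity, receivables growth]
X = [[0.2, 0.1], [0.3, 0.0], [1.5, 0.9], [1.8, 1.1], [0.4, 0.2], [1.6, 1.0]]
y = [0, 0, 1, 1, 0, 1]   # 1 = previously identified as fraudulent
w, b = fit_logistic(X, y)
```

The fitted model scores an unseen company's ratios as a fraud probability; the 0.5 threshold is an arbitrary choice that an auditor could tighten or relax.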
Among metaheuristic algorithms, population-based algorithms are explorative search methods superior to local search in exploring the search space for globally optimal solutions. However, their primary downside is low exploitative capability, which prevents refining the neighborhood of a candidate solution toward more optimal solutions. The firefly algorithm (FA) is a population-based algorithm that has been widely used in clustering problems. However, FA is limited by premature convergence when no neighborhood search strategies are employed to improve the quality of clustering solutions in the neighborhood region while exploring the global regions of the search space. On the…
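For readers unfamiliar with FA, the sketch below implements the standard firefly update (brighter fireflies attract dimmer ones, with attractiveness decaying over distance) on a simple test function rather than a clustering objective; all parameter values are illustrative defaults, not those of the study:

```python
import math, random

def firefly(objective, dim, n_fireflies=20, bounds=(-5.0, 5.0),
            alpha=0.2, beta0=1.0, gamma=1.0, iters=200, seed=1):
    """Minimal firefly algorithm for minimization: each firefly moves toward
    every brighter (lower-objective) firefly, plus a decaying random step."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_fireflies)]
    for _ in range(iters):
        fitness = [objective(x) for x in pop]
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if fitness[j] < fitness[i]:           # firefly j is brighter
                    r2 = sum((a - b) ** 2 for a, b in zip(pop[i], pop[j]))
                    beta = beta0 * math.exp(-gamma * r2)   # attractiveness
                    pop[i] = [min(hi, max(lo,
                              a + beta * (b - a) + alpha * (rng.random() - 0.5)))
                              for a, b in zip(pop[i], pop[j])]
                    fitness[i] = objective(pop[i])
        alpha *= 0.97   # anneal the random step to aid late exploitation
    best = min(pop, key=objective)
    return best, objective(best)

# Sphere function: global minimum 0 at the origin
best, val = firefly(lambda x: sum(v * v for v in x), dim=2)
```

The annealed `alpha` is one crude answer to the exploitation weakness the abstract describes; the neighborhood-search strategies it mentions would replace or augment this step.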
Support vector machines (SVMs) are supervised learning models that analyze data for classification or regression. For classification, SVM is widely used to select an optimal hyperplane that separates two classes. SVM achieves very good accuracy and is extremely robust compared with other classification methods such as logistic regression, random forests, k-nearest neighbors, and naïve models. However, working with large datasets can cause problems such as long training times and inefficient results. In this paper, the SVM is modified by using a stochastic gradient descent process. The modified method, stochastic gradient descent SVM (SGD-SVM), was checked using two simulated datasets. Since the classification of different ca…
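The core idea of SGD-SVM can be sketched as stochastic gradient descent on the regularized hinge loss of a linear SVM. This is a generic illustration on toy data, not the paper's implementation, and the step size and regularization constants are assumed values:

```python
import random

def sgd_svm(X, y, lam=0.001, eta=0.01, epochs=200, seed=0):
    """Train a linear soft-margin SVM by stochastic gradient descent on
    lam/2*||w||^2 + max(0, 1 - y*(w.x + b)).  Labels must be +1 or -1."""
    rng = random.Random(seed)
    n = len(X[0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        order = list(range(len(X)))
        rng.shuffle(order)                # stochastic: random sample order
        for i in order:
            margin = y[i] * (sum(wj * xj for wj, xj in zip(w, X[i])) + b)
            w = [wj - eta * lam * wj for wj in w]        # regularizer step
            if margin < 1:                                # hinge is active
                w = [wj + eta * y[i] * xj for wj, xj in zip(w, X[i])]
                b += eta * y[i]
    return w, b

def classify(w, b, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1

# Toy linearly separable data standing in for a simulated dataset
X = [[2.0, 2.0], [1.5, 2.5], [2.5, 1.5], [-2.0, -2.0], [-1.5, -2.5], [-2.5, -1.0]]
y = [1, 1, 1, -1, -1, -1]
w, b = sgd_svm(X, y)
```

Because each update touches only one sample, the cost per step is independent of the dataset size, which is what makes the SGD variant attractive for the large datasets the abstract mentions.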
Because of worldwide restrictions and limitations on agricultural water, one of the most effective ways to conserve water in this sector is to reduce water losses and improve irrigation uniformity. Low-pressure sprinklers are now widely used to replace high-pressure impact sprinklers in lateral-move sprinkler irrigation systems because of their low operating cost and high efficiency. However, the hazard of surface runoff remains the biggest obstacle for low-pressure sprinkler systems. Most researchers have used the pulsing technique to apply variable-rate irrigation that matches crop water needs at an application rate that does not produce runoff. This research introduces a variable pulsed irrigation algorit…
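The pulsing technique reduces the time-averaged application rate by cycling a sprinkler on and off. A minimal duty-cycle calculation conveys the idea; this is a hypothetical illustration of the principle, not the variable pulsed irrigation algorithm the study introduces:

```python
def pulse_schedule(target_rate_mm_hr, sprinkler_rate_mm_hr, cycle_s=60.0):
    """Compute on/off durations within a fixed cycle so the time-averaged
    application rate equals the target rate (hypothetical illustration)."""
    if target_rate_mm_hr >= sprinkler_rate_mm_hr:
        return cycle_s, 0.0               # sprinkler must run continuously
    duty = target_rate_mm_hr / sprinkler_rate_mm_hr   # fraction of time "on"
    on_s = duty * cycle_s
    return on_s, cycle_s - on_s

# A 30 mm/h sprinkler asked to deliver an average of 10 mm/h:
on_s, off_s = pulse_schedule(10.0, 30.0)   # on 20 s, off 40 s per 60 s cycle
```

Varying the duty cycle along the lateral is what allows variable-rate irrigation while the instantaneous rate stays below the runoff threshold.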
In this study, composite materials consisting of activated carbon (AC) and zeolite were prepared for the removal of methylene blue and lead from aqueous solution. The optimum synthesis route combined metakaolinization and zeolitization of kaolin, in the presence of activated carbon, to form the zeolite. First, kaolin was thermally activated into amorphous metakaolin (metakaolinization); the resultant metakaolin was then attacked by alkali, transforming it into crystalline zeolite (zeolitization). Characterization of the composite materials by nitrogen adsorption and SEM confirmed a homogeneous distribution of zeolite throughout the activated carbon.
The Hartha Formation is an overburden horizon in the X oilfield that generates considerable non-productive time (NPT) associated with drilling-mud losses. This study investigates the loss events in this formation and provides geological interpretations based on datasets from nine wells in the field of interest. The interpretation draws on several analyses, including wireline logs, cuttings descriptions, image logs, and analog data. Seismic and coherency data were also used to formulate the geological interpretations and to calibrate them against the loss events of the Hartha Fm.
The results revealed that the upper part of the Hartha Fm. was identified as an interval capable of creating potentia…
Vegetation cover is one of the most significant elements influencing weather, climate, and the environment. The Normalized Difference Vegetation Index (NDVI) and the Normalized Difference Built-up Index (NDBI) for the years 2019-2022 were estimated from four Landsat 8 TIRS images covering Duhok City. Using the radiative transfer model, the city's land surface temperature (LST) over those four years was calculated. The aim of this study is to compute LST for 2019-2022 and to understand the link between LST, NDVI, and NDBI, and the capability of mapping them with Landsat 8 TIRS. The findings revealed that the NDBI and the NDVI had the strongest correlation with the…
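The two indices are standard band ratios, and the emissivity correction of brightness temperature is a common single-channel step; the sketch below shows these textbook formulas, not the study's full radiative transfer workflow, and the reflectance values are made up for illustration:

```python
import math

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    return (nir - red) / (nir + red)

def ndbi(swir, nir):
    """Normalized Difference Built-up Index: (SWIR - NIR) / (SWIR + NIR)."""
    return (swir - nir) / (swir + nir)

def lst_from_bt(bt_k, emissivity, wavelength_um=10.895):
    """Emissivity-corrected land surface temperature (K) from at-sensor
    brightness temperature: LST = BT / (1 + (lam*BT/rho)*ln(eps)),
    with rho = h*c/k_B ~ 1.4388e-2 m*K and lam for Landsat 8 band 10."""
    rho = 1.4388e-2
    return bt_k / (1 + (wavelength_um * 1e-6 * bt_k / rho) * math.log(emissivity))

# For Landsat 8, Red = band 4, NIR = band 5, SWIR1 = band 6.
veg = ndvi(nir=0.50, red=0.08)     # dense vegetation -> high NDVI (~0.72)
built = ndbi(swir=0.30, nir=0.20)  # built-up surface -> positive NDBI (0.20)
```

High NDVI and low NDBI mark vegetated pixels, and vice versa for built-up areas, which is what makes the two indices natural covariates for LST.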
Abstract
The research compares two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the Downhill Simplex algorithm. Two data cases were considered: the first assumed the original (uncontaminated) data, while the second assumed contaminated data. Simulation experiments were conducted for different sample sizes, initial parameter values, and levels of contamination. The Downhill Simplex algorithm was found to be the best method for estimating the parameters, the probability function, and the reliability function of the compound distribution for both natural and contaminated data.
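The Downhill Simplex (Nelder-Mead) algorithm minimizes a function without derivatives by reflecting, expanding, contracting, and shrinking a simplex of n+1 points, which is why it can maximize an awkward likelihood directly. The sketch below is a minimal generic version demonstrated on a simple quadratic, not the study's estimator:

```python
def nelder_mead(f, x0, step=0.5, tol=1e-8, max_iter=1000):
    """Minimal Downhill Simplex (Nelder-Mead) minimizer."""
    n = len(x0)
    # Initial simplex: x0 plus one point perturbed along each axis
    simplex = [list(x0)] + [
        [xj + (step if j == i else 0.0) for j, xj in enumerate(x0)]
        for i in range(n)
    ]
    for _ in range(max_iter):
        simplex.sort(key=f)
        best, worst = simplex[0], simplex[-1]
        if abs(f(worst) - f(best)) < tol:
            break
        # Centroid of every point except the worst
        cen = [sum(p[i] for p in simplex[:-1]) / n for i in range(n)]
        refl = [c + (c - w) for c, w in zip(cen, worst)]
        if f(refl) < f(best):                      # very good: try expanding
            exp = [c + 2.0 * (c - w) for c, w in zip(cen, worst)]
            simplex[-1] = exp if f(exp) < f(refl) else refl
        elif f(refl) < f(simplex[-2]):             # decent: accept reflection
            simplex[-1] = refl
        else:                                      # poor: contract inward
            con = [c + 0.5 * (w - c) for c, w in zip(cen, worst)]
            if f(con) < f(worst):
                simplex[-1] = con
            else:                                  # last resort: shrink to best
                simplex = [best] + [
                    [(bi + pi) / 2.0 for bi, pi in zip(best, p)]
                    for p in simplex[1:]
                ]
    return min(simplex, key=f)

# Recover the minimum of a shifted paraboloid from a poor starting point
best = nelder_mead(lambda p: (p[0] - 1.0) ** 2 + (p[1] - 2.0) ** 2, [0.0, 0.0])
```

For parameter estimation, `f` would be the negative log-likelihood of the compound distribution; the method's tolerance of non-smooth objectives is plausibly why it outperformed direct maximum likelihood under contamination.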