Today, artificial intelligence has become one of the most important fields for creating intelligent computer programs that simulate the human mind. The goal of artificial intelligence in medicine is to assist doctors and health-care workers in diagnosing diseases and guiding clinical treatment, reducing the rate of medical error, and saving lives. The main and most widely used technologies are expert systems, machine learning, and big data. This article provides a brief overview of these three techniques to help readers understand them and their importance.
The development of information systems in recent years has contributed various methods of gathering information to evaluate IS performance. The most common approach used to collect information is the survey. This method, however, suffers from one major drawback: decision makers spend considerable time transferring data from survey sheets into analytical programs. This paper therefore proposes a method called the 'survey algorithm based on R programming language', or SABR, for transforming data from survey sheets inside the R environment by treating the arrangement of the data as a relational format. R and the relational data format provide an excellent opportunity to manage and analyse the accumulated data. Moreover, a survey system…
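The SABR method itself is implemented in R; as a language-agnostic sketch of the underlying idea, the snippet below normalises a hypothetical wide-format survey sheet (one row per respondent, one column per question) into a relational long format with one fact per row. The column names and answers are illustrative assumptions, not data from the paper.

```python
import csv
import io

# Hypothetical wide-format survey sheet: one row per respondent,
# one column per question.
sheet = """respondent,q1,q2
A,agree,5
B,disagree,3
"""

# Normalise into a relational (long) format: one (respondent,
# question, answer) fact per row, ready for grouped analysis.
rows = []
for rec in csv.DictReader(io.StringIO(sheet)):
    who = rec.pop("respondent")
    for question, answer in rec.items():
        rows.append((who, question, answer))

# rows == [('A', 'q1', 'agree'), ('A', 'q2', '5'),
#          ('B', 'q1', 'disagree'), ('B', 'q2', '3')]
```

Once the data are in this shape, aggregation by question or by respondent becomes a simple relational grouping operation, which is the opportunity the abstract attributes to the relational format.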
Convolutional neural networks (CNNs) are among the most widely used neural networks in various applications, including deep learning. In recent years, the continuing extension of CNNs into increasingly complicated domains has made their training more difficult. Thus, researchers have adopted optimized hybrid algorithms to address this problem. In this work, a novel chaotic black hole algorithm-based approach was created for training CNNs to optimize their performance by avoiding entrapment in local minima. The logistic chaotic map was used to initialize the population instead of the uniform distribution. The proposed training algorithm was developed on a benchmark problem for optical character recognition…
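The chaotic initialization the abstract describes can be sketched as follows: instead of drawing the initial population uniformly, successive iterates of the logistic map x_{n+1} = r·x_n·(1 − x_n) (with r = 4, the fully chaotic regime) are rescaled onto the search bounds. The population size, dimension, bounds, and seed below are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def logistic_chaotic_init(pop_size, dim, lb, ub, r=4.0, seed=0.7):
    """Initialize a population using the logistic chaotic map
    x_{n+1} = r * x_n * (1 - x_n), rescaled onto [lb, ub]."""
    chaos = np.empty((pop_size, dim))
    x = seed  # seed must avoid fixed points (0, 0.25, 0.5, 0.75, 1)
    for i in range(pop_size):
        for j in range(dim):
            x = r * x * (1.0 - x)  # chaotic iterate in (0, 1)
            chaos[i, j] = x
    return lb + chaos * (ub - lb)

pop = logistic_chaotic_init(pop_size=20, dim=5, lb=-1.0, ub=1.0)
```

Compared with uniform sampling, the chaotic sequence is deterministic yet non-repeating, which is the usual rationale for using it to spread the initial candidates over the search space.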
Background: Optimal root canal retreatment requires safe and efficient removal of filling material from the root canal. The aim of this in vitro study was to compare the efficacy of reciprocating and continuous motion of four retreatment systems in removing root canal filling material. Materials and Methods: Forty distal roots of mandibular first molars were used in this study. The roots were embedded in cold clear acrylic and instrumented using the crown-down technique with the rotary ProTaper system from size Sx to size F2. Instrumentation was done with copious irrigation of 2.5% sodium hypochlorite, and a 17% buffered solution of EDTA was used as the final irrigant, followed by distilled water. The roots were obturated with AH26 sealer and Prota…
Drones have become a focus of researchers' attention because they enter into many details of daily life. The Tri-copter was chosen because it combines the quadcopter's stability with rapid manoeuvrability. In this paper, the nonlinear Tri-copter model is fully derived and three controllers are applied: Proportional-Integral-Derivative (PID), Fractional-Order PID (FOPID), and Nonlinear PID (NLPID). The controllers' parameters were tuned using the Grey Wolf Optimization (GWO) algorithm, and the results were compared. The improvement rate of the nonlinear controller (NLPID) for the Tri-copter model compared with…
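As a minimal sketch of the first of the three controllers, the class below implements a discrete PID law u = Kp·e + Ki·∫e dt + Kd·de/dt and drives a simple integrator plant to a setpoint. The gains and the plant are illustrative placeholders, not the paper's GWO-tuned values or the Tri-copter dynamics.

```python
class PID:
    """Minimal discrete PID controller:
    u = Kp*e + Ki*integral(e) + Kd*d(e)/dt."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Illustrative closed loop: drive an integrator plant (x' = u) to x = 1.
# Gains are placeholders, not the paper's GWO-tuned values.
pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=0.01)
x = 0.0
for _ in range(2000):
    u = pid.step(setpoint=1.0, measurement=x)
    x += u * 0.01
```

FOPID generalises the integral and derivative terms to fractional orders, and NLPID replaces the linear gains with nonlinear functions of the error; both keep this same three-term structure.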
In this research, a comparison is made between the robust (M) estimators for the cubic smoothing splines technique, used to avoid the problem of non-normality or error contamination in the data, and the traditional estimation method for cubic smoothing splines, using two differentiation criteria (MADE, WASE) for different sample sizes and disparity levels. The goal is to estimate the time-varying coefficient functions for balanced longitudinal data, which consist of observations obtained from (n) independent subjects, each measured repeatedly at a group of specific time points (m), since the repeated measurements within subjects are almost connected an…
This research studies dimension-reduction methods that overcome the curse of dimensionality when traditional methods fail to provide good parameter estimates, so the problem must be dealt with directly. Two approaches were used to handle high-dimensional data: the non-classical sliced inverse regression (SIR) method together with the proposed weighted standard SIR (WSIR) method, and principal component analysis (PCA), the general method used for dimension reduction. SIR and PCA are based on linear combinations of a subset of the original explanatory variables, which may suffer from the problem of heterogeneity and the problem of linear…
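The PCA baseline mentioned in the abstract can be sketched in a few lines: centre the data matrix, take its SVD, and project onto the top-k right singular vectors. The data here are random and purely illustrative; the paper's SIR/WSIR estimators are supervised and are not reproduced by this sketch.

```python
import numpy as np

def pca_reduce(X, k):
    """Reduce X (n samples x p features) to its top-k principal
    components via SVD of the centred data matrix."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T  # scores on the first k components

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
Z = pca_reduce(X, 3)  # 100 samples in a 3-dimensional subspace
```

Unlike PCA, which ignores the response, SIR slices the response variable and uses the slice means of the predictors to find the reduction directions, which is why the abstract treats them as competing approaches.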
Canonical correlation analysis is one of the common methods for analyzing data and understanding the relationship between two sets of variables, as it depends on analyzing the variance-covariance matrix or the correlation matrix. Researchers use many methods to estimate the canonical correlation (CC); some are biased by outliers, while others are resistant to them. In addition, there are criteria for checking the efficiency of the estimation methods.
In our research, we deal with robust estimation methods that depend on the correlation matrix in the analysis process to obtain a robust canonical correlation coefficient, such as the Biwe…
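The dependence on the correlation matrix can be made concrete: the canonical correlations are the singular values of Rxx^{-1/2} · Rxy · Ryy^{-1/2}, computed from the blocks of the joint correlation matrix. The sketch below uses the ordinary sample correlation matrix; a robust variant of the kind the abstract describes would plug a robust correlation estimate (e.g. a biweight-based one) into the same formula. The simulated data are illustrative assumptions.

```python
import numpy as np

def inv_sqrt(M):
    """Symmetric inverse square root via eigendecomposition."""
    w, V = np.linalg.eigh(M)
    return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

def canonical_correlations(R, p):
    """Canonical correlations of two variable sets from a joint
    correlation matrix R, where the first p variables form set X."""
    Rxx, Ryy = R[:p, :p], R[p:, p:]
    Rxy = R[:p, p:]
    K = inv_sqrt(Rxx) @ Rxy @ inv_sqrt(Ryy)
    return np.linalg.svd(K, compute_uv=False)  # sorted descending

# Illustrative data: Y is a noisy copy of the first two X variables.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
Y = X[:, :2] + 0.1 * rng.normal(size=(200, 2))
R = np.corrcoef(np.hstack([X, Y]), rowvar=False)
rho = canonical_correlations(R, p=3)
```

Because the whole computation flows through R, replacing the classical correlation estimate with a robust one immediately yields a robust canonical correlation coefficient, which is the strategy the abstract pursues.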
Some experiments need an assessment of their usefulness in order to decide whether to continue providing them. This is done through the fuzzy regression discontinuity model, where the Epanechnikov kernel and the triangular kernel were used to estimate the model by generating data from a Monte Carlo experiment and comparing the results obtained. It was found that the Epanechnikov kernel has the smallest mean squared error.
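For reference, the two kernels being compared have simple closed forms on |u| ≤ 1; this sketch only defines the weight functions themselves, not the paper's fuzzy regression discontinuity estimator.

```python
def epanechnikov(u):
    """Epanechnikov kernel: K(u) = 0.75 * (1 - u^2) for |u| <= 1, else 0."""
    return 0.75 * (1.0 - u * u) if abs(u) <= 1.0 else 0.0

def triangular(u):
    """Triangular kernel: K(u) = 1 - |u| for |u| <= 1, else 0."""
    return 1.0 - abs(u) if abs(u) <= 1.0 else 0.0

# At the cutoff (u = 0) the kernels take their maximum weights:
# epanechnikov(0) -> 0.75, triangular(0) -> 1.0
```

In local regression near the discontinuity, the kernel determines how quickly observations lose weight as they move away from the cutoff, which is why the choice of kernel affects the mean squared error.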
The evaluation of the quality of construction projects is a topic that has become necessary because of the absence of quantitative standards for measuring control works and of quality-evaluation standards in construction projects. At present, evaluation depends on the experience of the workers, which leads to apparent differences in the results.
The idea of this research is to set standards for evaluating the quality of projects in a special system based on a quantitative scale rather than qualitative description, and to prepare an expert system using "Crystal" that applies this system, enabling engineers to evaluate the quality of their projects easily and more accurately.
COVID-19 is a disease that has spread abnormally across over 170 nations worldwide. The number of infected people (either sick or dead) has been growing at a worrying rate in virtually all the affected countries. Forecasting procedures can be applied to help in designing good plans and making proactive decisions. These procedures measure past conditions, allowing good forecasts of the state that will arise in the future. These predictions may help to counter likely threats and consequences. Forecasting procedures play a very important role in producing precise predictions. This case study used two models in order to identify the optimal approach by comparing their outputs. This study introduce…