Data generated from modern applications and the internet in healthcare is extensive and rapidly expanding. A significant success factor for any application is therefore the ability to understand and extract meaningful information using digital analytics tools. These tools improve an application's performance and address the challenges of producing highly consistent, logical, and information-rich summaries. This paper has three main objectives. First, it presents several analytics methodologies that help analyze datasets and extract useful information from them, as preprocessing steps for any classification model, in order to determine each dataset's characteristics. Second, it provides a comparative study of classification algorithms, testing 12 different classifiers on two international datasets to give an accurate indicator of their efficiency and of the future possibility of combining efficient algorithms to achieve better results. Finally, it builds, for the first time in Iraq, several CBC datasets from different hospitals to help detect blood diseases. The outcome of the analysis step helps researchers select the best system structure according to the characteristics of each dataset, yielding more organized and thorough results. According to the test results, four algorithms achieved the best accuracy (LogitBoost, Random Forest, XGBoost, Multilayer Perceptron), and LogitBoost, which achieved the highest accuracy, was then used to classify the new datasets. As a future direction, this paper also investigates the possibility of combining algorithms to exploit their benefits and overcome their disadvantages.
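The comparative study described above can be sketched with a standard cross-validation loop. This is an illustrative example, not the paper's pipeline: the dataset is synthetic, and only three of the paper's twelve classifiers (logistic regression stands in for LogitBoost's boosted-logistic family alongside Random Forest and a multilayer perceptron) are shown.

```python
# Hypothetical sketch of a multi-classifier comparison by cross-validated
# accuracy; dataset and classifier choices are illustrative, not the paper's.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for a CBC-style tabular dataset
X, y = make_classification(n_samples=300, n_features=10, random_state=0)

classifiers = {
    "LogisticRegression": LogisticRegression(max_iter=1000),
    "RandomForest": RandomForestClassifier(n_estimators=100, random_state=0),
    "MLP": MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0),
}

# Mean 5-fold cross-validated accuracy per classifier
scores = {name: cross_val_score(clf, X, y, cv=5).mean()
          for name, clf in classifiers.items()}
for name, acc in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {acc:.3f}")
```

Ranking classifiers by the same cross-validated metric on each dataset is what makes a comparison like the paper's twelve-way study fair.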
Accounting disclosure for non-current tangible assets is necessary if decision makers in the economic unit are to rely on accounting information. Two international accounting standards, IAS 16 and IAS 36, were issued to provide the foundations for the recognition, measurement, and disclosure of non-current tangible assets. IAS 16 permits a revaluation approach to asset measurement because accounting information produced under the historical-cost approach has become inadequate amid continuing technical developments that leave clear effects on non-current tangible assets. IAS 36, in turn, stresses the importance of accounting for the impairment
The importance of accounting standards lies in their role as instructor and advisor to the accountant in performing his work. Each environment has a group of political, social, economic, and cultural factors that distinguish it from other environments. In order to achieve their aim of producing accounting information that helps in making decisions at different levels, accounting standards should be established in a form that harmonizes with the environment in which they are applied. Establishing international accounting standards follows the same direction, and some states with influence on the International Accounting Standards Committee have shaped those standards. Because of the big changes that happened in the inte
The study was carried out in order to evaluate, clinically and in the laboratory, cachectic animals suffering from anemia. Fifty cows and calves were examined. The study included clinical, hematological, and biochemical tests for accurate diagnosis of cachexia in cows and calves. Blood smears were examined for blood parasites and feces for gastrointestinal parasites, and different parameters were applied for the classification of cachexia, depending on bony projections, especially the ribs and pelvis, and on generalized muscular atrophy. The study revealed an incidence of cachexia and anemia associated with blood parasites, including Theileria and Anaplasma, and with gastrointestinal parasites; ten cases showed foreign body syndrome while another ten wer
Binary logistic regression and the linear discriminant function are among the most important statistical methods used for classification and prediction when the data are binary (0, 1); ordinary regression cannot be used, so one resorts to binary logistic regression or the linear discriminant function in the two-group case. When a multicollinearity problem exists between the data (the data contain high correlations), binary logistic regression and the linear discriminant function can no longer be used; to solve this problem, we resort to partial least squares regression.
In this research, the comparison between binary lo
Abstract:
The great expansion of teaching skills requires finding ways and methods to help teachers acquire experiences of all kinds. The researcher found in the teaching skills of teachers in public and private schools a fertile field for a study that enables the measurement of these skills. Thus, the study aims to identify teachers' lesson-teaching skills and the differences in those skills according to years of service, according to whether teachers are specialized or non-specialized, and according to whether the school is public or private. The
In regression testing, test case prioritization (TCP) is a technique for arranging all the available test cases. TCP techniques can improve fault-detection performance, which is measured by the average percentage of faults detected (APFD). History-based TCP is one of the TCP techniques; it considers past data to prioritize test cases. Allocating equal priority to test cases is a common problem for most TCP techniques, but it has not been explored in history-based TCP techniques. To solve this problem in regression testing, most researchers resort to random sorting of test cases. This study aims to investigate equal priority in history-based TCP techniques. The first objective is to implement
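The APFD metric the abstract relies on can be computed directly from a test ordering and a fault-detection matrix: APFD = 1 − (ΣTFᵢ)/(n·m) + 1/(2n), where TFᵢ is the position of the first test that reveals fault i, n is the number of tests, and m the number of faults. The fault/test data below are invented for illustration.

```python
# Minimal sketch of the APFD metric: orderings that expose faults earlier
# score higher. Test ids and fault data here are hypothetical.
def apfd(order, faults):
    """order: list of test ids in priority order.
    faults: dict mapping each fault to the set of test ids that detect it."""
    n, m = len(order), len(faults)
    pos = {t: i + 1 for i, t in enumerate(order)}  # 1-based positions
    # TF_i: position of the first test in the ordering revealing fault i
    tf = [min(pos[t] for t in detecting) for detecting in faults.values()]
    return 1 - sum(tf) / (n * m) + 1 / (2 * n)

faults = {"f1": {"t3"}, "f2": {"t1", "t4"}, "f3": {"t2"}}
good = apfd(["t1", "t2", "t3", "t4"], faults)  # faults hit at positions 3, 1, 2
bad = apfd(["t4", "t1", "t2", "t3"], faults)   # f1 not hit until position 4
print(good, bad)   # 0.625 0.4583...
```

Note that when several test cases tie on priority, different tie-breaks yield different APFD values, which is exactly why the equal-priority problem the abstract raises matters.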
The expanding use of multi-processor supercomputers has had a significant impact on the speed and size of many problems. The adoption of the standard Message Passing Interface (MPI) protocol has enabled programmers to write portable and efficient code across a wide variety of parallel architectures. Sorting is one of the most common operations performed by a computer. Because sorted data are easier to manipulate than randomly ordered data, many algorithms require sorted data. Sorting is of additional importance to parallel computing because of its close relation to the task of routing data among processes, which is an essential part of many parallel algorithms. In this paper, sequential sorting algorithms, the parallel implementation of man
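The scatter/local-sort/merge pattern behind many MPI sorts can be sketched in plain Python (no MPI runtime): each simulated rank sorts its local chunk, and the sorted runs are then merged, which is the step where real implementations route data among processes. This is a toy stand-in, not MPI code.

```python
# Toy stand-in (plain Python, not MPI) for a parallel sort's three phases:
# scatter the data, sort locally on each "rank", then merge the sorted runs.
import heapq
import random

random.seed(0)
data = [random.randint(0, 99) for _ in range(16)]

# Scatter: partition the data across 4 simulated processes
chunks = [data[i::4] for i in range(4)]
# Local sort: each "process" sorts its own chunk independently
locally_sorted = [sorted(c) for c in chunks]
# Gather + merge: combine sorted runs (the communication-heavy step in MPI)
result = list(heapq.merge(*locally_sorted))
print(result == sorted(data))   # True
```

In a real MPI program the scatter and merge phases become `MPI_Scatter`/`MPI_Gather` (or pairwise exchanges), and the merge is where inter-process routing dominates the cost.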
The General Directorate of Surveying is considered one of the most important sources of maps in Iraq; it has produced digital maps for the whole of Iraq over the last six years. These maps are produced from different data sources with unknown accuracy; therefore, their quality needs to be assessed. The main aim of this study is to evaluate the positional accuracy of the digital maps produced by the General Directorate of Surveying. Two study areas were selected, AL-Rusafa and AL-Karkh in Baghdad / Iraq, with areas of 172.826 and 135.106 square kilometers, respectively. Different statistical analyses were conducted to calculate the elements of positional accuracy assessment (mean µ, root mean square error RMSE, minimum and maxi
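The positional-accuracy statistics named above (mean, RMSE, minimum, maximum) are computed from the horizontal error at each checkpoint, i.e. the distance between a point's map position and its reference position. The checkpoint coordinates below are made up for illustration; they are not the study's data.

```python
# Sketch of positional-accuracy statistics on hypothetical checkpoints:
# each tuple is (x_map, y_map, x_ref, y_ref) in metres.
import math

points = [(10.0, 20.0, 10.4, 19.7),
          (55.2, 31.1, 55.0, 31.5),
          (70.3, 12.8, 69.9, 12.6)]

# Horizontal error at each checkpoint: Euclidean map-to-reference distance
errors = [math.hypot(xm - xr, ym - yr) for xm, ym, xr, yr in points]
mean_err = sum(errors) / len(errors)
rmse = math.sqrt(sum(e * e for e in errors) / len(errors))
print(f"mean={mean_err:.3f} m, RMSE={rmse:.3f} m, "
      f"min={min(errors):.3f} m, max={max(errors):.3f} m")
```

RMSE is always at least the mean error (it weights large errors more), which is why accuracy standards such as NSSDA report RMSE rather than the mean alone.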
Classification of imbalanced data is an important issue. Many algorithms have been developed for classification, such as back-propagation (BP) neural networks, decision trees, and Bayesian networks, and they have been used repeatedly in many fields. These algorithms suffer from the problem of imbalanced data, where some classes have many more instances than others. Imbalanced data result in poor performance and a bias toward one class at the expense of the others. In this paper, we propose three techniques based on the over-sampling (O.S.) approach for processing an imbalanced dataset, redistributing it, and converting it into a balanced dataset. These techniques are Improved Synthetic Minority Over-Sampling Technique (Improved SMOTE), Border
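The SMOTE idea the abstract builds on can be sketched in a few lines: synthesize new minority-class points by interpolating between a minority sample and a minority-class neighbour. Real SMOTE interpolates toward one of the k nearest neighbours; this toy version, written for brevity, always uses the single nearest neighbour, and the sample points are invented.

```python
# Bare-bones sketch of SMOTE-style over-sampling (nearest neighbour only,
# not the full k-NN variant); minority points here are hypothetical.
import math
import random

def smote(minority, n_new, rng):
    """Generate n_new synthetic points between minority samples."""
    synthetic = []
    for _ in range(n_new):
        a = rng.choice(minority)
        # Nearest minority neighbour of a (excluding a itself)
        b = min((p for p in minority if p is not a),
                key=lambda p: math.dist(a, p))
        t = rng.random()  # interpolation factor in [0, 1)
        synthetic.append(tuple(ai + t * (bi - ai) for ai, bi in zip(a, b)))
    return synthetic

rng = random.Random(0)
minority = [(1.0, 1.0), (1.2, 0.9), (0.8, 1.1)]
new_points = smote(minority, 5, rng)
print(len(new_points))   # 5 synthetic minority samples
```

Because every synthetic point lies on a segment between two existing minority samples, the oversampled class stays inside the minority region instead of merely duplicating points, which is what distinguishes SMOTE from naive random over-sampling.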