Support vector machine (SVM) is a popular supervised learning algorithm based on margin maximization. It has a high training cost and does not scale well to a large number of data points. We propose a multiresolution algorithm MRH-SVM that trains SVM on a hierarchical data aggregation structure, which also serves as a common data input to other learning algorithms. The proposed algorithm learns SVM models using high-level data aggregates and only visits data aggregates at more detailed levels where support vectors reside. In addition to performance improvements, the algorithm has advantages such as the ability to handle data streams and datasets with imbalanced classes. Experimental results show significant performance improvements in comparison with existing SVM algorithms.
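The core multiresolution idea in this abstract — train at a coarse level, then drill down only into aggregates that may contain support vectors — can be sketched as follows. This is an illustrative toy, not the authors' MRH-SVM: the aggregate layout, the ball-shaped aggregate regions, and the fixed linear separator are all assumptions made for the example.

```python
# Toy sketch of margin-band drill-down: given a coarse linear model w.x + b,
# expand only the aggregates whose region intersects the margin band
# |w.x + b| <= 1. Aggregates and the separator are illustrative assumptions,
# not the paper's MRH-SVM implementation.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def needs_refinement(centroid, radius, w, b):
    """An aggregate may contain support vectors iff its ball
    (centroid, radius) intersects the margin band |w.x + b| <= 1."""
    norm_w = sum(c * c for c in w) ** 0.5
    dist = abs(dot(w, centroid) + b) / norm_w   # distance of centroid to hyperplane
    return dist - radius <= 1.0 / norm_w        # margin half-width is 1/||w||

# Toy hierarchy: coarse aggregates as (centroid, radius, finer-level points).
aggregates = [
    ((5.0, 5.0), 0.5, [(4.8, 5.1), (5.2, 4.9)]),   # far from the boundary
    ((0.2, 0.0), 0.6, [(0.1, 0.3), (0.4, -0.2)]),  # straddles the boundary
]
w, b = (1.0, 0.0), 0.0   # assumed coarse separator: the line x1 = 0

visited_points = []
for centroid, radius, children in aggregates:
    if needs_refinement(centroid, radius, w, b):
        visited_points.extend(children)   # drill down to the finer level
```

Only the near-boundary aggregate is expanded; the far aggregate is handled entirely at the coarse level, which is where the performance gain comes from.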
Graphic privacy is one of the most important characteristics of any design achievement, including graphic products with their many forms of data. The present research therefore investigates the graphic privacy of vector graphics design, together with the technical descriptions and concepts associated with it, and the possibility of realizing it at its best in the formal structure of children's publications. The first chapter presents the research problem, formulated in the following question: What is the graphic privacy in the design of vector graphics in children's publications?
This paper presents a comparison between an optimized unscented Kalman filter (UKF) and an optimized extended Kalman filter (EKF) for a sensorless direct field orientation control induction motor (DFOCIM) drive. The high performance of the UKF and EKF depends on the accurate selection of the state and noise covariance matrices. For this purpose, a multi-objective genetic algorithm is used to find the optimal values of the state and noise covariance matrices. The main objectives of the genetic algorithm to be minimized are the mean square errors (MSE) between the actual and estimated speed, current, and flux. Simulation results show that the optimal state and noise covariance matrices can improve the estimation of speed, current, t…
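The fitness evaluated by the genetic algorithm in this abstract — the mean square error between actual and estimated signals, combined across speed, current, and flux — can be sketched as below. The signal values are placeholders for illustration, not simulation output, and the equal-weight sum is an assumption; the paper's multi-objective formulation may weight or rank the objectives differently.

```python
# Hedged sketch of an MSE-based fitness for covariance-matrix tuning:
# lower is better. Signal values are made-up placeholders.

def mse(actual, estimated):
    return sum((a - e) ** 2 for a, e in zip(actual, estimated)) / len(actual)

def fitness(signals):
    """signals: dict name -> (actual, estimated); equal-weight sum of MSEs."""
    return sum(mse(a, e) for a, e in signals.values())

signals = {
    "speed":   ([100.0, 102.0], [99.0, 102.0]),
    "current": ([5.0, 5.5],     [5.0, 5.5]),
    "flux":    ([0.9, 0.9],     [0.8, 0.9]),
}
score = fitness(signals)   # the GA would minimize this over covariance entries
```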
Image compression is a serious issue in computer storage and transmission; it simply makes efficient use of the redundancy embedded within an image itself and, in addition, may exploit the limitations of human vision or perception to discard imperceivable information. Polynomial coding is a modern image compression technique based on a modelling concept that effectively removes the spatial redundancy embedded within the image; it is composed of two parts, the mathematical model and the residual. In this paper, a two-stage technique is proposed: the first stage utilizes the lossy predictor model along with multiresolution base and thresholding techniques, and the latter stage incorporates the near lossless com…
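The model-plus-residual idea with thresholding described in this abstract can be sketched in one dimension. This is a hedged illustration, not the paper's polynomial codec: the predictor here is simply the previously reconstructed sample, and the threshold zeroes out small residuals, giving a near-lossless scheme whose per-sample error is bounded by the threshold.

```python
# Closed-loop predictive coding sketch: predict each sample from the
# decoder's reconstruction, keep the residual, and drop residuals below a
# threshold (near-lossless: reconstruction error <= threshold per sample).

def encode(samples, threshold):
    residuals, prev = [], 0
    for s in samples:
        r = s - prev                            # prediction error
        q = 0 if abs(r) <= threshold else r    # drop imperceivable residuals
        residuals.append(q)
        prev = prev + q                         # track decoder's reconstruction
    return residuals

def decode(residuals):
    out, prev = [], 0
    for r in residuals:
        prev = prev + r
        out.append(prev)
    return out

samples = [10, 11, 11, 15, 16]
reconstructed = decode(encode(samples, threshold=1))
```

Predicting from the reconstructed (not the original) previous sample is what keeps the error bounded; an open-loop predictor would let thresholding errors accumulate.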
The General Directorate of Surveying is considered one of the most important sources of maps in Iraq; it has produced digital maps for the whole of Iraq over the last six years. These maps are produced from different data sources with unknown accuracy; therefore, their quality needs to be assessed. The main aim of this study is to evaluate the positional accuracy of the digital maps produced by the General Directorate of Surveying. Two study areas were selected, AL-Rusafa and AL-Karkh in Baghdad / Iraq, with areas of 172.826 and 135.106 square kilometers, respectively. Different statistical analyses were conducted to calculate the elements of positional accuracy assessment (mean µ, root mean square error RMSE, minimum and maximum)…
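The positional-accuracy statistics named in this abstract (mean error, RMSE, minimum, and maximum) can be computed from checkpoint coordinates as below. The sample coordinates are invented for illustration and are not taken from the study.

```python
# Minimal sketch of positional accuracy assessment: Euclidean error at each
# checkpoint between map and reference coordinates, then summary statistics.
import math

def positional_errors(map_pts, reference_pts):
    return [math.hypot(xm - xr, ym - yr)
            for (xm, ym), (xr, yr) in zip(map_pts, reference_pts)]

def stats(errors):
    mean = sum(errors) / len(errors)
    rmse = math.sqrt(sum(e * e for e in errors) / len(errors))
    return mean, rmse, min(errors), max(errors)

# Hypothetical checkpoints: digitized map position vs. surveyed reference.
map_pts = [(100.0, 200.0), (300.0, 400.0), (503.0, 604.0)]
ref_pts = [(101.0, 200.0), (300.0, 402.0), (500.0, 600.0)]
errs = positional_errors(map_pts, ref_pts)
mean, rmse, emin, emax = stats(errs)
```

Note that RMSE weights large errors more heavily than the mean, which is why both are usually reported together.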
Classification of imbalanced data is an important issue. Many algorithms have been developed for classification, such as Back Propagation (BP) neural networks, decision trees, Bayesian networks, etc., and have been applied in many fields. These algorithms suffer from the problem of imbalanced data, in which some classes have far more instances than others. Imbalanced data result in poor performance and a bias toward one class at the expense of the others. In this paper, we propose three techniques based on Over-Sampling (O.S.) for processing an imbalanced dataset, redistributing it and converting it into a balanced dataset. These techniques are Improved Synthetic Minority Over-Sampling Technique (Improved SMOTE), Border…
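The over-sampling family this abstract builds on can be sketched as below. This is a hedged illustration of the classic SMOTE idea, not the paper's Improved SMOTE: a synthetic minority sample is placed at a random point on the segment between a minority instance and another minority instance (a stand-in here for a true k-nearest-neighbor of that instance).

```python
# SMOTE-style over-sampling sketch: interpolate between minority instances.
# The neighbor choice is simplified (random minority point, not true k-NN).
import random

def smote_like(minority, n_synthetic, seed=0):
    rng = random.Random(seed)
    synthetic = []
    for _ in range(n_synthetic):
        a = rng.choice(minority)
        b = rng.choice(minority)   # stand-in for a k-NN neighbor of a
        t = rng.random()           # interpolation factor in [0, 1)
        synthetic.append(tuple(ai + t * (bi - ai) for ai, bi in zip(a, b)))
    return synthetic

minority = [(1.0, 1.0), (1.2, 0.9), (0.8, 1.1)]
new_samples = smote_like(minority, n_synthetic=5)
```

Because each synthetic point lies between two existing minority points, the new samples stay inside the minority region rather than duplicating instances, which is what distinguishes SMOTE from plain random over-sampling.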
Computer-aided diagnosis (CAD) has proved to be an effective and accurate method for diagnostic prediction over the years. This article focuses on the development of an automated CAD system with the intent of performing diagnosis as accurately as possible. Deep learning methods have been able to produce impressive results on medical image datasets. This study employs deep learning methods in conjunction with meta-heuristic algorithms and supervised machine-learning algorithms to perform an accurate diagnosis. Pre-trained convolutional neural networks (CNNs) or auto-encoders are used for feature extraction, whereas feature selection is performed using an ant colony optimization (ACO) algorithm. Ant colony optimization helps to search for the bes…
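The ACO feature-selection loop mentioned in this abstract can be sketched as a toy: ants sample feature subsets with probability proportional to pheromone, the best subset found reinforces its features, and pheromone evaporates each iteration. The fitness function here is a placeholder that simply prefers two designated features; a real system would score each subset with a classifier, and the paper's exact ACO variant may differ.

```python
# Toy ACO feature-selection sketch: pheromone-weighted subset sampling,
# reinforcement of the best subset, and evaporation. The fitness is a
# stand-in (overlap with a fixed "good" set), not a classifier score.
import random

def aco_select(n_features, n_ants=10, n_iters=20, subset_size=2, seed=1):
    rng = random.Random(seed)
    pheromone = [1.0] * n_features
    best, best_fit = None, -1.0
    for _ in range(n_iters):
        for _ in range(n_ants):
            subset = set()
            while len(subset) < subset_size:
                # roulette-wheel choice weighted by pheromone
                total = sum(pheromone)
                r, acc = rng.random() * total, 0.0
                for f, tau in enumerate(pheromone):
                    acc += tau
                    if r <= acc:
                        subset.add(f)
                        break
            fit = len(subset & {0, 2})        # stand-in subset evaluation
            if fit > best_fit:
                best, best_fit = set(subset), fit
        for f in best:                         # reinforce the best subset
            pheromone[f] += 0.5
        pheromone = [0.9 * tau for tau in pheromone]   # evaporation
    return sorted(best)

selected = aco_select(n_features=5)
```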
The goal of this research is to introduce the concepts of Large-small submodules and Large-hollow modules and to establish some of their properties. A proper submodule N of an R-module M is said to be a Large-small submodule if, whenever N + K = M for a submodule K of M, then K is an essential submodule of M (K ≤e M). An R-module M is called a Large-hollow module if every proper submodule of M is a Large-small submodule of M.
Big data analysis is essential for modern applications in areas such as healthcare, assistive technology, intelligent transportation, and environment and climate monitoring. Traditional algorithms in data mining and machine learning do not scale well with data size. Mining and learning from big data require time- and memory-efficient techniques, albeit at the cost of a possible loss in accuracy. We have developed a data aggregation structure to summarize data with a large number of instances and data generated from multiple sources. Data are aggregated at multiple resolutions, and the resolution provides a trade-off between efficiency and accuracy. The structure is built once, updated incrementally, and serves as a common data input for multiple mining an…
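An incrementally updatable aggregate of the kind this abstract describes can be sketched as follows. The statistics kept here (count, per-dimension sum, and bounding box) are illustrative assumptions; the authors' structure may maintain different summaries, but the key property shown — O(d) incremental updates with no raw data retained — is the one the abstract relies on.

```python
# Illustrative incremental aggregate: constant-size summary per node that
# supports streaming updates, as a data-stream-friendly structure requires.

class Aggregate:
    def __init__(self):
        self.count = 0
        self.sums = None   # per-dimension sum, for the mean
        self.lo = None     # bounding-box lower corner
        self.hi = None     # bounding-box upper corner

    def add(self, point):
        """Incremental update: O(d) per point, no raw data retained."""
        if self.count == 0:
            self.sums = list(point)
            self.lo = list(point)
            self.hi = list(point)
        else:
            for i, x in enumerate(point):
                self.sums[i] += x
                self.lo[i] = min(self.lo[i], x)
                self.hi[i] = max(self.hi[i], x)
        self.count += 1

    def mean(self):
        return [s / self.count for s in self.sums]

agg = Aggregate()
for p in [(1.0, 2.0), (3.0, 6.0)]:
    agg.add(p)
```

Because each node stores only a fixed-size summary, a hierarchy of such aggregates can absorb a data stream one point at a time, which is how the structure supports incremental updates.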
A business incubator is a new and effective mechanism for developing small projects through its introduction of a new, integrated system of services; it aims at supporting and developing the creation of new projects. A large number of interrelated factors enter into the preparation of such projects: the organization of the incubator, the market available for the projects attached to it, and the work programs to be implemented. Small projects represent more than 98% of the total business institutions in the world, and they have become responsible for a share reaching half of the national output of those countries. These projects have created between 40 and 80% of the job opportunities availabl…