CNC machines are widely used in production because they produce identical parts in minimal time, at high speed, and with minimal error. A control system was designed, implemented, and tested to control a laboratory CNC milling machine with three axes, each driven by a stepper motor. The control system has two parts, hardware and software. The hardware part uses a PC (acting as the controller) connected to the CNC machine through its parallel port via a designed interface circuit. The software part comprises the algorithms needed to control the CNC. The part to be machined is drawn in drawing software such as AutoCAD or 3D MAX and saved in the well-known DXF file format; the CNC operator then feeds that file to the CNC controller so the part can be machined. Using the developed algorithms, the controller reads the DXF file fed to the machine, extracts line, circle, and arc shapes from it, and generates commands that move the CNC machine axes so these shapes can be machined.
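For illustration, a minimal sketch of the DXF entity-extraction step is given below. It is not the authors' implementation; it assumes an ASCII DXF file and reads only the group codes relevant to lines, circles, and arcs (10/20 and 11/21 for points, 40 for a radius, 50/51 for arc angles):

```python
def parse_dxf_entities(text):
    """Minimal sketch: extract LINE, CIRCLE, and ARC entities from an
    ASCII DXF file. A DXF file is a sequence of (group code, value)
    line pairs; group code 0 starts a new entity."""
    lines = [ln.strip() for ln in text.splitlines()]
    pairs = list(zip(lines[::2], lines[1::2]))  # (code, value) pairs
    entities, current = [], None
    for code, value in pairs:
        if code == "0":
            # close the previous entity if it is a shape we machine
            if current and current["type"] in ("LINE", "CIRCLE", "ARC"):
                entities.append(current)
            current = {"type": value}
        elif current is not None and code in ("10", "20", "11", "21",
                                              "40", "50", "51"):
            current[code] = float(value)  # coordinate/radius/angle fields
    if current and current["type"] in ("LINE", "CIRCLE", "ARC"):
        entities.append(current)
    return entities
```

Each returned entity dictionary can then be translated into axis-movement commands (linear interpolation for LINE, circular interpolation for CIRCLE and ARC).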
Predicting the network traffic of web pages is an area that has received increasing attention in recent years. Modeling traffic helps in devising strategies for distributing network load, identifying user behavior and malicious traffic, and forecasting future trends. Many statistical and intelligent methods have been studied for predicting web traffic from network-traffic time series. In this paper, the use of machine learning algorithms to model Wikipedia traffic using Google's time-series dataset is studied. Two datasets were used for the time series, data generalization, building a set of machine learning models (XGBoost, Logistic Regression, Linear Regression, and Random Forest), and comparing the performance of the models using the Symmetric Mean Absolute Percentage Error (SMAPE) and
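The SMAPE metric mentioned above can be computed as follows (a minimal sketch; the paper's exact variant is not specified, so the common form with a per-term average in the denominator is assumed, with zero contribution when both values are zero):

```python
def smape(actual, forecast):
    """Symmetric Mean Absolute Percentage Error, in percent.
    Terms where both actual and forecast are zero contribute 0."""
    total = 0.0
    for a, f in zip(actual, forecast):
        denom = (abs(a) + abs(f)) / 2.0
        total += abs(f - a) / denom if denom else 0.0
    return 100.0 * total / len(actual)
```

Lower SMAPE means a better forecast; a perfect forecast scores 0.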
In this paper, a handwritten digit classification system is proposed based on the Discrete Wavelet Transform and Spiking Neural Network. The system consists of three stages. The first stage is for preprocessing the data and the second stage is for feature extraction, which is based on the Discrete Wavelet Transform (DWT). The third stage is for classification and is based on a Spiking Neural Network (SNN). To evaluate the system, two standard databases are used: the MADBase database and the MNIST database. The proposed system achieved a high classification accuracy rate, with 99.1% for the MADBase database and 99.9% for the MNIST database.
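As a rough illustration of the feature-extraction stage, one level of a 2D Haar DWT can be sketched as below (an averaging variant in pure Python; the paper's wavelet, normalization, and decomposition depth are not specified):

```python
def haar_dwt2(image):
    """One level of a 2D Haar-style DWT on a 2D list with even
    dimensions. Returns the four sub-bands (LL, LH, HL, HH); the LL
    (approximation) band is typically kept as a compact feature map."""
    h, w = len(image), len(image[0])
    LL = [[0.0] * (w // 2) for _ in range(h // 2)]
    LH = [[0.0] * (w // 2) for _ in range(h // 2)]
    HL = [[0.0] * (w // 2) for _ in range(h // 2)]
    HH = [[0.0] * (w // 2) for _ in range(h // 2)]
    for i in range(0, h, 2):
        for j in range(0, w, 2):
            a, b = image[i][j], image[i][j + 1]
            c, d = image[i + 1][j], image[i + 1][j + 1]
            LL[i // 2][j // 2] = (a + b + c + d) / 4.0  # approximation
            LH[i // 2][j // 2] = (a - b + c - d) / 4.0  # horizontal detail
            HL[i // 2][j // 2] = (a + b - c - d) / 4.0  # vertical detail
            HH[i // 2][j // 2] = (a - b - c + d) / 4.0  # diagonal detail
    return LL, LH, HL, HH
```

Flattening the LL band (possibly after repeated decomposition) gives the feature vector fed to the SNN classifier.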
Skin detection is the classification of image pixels into two classes: skin and non-skin. Skin color is affected by many factors, such as people's race, age, and gender. Some previous researchers attempted to address these issues by applying a threshold over certain ranges of skin color. Although this is fast and simple to implement, it does not achieve high detection accuracy across all human skin colors. This paper suggests an improved ID3 (Iterative Dichotomiser) algorithm to enhance the performance of skin detection. Three color spaces have been used on an RGB skin dataset obtained from the machine learning repository of the University of California, Irvine (UCI): the RGB color space, the HSV color space
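The splitting criterion underlying ID3 can be sketched as follows (a generic entropy/information-gain computation, not the paper's improved variant):

```python
from math import log2
from collections import Counter

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr):
    """ID3 splitting criterion: the reduction in label entropy obtained
    by splitting the rows (dicts of attribute -> value) on `attr`."""
    n = len(labels)
    by_value = {}
    for row, lab in zip(rows, labels):
        by_value.setdefault(row[attr], []).append(lab)
    remainder = sum(len(part) / n * entropy(part)
                    for part in by_value.values())
    return entropy(labels) - remainder
```

ID3 builds the tree by repeatedly splitting on the attribute (here, a quantized color-channel value) with the highest information gain.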
Heart disease is a significant and impactful health condition that ranks as the leading cause of death in many countries. Clinical datasets are available to aid physicians in diagnosing cardiovascular diseases. However, with the rise of big data and large medical datasets, it has become increasingly challenging for medical practitioners to accurately predict heart disease, because the abundance of unrelated and redundant features increases computational complexity and reduces accuracy. This study therefore aims to identify the most discriminative features within high-dimensional datasets while minimizing complexity and improving accuracy through an Extra Tree-based feature selection technique. The study assesses the efficac
The Wisconsin Breast Cancer Dataset (WBCD) was employed to show the performance of Adaptive Resonance Theory (ART), specifically the supervised ART-1 Artificial Neural Network (ANN), in building a smart breast cancer diagnosis system. The network was fed with different learning parameters and sets. The best result was achieved when the model was trained with 50% of the data and tested with the remaining 50%. Classification accuracy was compared to other artificial intelligence algorithms, including a fuzzy classifier, MLP-ANN, and SVM. We achieved the highest accuracy despite such a low learning/testing ratio.
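For illustration, the resonance/vigilance mechanism at the core of ART-1 can be sketched as below. This is a minimal unsupervised version for binary patterns, with an assumed choice-function constant of 0.5; it only illustrates the mechanism and is not the supervised variant used in the study:

```python
def art1_train(patterns, vigilance=0.7):
    """Minimal unsupervised ART-1 sketch for binary patterns (lists of
    0/1 with at least one 1). Returns the learned category prototypes
    and the category index assigned to each pattern."""
    prototypes, assignments = [], []
    for p in patterns:
        placed = False
        # rank categories by the choice function |p AND w| / (0.5 + |w|)
        order = sorted(
            range(len(prototypes)),
            key=lambda j: -sum(a & b for a, b in zip(p, prototypes[j]))
                          / (0.5 + sum(prototypes[j])))
        for j in order:
            match = sum(a & b for a, b in zip(p, prototypes[j]))
            if match / sum(p) >= vigilance:  # vigilance (resonance) test
                # learning: prototype becomes the AND of itself and p
                prototypes[j] = [a & b for a, b in zip(p, prototypes[j])]
                assignments.append(j)
                placed = True
                break
        if not placed:  # no category resonates: create a new one
            prototypes.append(list(p))
            assignments.append(len(prototypes) - 1)
    return prototypes, assignments
```

A higher vigilance forces finer categories; the supervised variant additionally checks the category's class label during training.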
Among metaheuristic algorithms, population-based algorithms are explorative search algorithms superior to local search algorithms in exploring the search space to find globally optimal solutions. However, the primary downside of such algorithms is their low exploitative capability, which prevents a deeper search of the search-space neighborhood for more optimal solutions. The firefly algorithm (FA) is a population-based algorithm that has been widely used in clustering problems. However, FA suffers from premature convergence when no neighborhood search strategies are employed to improve the quality of clustering solutions in the neighborhood region and to explore the global regions of the search space. On the
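The basic FA movement rule referred to above can be sketched as follows (the standard update with assumed illustrative parameters alpha, beta0, and gamma; not tied to the clustering formulation discussed in the paper):

```python
import math
import random

def firefly_step(x_i, x_j, alpha=0.2, beta0=1.0, gamma=1.0, rng=random):
    """One firefly movement: the firefly at x_i moves toward a brighter
    firefly at x_j. Attractiveness decays with squared distance as
    beta0 * exp(-gamma * r^2); alpha scales a random perturbation."""
    r2 = sum((a - b) ** 2 for a, b in zip(x_i, x_j))
    beta = beta0 * math.exp(-gamma * r2)
    return [a + beta * (b - a) + alpha * (rng.random() - 0.5)
            for a, b in zip(x_i, x_j)]
```

With a large gamma the attraction is short-ranged (more local exploitation); with gamma near zero each firefly jumps almost directly to brighter ones (more global movement), which is where the exploration/exploitation trade-off noted above arises.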
In this study, the relationship between bare soil temperature and soil salinity is presented. Only the bare-soil class is considered; all other land features are eliminated by classifying the site location using the support vector machine algorithm. At the same time, the salinity index calculated from the spectral response of the satellite bands is calibrated against empirical salinity values computed from field soil samples. A 2D probability density function is used to analyze the relationship between the calibrated salinity index and the temperature rise from the minimum temperature (at sunrise), driven by the duration of solar radiation until the time the satellite captured the scene image. T
The rehabilitation of deteriorated pavements using Asphalt Concrete (AC) overlays consistently confronts the challenge of reflection cracking, where inherent cracks and joints from an existing pavement layer are mirrored in the new overlay. To address this issue, the current study evaluates the effectiveness of Engineered Cementitious Composite (ECC) and geotextile fabric as mitigation strategies. ECC, characterized by its tensile ductility, fracture resistance, and high deformation capacity, was examined in interlayer thicknesses of 7, 12, and 17 mm. Additionally, the impact of positioning the geotextile fabric at the base and at 1/3 depth of the AC specimen was explored. Utilizing the Overlay Testing Machine (OTM) for evaluations, the research d
The task in brain magnetic resonance imaging (MRI) is to find the pixels or voxels that establish where the brain is in a medical image. A Convolutional Neural Network (CNN) can process the curved baselines that frequently occur in scanned documents. Next, the lines are separated into characters. In the case of fonts with a fixed width, the gaps are analyzed and split. Otherwise, a limited region above the baseline is analyzed, separated, and classified. The words with the lowest recognition score are split into further characters until the result improves. If this does not improve the recognition s
In this paper, bi-criteria machine scheduling problems (BMSP) are solved, where the discussed problem is represented by the sum of completion times and the sum of late work times simultaneously. To solve the suggested BMSP, metaheuristic methods that produce good results are suggested. The suggested local search methods are simulated annealing and the bees algorithm. The results of the new metaheuristic methods are compared with the complete enumeration method, which is an exact method; the heuristics are then compared with each other to identify the most efficient method.
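As a rough sketch of simulated annealing on this bi-criteria objective (sum of completion times plus sum of late work), assuming a single machine, a random-swap neighborhood, and illustrative cooling parameters, not the paper's exact setup:

```python
import math
import random

def objective(seq, proc, due):
    """Sum of completion times plus sum of late work, where the late
    work of a job is min(its tardiness, its processing time)."""
    t, total_c, total_v = 0, 0, 0
    for j in seq:
        t += proc[j]                                 # completion time C_j
        total_c += t
        total_v += min(max(0, t - due[j]), proc[j])  # late work V_j
    return total_c + total_v

def anneal(proc, due, temp=50.0, cooling=0.95, steps=2000, seed=1):
    """Simulated annealing over job sequences with random swaps."""
    rng = random.Random(seed)
    seq = list(range(len(proc)))
    best, best_val = seq[:], objective(seq, proc, due)
    cur_val = best_val
    for _ in range(steps):
        i, j = rng.randrange(len(seq)), rng.randrange(len(seq))
        seq[i], seq[j] = seq[j], seq[i]              # propose a swap
        val = objective(seq, proc, due)
        if val <= cur_val or rng.random() < math.exp((cur_val - val) / temp):
            cur_val = val                            # accept the move
            if val < best_val:
                best, best_val = seq[:], val
        else:
            seq[i], seq[j] = seq[j], seq[i]          # undo the swap
        temp = max(temp * cooling, 1e-9)             # geometric cooling
    return best, best_val
```

With generous due dates the late-work term vanishes and the optimum coincides with the shortest-processing-time order, which gives a simple sanity check.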