Plane cubic curves may be classified up to isomorphism or projective equivalence. In this paper, the inequivalent elliptic cubic curves, that is, the non-singular plane cubic curves, are classified projectively over the finite field of order nineteen, and it is determined whether they are complete or incomplete as arcs of degree three. In addition, the maximum size of a complete curve that can be constructed from each incomplete elliptic curve is given.
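For illustration only (this is not the paper's classification procedure), the sketch below counts the GF(19)-rational points of a non-singular Weierstrass cubic y² = x³ + ax + b; the coefficients chosen are arbitrary examples.

```python
# Illustrative sketch: count rational points of y^2 = x^3 + a*x + b over GF(19).
# The coefficients below are arbitrary; this is not the paper's classification method.
q = 19

def is_nonsingular(a, b):
    # The cubic is non-singular (elliptic) iff the discriminant 4a^3 + 27b^2 != 0 (mod q).
    return (4 * a**3 + 27 * b**2) % q != 0

def count_points(a, b):
    # Affine solutions of y^2 = x^3 + a*x + b plus the single point at infinity.
    count = 1
    for x in range(q):
        rhs = (x**3 + a * x + b) % q
        count += sum(1 for y in range(q) if (y * y) % q == rhs)
    return count

for a, b in [(1, 1), (2, 3), (0, 5)]:
    if is_nonsingular(a, b):
        print(f"y^2 = x^3 + {a}x + {b} over GF(19) has {count_points(a, b)} points")
```

Inequivalent curves can share a point count, so the projective classification in the paper needs finer invariants than a count alone.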
The support vector machine (SVM) is a supervised learning model that can be used for classification or regression, depending on the dataset. SVM classifies data points by determining the best separating hyperplane between two or more groups. Working with enormous datasets, however, can cause a variety of issues, including poor accuracy and long training times. In this research, SVM was extended by applying several kernel transformations: linear, polynomial, radial basis, and multi-layer kernels. The non-linear SVM classification model was illustrated and summarized in an algorithm using the kernel trick. The proposed method was examined using three simulated datasets with different sample …
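As a minimal sketch of the kernel comparison described above (the synthetic dataset, parameters, and the use of scikit-learn's sigmoid kernel in place of the "multi-layer" kernel are assumptions, not the paper's setup):

```python
# Minimal kernel-SVM comparison sketch; dataset and parameters are illustrative only.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# The "multi-layer" kernel is approximated here by the sigmoid (MLP-like) kernel.
for kernel in ["linear", "poly", "rbf", "sigmoid"]:
    clf = SVC(kernel=kernel, gamma="scale").fit(X_train, y_train)
    print(kernel, clf.score(X_test, y_test))
```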
Astronomical images are regarded as a main source of information for exploring outer space. Therefore, to determine the basic content of the galaxy (the Milky Way), an image of it was classified using the Variable Precision Rough Sets technique, which determines the different regions within the galaxy according to the different colors in the image. From the classified image, the percentage of each class can be determined, along with what each percentage means. The technique produces a well-classified image and requires less time to complete the classification process.
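A possible reading of the VPRS step, shown only as an assumption about how the region labeling could work (the β threshold, pixel blocks, and target set below are hypothetical):

```python
# Illustrative Variable Precision Rough Sets step: beta-lower approximation of a
# target color class from equivalence classes of pixels. Not the paper's exact pipeline.
def beta_lower_approximation(equiv_classes, target, beta=0.8):
    """equiv_classes: list of sets of pixel ids with indistinguishable color;
    target: set of pixel ids believed to belong to the class of interest."""
    lower = set()
    for block in equiv_classes:
        inclusion = len(block & target) / len(block)
        if inclusion >= beta:          # admit blocks that are "mostly" inside the class
            lower |= block
    return lower

# Toy usage: three color-based blocks, one hypothetical target region.
blocks = [{1, 2, 3}, {4, 5}, {6, 7, 8, 9}]
target = {1, 2, 3, 4, 6}
print(beta_lower_approximation(blocks, target, beta=0.75))   # -> {1, 2, 3}
```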
Data generated by modern applications and the internet in healthcare is extensive and rapidly expanding. One of the significant success factors for any application is therefore the ability to understand and extract meaningful information using digital analytics tools. These tools positively impact an application's performance and address the challenges of producing highly consistent, logical, and information-rich summaries. This paper has three main objectives. First, it presents several analytics methodologies that help analyze datasets and extract useful information from them as preprocessing steps for any classification model, in order to determine the dataset characteristics. It also provides a comparative study …
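A hedged sketch of the kind of dataset characterization and preprocessing step the paper refers to; the column names and values are hypothetical, not taken from the paper:

```python
# Hypothetical healthcare-style dataset: characterize it, then prepare it for a classifier.
import pandas as pd
from sklearn.preprocessing import StandardScaler

df = pd.DataFrame({
    "age": [25, 40, None, 58],
    "blood_pressure": [120, 135, 128, None],
    "diagnosis": ["A", "B", "A", "B"],
})

print(df.describe(include="all"))   # basic dataset characteristics
print(df.isna().mean())             # fraction of missing values per column

numeric = df[["age", "blood_pressure"]]
numeric = numeric.fillna(numeric.mean())               # simple imputation
scaled = StandardScaler().fit_transform(numeric)       # scale features before classification
```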
Traffic classification refers to the task of categorizing traffic flows into application-aware classes such as chat, streaming, VoIP, etc. Most network traffic identification systems are based on features; these features may be static signatures, port numbers, statistical characteristics, and so on. Although current methods of data flow classification are effective, they still lack inventive approaches that meet vital requirements such as real-time traffic classification, low power consumption, low Central Processing Unit (CPU) utilization, etc. Our novel Fast Deep Packet Header Inspection (FDPHI) traffic classification proposal employs a one-dimensional Convolutional Neural Network (1D-CNN) to automatically learn more representational c…
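A minimal 1D-CNN sketch in PyTorch for classifying raw packet-header bytes, in the spirit of the FDPHI idea; the layer sizes, six traffic classes, and 40-byte header length are assumptions rather than the paper's architecture:

```python
# Sketch of a 1D-CNN over packet-header bytes; hyperparameters are assumed, not FDPHI's.
import torch
import torch.nn as nn

class HeaderCNN(nn.Module):
    def __init__(self, n_classes=6, header_len=40):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x):               # x: (batch, 1, header_len), bytes scaled to [0, 1]
        return self.classifier(self.features(x).squeeze(-1))

model = HeaderCNN()
dummy = torch.rand(8, 1, 40)            # a batch of 8 normalized headers
print(model(dummy).shape)               # torch.Size([8, 6])
```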
As Internet of Things (IoT) applications and devices increase, their access capacity is frequently stressed. This can lead to a significant bottleneck in network performance at different layers of an end-point-to-end-point (P2P) communication route. An appropriate characterization (i.e., classification) of time-varying traffic has been used to address this issue; nevertheless, it remains largely an open challenge, because most existing solutions depend on machine learning (ML) methods that incur high computational cost and do not take into account the fine-grained flow classification that IoT devices require. Therefore, this paper presents a new model based on …
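The excerpt cuts off before the proposed model is named, so the following is only a generic, lightweight flow-classification sketch (hypothetical features and classes), not the paper's method:

```python
# Generic IoT flow classification sketch with a small tree ensemble; data is hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical statistical flow features: [mean packet size, packet rate, flow duration]
X = np.array([[120, 5.0, 30], [800, 50.0, 5], [200, 2.0, 120], [900, 60.0, 4]])
y = np.array(["sensor", "video", "sensor", "video"])   # hypothetical IoT traffic classes

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
print(clf.predict([[150, 4.0, 40]]))   # classify a new flow
```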
This study is unique in its field, as it combines three branches of technology: photometry, spectroscopy, and image processing. The work treats each pixel in an image based on its color, where the color corresponds to a specific wavelength on the RGB line; therefore, any image carries many wavelengths across its pixels. The results of the study are specific: they identify the elements on the surface of a comet's nucleus, giving not only the details but also their mapping on the nucleus. The work considered 12 elements in two comets (Tempel 1 and 67P/Churyumov-Gerasimenko). The elements have strong emission lines in the visible range, which were recognized by our MATLAB program during treatment of the image. The percen…
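A toy Python illustration of the pixel-color-to-wavelength idea (the crude channel mapping, tolerance, and line list are assumptions; the authors' actual analysis was done in their MATLAB program):

```python
# Toy mapping from a pixel's dominant RGB channel to an approximate visible wavelength,
# matched against a few known strong emission lines. Illustrative assumptions throughout.
EMISSION_LINES_NM = {"H-alpha": 656.3, "Na I D": 589.0, "O III": 500.7}  # example lines

def pixel_to_wavelength(r, g, b):
    # Crude mapping: red ~700 nm, green ~530 nm, blue ~470 nm (dominant channel wins).
    channel = max((r, 700), (g, 530), (b, 470))   # (intensity, representative wavelength)
    return channel[1]

def match_element(r, g, b, tol=60):
    wl = pixel_to_wavelength(r, g, b)
    return [name for name, line in EMISSION_LINES_NM.items() if abs(line - wl) <= tol]

print(match_element(200, 40, 30))   # reddish pixel -> lines near 700 nm, e.g. H-alpha
```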
Electrical Discharge Machining (EDM) is a widespread Non-traditional Machining (NTM) process for manufacturing parts with complicated geometries or from very hard metals that are difficult to machine by traditional operations. EDM is a material removal (MR) process characterized by electrical discharge erosion. This paper discusses the optimal EDM parameters for high-speed steel (HSS) AISI M2 as the workpiece, using copper and brass as electrodes. The input parameters used for the experimental work are current (10, 24 and 42 A), pulse-on time (100, 150 and 200 µs), and pulse-off time (4, 12 and 25 µs), which affect the material removal rate (MRR), electrode wear rate (EWR) and wear ratio (WR). A …
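For reference, the standard weight-loss formulas for MRR, EWR, and WR can be sketched as below; the weights, densities, and machining time are placeholder values, not the paper's measurements:

```python
# Standard weight-loss formulas for EDM responses; all numbers are placeholders.
def removal_rate(weight_before_g, weight_after_g, density_g_mm3, time_min):
    # Volumetric removal rate in mm^3/min.
    return (weight_before_g - weight_after_g) / (density_g_mm3 * time_min)

mrr = removal_rate(52.30, 52.10, 0.00816, 10)   # workpiece: HSS density ~8.16 g/cm^3
ewr = removal_rate(18.40, 18.38, 0.00896, 10)   # electrode: copper density ~8.96 g/cm^3
wr = ewr / mrr                                   # wear ratio
print(round(mrr, 3), round(ewr, 3), round(wr, 3))
```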
This research deals with a shrinkage method for principal components similar to the one used in multiple regression, the Least Absolute Shrinkage and Selection Operator (LASSO). The goal is to form uncorrelated linear combinations from only a subset of the explanatory variables that may suffer from a multicollinearity problem, instead of taking the whole number, say (K), of them. The shrinkage forces some coefficients to equal zero after placing a restriction on them through a tuning parameter, say (t), which balances the amount of bias and variance on the one hand, and does not exceed the acceptable percent explained variance of these components on the other. This was shown by the MSE criterion in the regression case and the percent explained variance …
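A hedged sketch of the general idea (principal components followed by an L1-penalized fit whose penalty plays the role of the tuning parameter t); this uses scikit-learn and synthetic data, and is not the paper's exact estimator:

```python
# PCA to obtain uncorrelated linear combinations, then an L1-penalized (LASSO-style) fit
# that can shrink some component coefficients toward or exactly to zero. Illustrative only.
from sklearn.datasets import make_regression
from sklearn.decomposition import PCA
from sklearn.linear_model import Lasso

X, y = make_regression(n_samples=100, n_features=8, noise=5.0, random_state=1)

components = PCA(n_components=5).fit_transform(X)   # uncorrelated combinations of the X's
model = Lasso(alpha=2.0).fit(components, y)         # alpha plays the role of the tuning parameter t
print(model.coef_)                                   # small component coefficients are shrunk toward zero
```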