Optimization is essentially the art, science, and mathematics of choosing the best among a given set of finite or infinite alternatives. Although optimization is now an interdisciplinary subject cutting across the boundaries of mathematics, economics, engineering, the natural sciences, and many other fields of human endeavour, it has its roots in antiquity. A classical example, stated in modern language, is the following: among all closed curves of a given length, find the one that encloses the maximum area. This is the isoperimetric problem, now routinely presented in any course on the calculus of variations. Most problems of antiquity, however, came from geometry, and since there were no general methods for solving such problems, each was attacked by a quite different approach.
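In modern notation, the answer to the isoperimetric problem is the circle, a fact usually recorded as the isoperimetric inequality:

```latex
% Isoperimetric inequality: for any closed plane curve of length L
% enclosing area A,
A \le \frac{L^2}{4\pi},
% with equality if and only if the curve is a circle.
```

Equivalently, among all closed curves enclosing a fixed area, the circle has the shortest perimeter.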
Journal of Theoretical and Applied Information Technology is a peer-reviewed electronic journal publishing research papers and review papers, with the aim of promoting original, high-quality research dealing with theoretical and scientific aspects of all disciplines of IT (Information Technology).
This research dealt with the analysis of murder crime data in Iraq in its temporal and spatial dimensions. It then focused on building a new model, with an algorithm that combines the characteristics of time series and spatial series, so that the model can predict more accurately than the models it is compared against. We call this the Combined Regression (CR) model: it merges a time series regression model with a spatial regression model into a single model that can analyze data in both its temporal and spatial dimensions. Several models were used for comparison with the combined model, namely Multiple Linear Regression (MLR), Decision Tree Regression (DTR), and Random Forest Regression.
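The abstract does not specify the CR model's algorithm, so the following is only a generic sketch of the underlying idea: fitting one least-squares regression on both a temporal lag feature and a spatially lagged feature. All data, the weight matrix, and the feature choices are hypothetical.

```python
import numpy as np

# Illustrative sketch only: the paper's Combined Regression (CR) model is not
# specified in the abstract, so this is a generic least-squares fit that mixes
# a temporal lag y[t-1, r] with a spatially lagged neighbour term (W @ y[t-1])[r].
# All quantities (counts, weight matrix W) are synthetic assumptions.

rng = np.random.default_rng(0)
n_regions, n_periods = 5, 40

# Hypothetical crime counts: rows = time periods, columns = regions.
y = rng.poisson(lam=10, size=(n_periods, n_regions)).astype(float)

# Row-normalised spatial weight matrix W (assumed adjacency strengths).
W = rng.random((n_regions, n_regions))
np.fill_diagonal(W, 0.0)
W = W / W.sum(axis=1, keepdims=True)

# Design matrix: intercept, temporal lag, spatial lag of the previous period.
rows, targets = [], []
for t in range(1, n_periods):
    spatial_lag = W @ y[t - 1]            # neighbours' previous-period counts
    for r in range(n_regions):
        rows.append([1.0, y[t - 1, r], spatial_lag[r]])
        targets.append(y[t, r])
X, z = np.asarray(rows), np.asarray(targets)

beta, *_ = np.linalg.lstsq(X, z, rcond=None)   # [intercept, temporal, spatial]
pred = X @ beta
```

A real spatio-temporal model would usually add exogenous covariates and a proper spatial autocorrelation structure; this sketch only shows how temporal and spatial information can enter one design matrix.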
In this study, the Bessel function of the first kind was used to solve Kepler's equation for a satellite in an elliptical orbit. This classical method gives a direct series solution for the eccentric anomaly. The equation was solved over one period (M = 0–360°), for eccentricities e = 0–1 and numbers of series terms N = 1–10, and the error in the Bessel-function representation was calculated. The results indicated that for eccentricities of 0.1–0.4 and N = 1–10, the eccentric anomaly values agreed well with the exact solution. Moreover, for eccentricities of 0.8 and 0.9, the obtained eccentric anomaly values were unaffected by increasing the number of terms over N = 6–10.
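The classical series the abstract refers to expands the eccentric anomaly E in Bessel functions of the first kind, E = M + Σₙ (2/n) Jₙ(ne) sin(nM). A minimal sketch, evaluating Jₙ from its integral representation so that no external library is needed:

```python
import math

# Bessel-series solution of Kepler's equation  M = E - e*sin(E):
#   E = M + sum_{n=1}^{N} (2/n) * J_n(n*e) * sin(n*M).
# J_n is computed from its integral representation
#   J_n(x) = (1/pi) * int_0^pi cos(n*t - x*sin(t)) dt
# via a midpoint rule. Parameter values below are illustrative.

def bessel_j(n, x, steps=2000):
    """Bessel function of the first kind via its integral representation."""
    h = math.pi / steps
    return sum(math.cos(n * ((k + 0.5) * h) - x * math.sin((k + 0.5) * h))
               for k in range(steps)) * h / math.pi

def eccentric_anomaly(M, e, N=10):
    """Eccentric anomaly E (radians) from mean anomaly M by the Bessel series."""
    return M + sum((2.0 / n) * bessel_j(n, n * e) * math.sin(n * M)
                   for n in range(1, N + 1))

M, e = math.radians(60.0), 0.1          # small eccentricity: fast convergence
E = eccentric_anomaly(M, e, N=10)
residual = E - e * math.sin(E) - M      # ~0 if E solves Kepler's equation
```

For small eccentricities the terms decay rapidly, which is consistent with the abstract's observation that few terms suffice for e = 0.1–0.4, while convergence degrades as e approaches 1.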
Non-uniform channelization is a crucial task in cognitive radio receivers for obtaining separate channels from the digitized wideband input signal at different intervals of time. The two main requirements of the channelizer are reconfigurability and low complexity. In this paper, a reconfigurable architecture based on a combination of the Improved Coefficient Decimation Method (ICDM) and the Coefficient Interpolation Method (CIM) is proposed. The proposed Hybrid Coefficient Decimation-Interpolation Method (HCDIM) based filter bank (FB) is able to realize the same number of channels as the ICDM, but with the maximum decimation factor divided by the interpolation factor (L), which leads to less deterioration in stop-band attenuation.
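The HCDIM itself is not spelled out in the abstract, but the basic coefficient decimation idea it builds on can be sketched: keeping every D-th coefficient of a prototype lowpass FIR filter (zeroing the rest) replicates its passband at multiples of fs/D, giving a multiband response from the same coefficient set. The filter parameters here are illustrative.

```python
import numpy as np

# Sketch of the basic coefficient decimation idea (CDM-I) underlying ICDM:
# retaining every D-th coefficient of a prototype lowpass FIR filter and
# zeroing the rest replicates the passband at multiples of fs/D.
# numtaps, cutoff, and D are illustrative choices.

numtaps, cutoff, D = 129, 0.04, 4       # prototype lowpass, decimation factor

# Windowed-sinc prototype lowpass filter (Hamming window).
n = np.arange(numtaps) - (numtaps - 1) / 2
h = 2 * cutoff * np.sinc(2 * cutoff * n) * np.hamming(numtaps)

# CDM-I: keep every D-th coefficient, zero the others.
mask = np.zeros(numtaps)
mask[::D] = 1.0
h_cdm = h * mask

# Magnitude responses on a dense frequency grid (normalised fs = 1).
H = np.abs(np.fft.rfft(h, 4096))
H_cdm = np.abs(np.fft.rfft(h_cdm, 4096))
freqs = np.fft.rfftfreq(4096, d=1.0)

# The masked filter shows a passband replica at fs/D where the
# original lowpass is deep in its stop band.
replica = H_cdm[np.argmin(np.abs(freqs - 1.0 / D))]
```

The ICDM and CIM refine this idea (and interpolation by L shifts/compresses the response); this sketch only demonstrates why coefficient decimation yields multiband behaviour at no extra filter cost.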
This paper proposes two hybrid feature subset selection approaches based on the combination (union or intersection) of supervised and unsupervised filter approaches before using a wrapper, aiming to obtain low-dimensional features with high accuracy and interpretability and low time consumption. Experiments with the proposed hybrid approaches were conducted on seven high-dimensional feature datasets. The classifiers adopted are support vector machine (SVM), linear discriminant analysis (LDA), and K-nearest neighbour (KNN). Experimental results demonstrated the advantages and usefulness of the proposed methods for feature subset selection in high-dimensional spaces in terms of the number of selected features and time spent.
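The filter-combination step can be sketched as follows. The particular scores (absolute label correlation as the supervised filter, variance as the unsupervised filter) and the value of k are assumptions for illustration; the paper's actual filters may differ.

```python
import numpy as np

# Sketch of the filter-combination step: a supervised filter and an
# unsupervised filter each rank the features, and the top-k sets are merged
# by union or intersection before any wrapper search. The scoring functions
# and synthetic data below are illustrative assumptions.

rng = np.random.default_rng(1)
n_samples, n_features, k = 200, 50, 10

X = rng.normal(size=(n_samples, n_features))
y = (X[:, 0] + 0.5 * X[:, 1]
     + rng.normal(scale=0.5, size=n_samples) > 0).astype(float)

# Supervised filter: |Pearson correlation| of each feature with the label.
Xc = X - X.mean(axis=0)
yc = y - y.mean()
sup_score = np.abs(Xc.T @ yc) / (np.linalg.norm(Xc, axis=0)
                                 * np.linalg.norm(yc))

# Unsupervised filter: per-feature variance.
unsup_score = X.var(axis=0)

top_sup = set(np.argsort(sup_score)[-k:])
top_unsup = set(np.argsort(unsup_score)[-k:])

union_set = top_sup | top_unsup    # larger candidate pool for the wrapper
inter_set = top_sup & top_unsup    # stricter, smaller candidate pool
```

The wrapper (e.g. a greedy search evaluated with SVM, LDA, or KNN) would then operate only on `union_set` or `inter_set`, which is what keeps the overall time consumption low in high-dimensional spaces.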
Discriminant analysis is a technique used to distinguish and classify an individual into one of a number of groups based on a linear combination of a set of relevant variables, known as the discriminant function. In this research, discriminant analysis is used to analyze data from a repeated measurements design. We deal with the problem of discrimination and classification in the case of two groups, assuming a compound symmetry covariance structure under the assumption of normality for univariate repeated measures data.
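For two groups with a common covariance matrix, the linear discriminant rule assigns x to group 1 when (μ₁ − μ₂)ᵀ Σ⁻¹ (x − (μ₁ + μ₂)/2) > 0. A minimal sketch under compound symmetry, Σ = σ²[(1 − ρ)I + ρJ], with illustrative parameter values:

```python
import numpy as np

# Two-group linear discriminant rule when the common covariance has
# compound symmetry: equal variances and equal correlations across the
# p repeated measures. Means, sigma^2, and rho below are illustrative.

p, sigma2, rho = 4, 1.0, 0.3
mu1 = np.full(p, 1.0)
mu2 = np.full(p, -1.0)

# Sigma = sigma^2 * ((1 - rho) * I + rho * J), J the all-ones matrix.
Sigma = sigma2 * ((1 - rho) * np.eye(p) + rho * np.ones((p, p)))
Sigma_inv = np.linalg.inv(Sigma)

def classify(x):
    """Assign x to group 1 or 2 by Fisher's linear discriminant rule."""
    w = Sigma_inv @ (mu1 - mu2)
    score = w @ (x - (mu1 + mu2) / 2)
    return 1 if score > 0 else 2

g1 = classify(mu1)   # a point at the group-1 mean
g2 = classify(mu2)   # a point at the group-2 mean
```

Compound symmetry makes Σ⁻¹ available in closed form as well, which is what makes this structure attractive for repeated measures data; the sketch simply inverts numerically.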
Long memory analysis is one of the most active areas in econometrics and time series analysis, and various methods have been introduced to identify and estimate the long memory parameter in fractionally integrated time series. One of the most common models used to represent time series with long memory is the ARFIMA (Autoregressive Fractionally Integrated Moving Average) model, whose differencing order is a fractional number called the fractional parameter. To analyze and fit an ARFIMA model, the fractional parameter must be estimated, and there are many methods for its estimation. In this research, the estimation methods are divided into indirect methods, in which the Hurst parameter is estimated first.
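One indirect route of the kind described is to estimate the Hurst exponent H, for example by the classical rescaled-range (R/S) method, and then use the relation d = H − 0.5 for a stationary fractionally integrated series. A simple sketch (not a production estimator; R/S is known to be biased on short series):

```python
import numpy as np

# Indirect estimation sketch: Hurst exponent H via the classical
# rescaled-range (R/S) method, then d = H - 0.5 for the ARFIMA fractional
# parameter. The log-log regression below is deliberately simple.

def rs_hurst(x):
    """Hurst exponent from the slope of log(R/S) vs log(block size)."""
    x = np.asarray(x, dtype=float)
    sizes, rs_vals = [], []
    n = 16
    while n <= len(x) // 2:
        rs = []
        for start in range(0, len(x) - n + 1, n):
            block = x[start:start + n]
            dev = np.cumsum(block - block.mean())
            r = dev.max() - dev.min()      # range of cumulative deviations
            s = block.std()
            if s > 0:
                rs.append(r / s)           # rescaled range of this block
        sizes.append(n)
        rs_vals.append(np.mean(rs))
        n *= 2                             # dyadic block sizes
    slope, _ = np.polyfit(np.log(sizes), np.log(rs_vals), 1)
    return slope

rng = np.random.default_rng(42)
H = rs_hurst(rng.normal(size=4096))   # white noise: H should be near 0.5
d = H - 0.5                           # implied ARFIMA fractional parameter
```

For a short-memory series such as white noise, H ≈ 0.5 and hence d ≈ 0; persistent long memory corresponds to 0.5 < H < 1, i.e. 0 < d < 0.5.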