Image classification is the process of finding features common to images of different classes and using them to categorize and label those images. Its key obstacles are the abundance of images, the high complexity of the data, and the shortage of labeled data. The cornerstone of image classification is extracting convolutional features with deep learning models and training machine learning classifiers on them. This study proposes a new "hybrid learning" approach that combines deep learning with machine learning for image classification, based on convolutional feature extraction using the VGG-16 deep learning model and seven class…
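A minimal sketch of such a hybrid pipeline is given below, assuming Keras/TensorFlow for the VGG-16 backbone and scikit-learn for the classical stage; the abstract does not name the seven classifiers, so an SVM stands in as one possible choice, and the helper names are illustrative only.

    # Minimal sketch: VGG-16 convolutional features feeding a classical classifier.
    # An SVM stands in for the (unnamed) machine learning classifiers.
    import numpy as np
    from tensorflow.keras.applications.vgg16 import VGG16, preprocess_input
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC
    from sklearn.metrics import accuracy_score

    def extract_features(images):
        """images: float array of shape (n, 224, 224, 3) with pixel values in [0, 255]."""
        backbone = VGG16(weights="imagenet", include_top=False, pooling="avg")
        return backbone.predict(preprocess_input(images.copy()), verbose=0)   # (n, 512) feature matrix

    def hybrid_classify(images, labels):
        feats = extract_features(images)
        X_tr, X_te, y_tr, y_te = train_test_split(feats, labels, test_size=0.2, random_state=0)
        clf = SVC(kernel="rbf").fit(X_tr, y_tr)          # classical machine learning stage
        return accuracy_score(y_te, clf.predict(X_te))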
The research aims to build a list of digital citizenship axes, standards, and indicators emanating from them that should be included in the content of the computer textbook prescribed for second-grade intermediate students in Iraq, and to analyze the above-mentioned book according to that list using the descriptive analytical method (content analysis). The research community and its sample consisted of the content of the computer textbook prescribed for second-year intermediate students for the academic year 2018-2019. The research tool was built in its initial form after reviewing a set of specialized literature and previous studies dealing with topics related to digital citizenship, and the authenticity…
The purpose of this research is to design a list of the scientific and moral values that should be found in the content of the computer textbook for the second intermediate grade, as well as to analyze the content of the above-mentioned book by answering the following question:
What is the percentage of availability of scientific and moral values in the content of the computer textbook for the second intermediate grade issued by the Iraqi Ministry of Education / General Directorate of Curricula for the academic year 2017-2018?
To achieve the research objectives, the descriptive method (content analysis) was adopted. The research community has been identified…
This paper is concerned with a Double Stage Shrinkage Bayesian (DSSB) estimator for lowering the mean squared error of the classical estimator q̂ of the scale parameter q of an exponential distribution in a region R around available prior knowledge q0 about the actual value of q, taken as an initial estimate, as well as for reducing the cost of experimentation. In situations where the experiments are time-consuming or very costly, a double stage procedure can be used to reduce the expected sample size needed to obtain the estimator. This estimator is shown to have smaller mean squared error for certain choices of the shrinkage weight factor ψ(·) and of the acceptance region R. Expressions for…
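As a rough illustration of the shrinkage idea (not the paper's exact DSSB rule, whose second stage and weight function are not given here), a single-stage shrinkage estimate that pulls the classical estimate toward the prior guess q0 inside an acceptance region R can be compared with the classical estimator by Monte Carlo; every numerical value below is an assumption.

    import numpy as np

    rng = np.random.default_rng(0)
    q_true, q0 = 2.0, 1.8              # actual scale and prior guess (assumed values)
    n, reps, psi = 10, 20000, 0.5      # sample size, replications, shrinkage weight
    R = (0.5 * q0, 1.5 * q0)           # acceptance region around q0 (assumed form)

    mse_classical = mse_shrunk = 0.0
    for _ in range(reps):
        x = rng.exponential(scale=q_true, size=n)
        q_hat = x.mean()                               # classical estimator of the scale
        if R[0] <= q_hat <= R[1]:
            q_tilde = psi * q_hat + (1 - psi) * q0     # shrink toward the prior knowledge
        else:
            q_tilde = q_hat                            # a true double-stage rule would draw a second sample here
        mse_classical += (q_hat - q_true) ** 2
        mse_shrunk += (q_tilde - q_true) ** 2

    print("MSE classical:", mse_classical / reps, "MSE shrinkage:", mse_shrunk / reps)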
With the growth of mobile phones, the short message service (SMS) became an essential text communication service. However, the low cost and ease of use of SMS led to an increase in SMS spam. In this paper, the characteristics of SMS spam are studied and a set of features is introduced to get rid of SMS spam. In addition, the problem of SMS spam detection is addressed as a clustering analysis that requires a metaheuristic algorithm to find the clustering structures. Three differential evolution variants, namely DE/rand/1, jDE/rand/1, and jDE/best/1, are adopted for solving the SMS spam problem. Experimental results illustrate that jDE/best/1 produces the best results over the other variants in terms of accuracy, false-positive rate, and false-negative…
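A minimal sketch of the DE/rand/1 variant applied to clustering is given below: each candidate solution encodes k cluster centroids and is evolved to minimize within-cluster squared distance. The SMS feature matrix X is assumed to have been built already; the population size, F, and CR are assumed values, and the self-adaptation of F and CR used by the jDE variants is not shown.

    import numpy as np

    def within_cluster_sse(centroids, X):
        """Sum of squared distances from each message vector to its nearest centroid."""
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        return np.sum(d.min(axis=1) ** 2)

    def de_rand_1_clustering(X, k, pop_size=30, F=0.5, CR=0.9, gens=200, seed=0):
        rng = np.random.default_rng(seed)
        dim = X.shape[1]
        pop = rng.uniform(X.min(0), X.max(0), size=(pop_size, k, dim))   # candidate centroid sets
        fit = np.array([within_cluster_sse(p, X) for p in pop])
        for _ in range(gens):
            for i in range(pop_size):
                a, b, c = rng.choice([j for j in range(pop_size) if j != i], 3, replace=False)
                mutant = pop[a] + F * (pop[b] - pop[c])          # DE/rand/1 mutation
                cross = rng.random((k, dim)) < CR                # binomial crossover
                trial = np.where(cross, mutant, pop[i])
                f_trial = within_cluster_sse(trial, X)
                if f_trial < fit[i]:                             # greedy selection
                    pop[i], fit[i] = trial, f_trial
        return pop[fit.argmin()]                                 # best centroid set found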
Analysis of economic, financial, and other phenomena requires building an appropriate model that represents the causal relations between factors. Building the model depends on capturing the surrounding conditions and factors in a mathematical formula, and researchers aim to build that formula appropriately. Classical linear regression models are an important statistical tool, but they are used in a limited way, since it is assumed that the relationship between the explanatory variables and the response variable is of a known, fixed form. To broaden the representation of the relationships between the variables describing the phenomenon under study, we used Varying Coefficient Models…
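As one common way such models are estimated (a sketch, not necessarily the paper's estimator), the coefficients a(u) and b(u) of a simple varying coefficient model y = a(u) + b(u)*x + e can be recovered by kernel-weighted local least squares; the toy data and bandwidth below are assumptions.

    import numpy as np

    def local_vc_fit(u, x, y, u0, h=0.3):
        """Estimate (a(u0), b(u0)) in y = a(u) + b(u)*x + e by kernel-weighted least squares."""
        w = np.exp(-0.5 * ((u - u0) / h) ** 2)       # Gaussian kernel weights centered at u0
        X = np.column_stack([np.ones_like(x), x])    # local intercept and slope
        Xw = X * w[:, None]
        return np.linalg.solve(Xw.T @ X, Xw.T @ y)   # [a(u0), b(u0)]

    # Toy data in which the slope changes smoothly with the index variable u.
    rng = np.random.default_rng(1)
    u = rng.uniform(0, 1, 500)
    x = rng.normal(size=500)
    y = np.sin(2 * np.pi * u) + (1 + u) * x + 0.1 * rng.normal(size=500)
    print(local_vc_fit(u, x, y, u0=0.5))             # expect roughly [0.0, 1.5]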
A condensed study was done to compare the ordinary estimators, in particular the maximum likelihood estimator and a robust estimator, for estimating the parameters of the mixed model of order one, namely the ARMA(1,1) model.
A simulation study was carried out for several varieties of the model, using small, moderate, and large sample sizes, and some new results were obtained. MAPE was used as the statistical criterion for comparison.
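A minimal simulation sketch in the same spirit is shown below for the maximum likelihood fit only (the robust estimator is not reproduced); the true coefficients, sample sizes, and replication count are assumed values, and MAPE is computed on the parameter estimates.

    import numpy as np
    from statsmodels.tsa.arima_process import arma_generate_sample
    from statsmodels.tsa.arima.model import ARIMA

    phi, theta = 0.6, 0.4                      # assumed true AR and MA coefficients
    true = np.array([phi, theta])

    def mape_of_mle(n, reps=100, seed=0):
        rng = np.random.default_rng(seed)
        errs = []
        for _ in range(reps):
            y = arma_generate_sample(ar=[1, -phi], ma=[1, theta], nsample=n,
                                     distrvs=rng.standard_normal)
            res = ARIMA(y, order=(1, 0, 1)).fit()
            est = np.array([res.arparams[0], res.maparams[0]])
            errs.append(np.mean(np.abs((est - true) / true)))
        return 100 * np.mean(errs)             # MAPE of the parameter estimates, in percent

    for n in (30, 100, 300):                   # small, moderate, and large sample sizes
        print(n, round(mape_of_mle(n), 2))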
Time series analysis is one of the important statistical methods followed in this field: by representing a phenomenon over a certain period of time and analyzing it, we can identify its pattern and the factors affecting it, and use them to predict the future values of the phenomenon, which helps in developing a sound way of predicting economic development.
The research aims to select the best model for predicting the future number of viral hepatitis infections using non-seasonal Box-Jenkins models.
Data were collected from the Ministry of Health / Department of Health Statistics for the period from January 2009 until December 2013, and … was used…
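A minimal Box-Jenkins sketch is given below: a non-seasonal ARIMA(p, d, q) order is chosen by AIC on a monthly count series and used to forecast ahead. The hepatitis data are not reproduced here, so a synthetic placeholder series with the same date range stands in.

    import itertools
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(0)
    idx = pd.date_range("2009-01", periods=60, freq="MS")              # Jan 2009 - Dec 2013
    y = pd.Series(100 + np.cumsum(rng.normal(0, 5, 60)), index=idx)    # placeholder monthly counts

    best = None
    for p, d, q in itertools.product(range(3), range(2), range(3)):    # candidate (p, d, q) orders
        try:
            res = ARIMA(y, order=(p, d, q)).fit()
        except Exception:
            continue
        if best is None or res.aic < best[1]:
            best = ((p, d, q), res.aic, res)

    order, aic, model = best
    print("selected order:", order, "AIC:", round(aic, 1))
    print(model.forecast(steps=12))                                    # forecast the next 12 months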
In this paper, reliable computational methods (RCMs) based on the monomial standard polynomials have been executed to solve the problem of Jeffery-Hamel flow (JHF). In addition, convenient basis functions, namely the Bernoulli, Euler, and Laguerre polynomials, have been used to enhance the reliability of the computational methods. Using such functions turns the problem into a solvable set of nonlinear algebraic equations that Mathematica® 12 can solve. The JHF problem has been solved with the help of the improved reliable computational methods (I-RCMs), and a review of the methods has been given. Published results are also used for comparison. As further evidence of the accuracy and dependability of the proposed methods, the maximum error remainder…
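A rough sketch of the monomial-basis idea is given below, assuming the standard similarity reduction of the Jeffery-Hamel problem, f''' + 2*alpha*Re*f*f' + 4*alpha^2*f' = 0 with f(0) = 1, f'(0) = 0, f(1) = 0; the half-angle, Reynolds number, and polynomial degree are assumed values, collocation with a generic solver replaces Mathematica, and the paper's Bernoulli, Euler, and Laguerre variants are not reproduced.

    import numpy as np
    from scipy.optimize import fsolve

    alpha, Re, N = np.deg2rad(5), 50.0, 8          # half-angle, Reynolds number, polynomial degree

    def poly(c, eta, deriv=0):
        """Evaluate the deriv-th derivative of sum_k c[k] * eta**k at eta."""
        p = np.polynomial.polynomial.Polynomial(c)
        return p.deriv(deriv)(eta) if deriv else p(eta)

    def residuals(c):
        eta = np.linspace(0.0, 1.0, N)[1:-1]       # N-2 interior collocation points
        ode = (poly(c, eta, 3) + 2 * alpha * Re * poly(c, eta) * poly(c, eta, 1)
               + 4 * alpha**2 * poly(c, eta, 1))
        bcs = np.array([poly(c, 0.0) - 1.0, poly(c, 0.0, 1), poly(c, 1.0)])
        return np.concatenate([ode, bcs])          # (N-2) + 3 = N + 1 equations for N + 1 coefficients

    c0 = np.zeros(N + 1); c0[0] = 1.0              # start from the flat profile f(eta) = 1
    c = fsolve(residuals, c0)
    print(poly(c, np.linspace(0.0, 1.0, 5)))       # sampled velocity profile f(eta)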
Bivariate time series modeling and forecasting have become a promising field of applied studies in recent times. For this purpose, the linear autoregressive moving average with exogenous variable (ARMAX) model has been the most widely used technique over the past few years for modeling and forecasting this type of data. The most important assumptions of this model are linearity and homogeneity of the random error variance of the appropriate model. In practice, these two assumptions are often violated, so the generalized autoregressive conditional heteroscedasticity (ARCH and GARCH) models with exogenous variables…
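A minimal two-step sketch of this modeling strategy is given below: an ARMAX(1,1) mean equation with one exogenous regressor is fitted via statsmodels, and a GARCH(1,1) is then fitted to its residuals via the arch package. The synthetic data and the (1,1) orders are assumptions, not the paper's specification.

    import numpy as np
    from statsmodels.tsa.statespace.sarimax import SARIMAX
    from arch import arch_model

    rng = np.random.default_rng(0)
    n = 500
    x = rng.normal(size=n)                               # exogenous variable
    e = rng.normal(size=n)
    y = np.empty(n)
    y[0] = 0.0
    for t in range(1, n):                                # ARMAX(1,1)-style toy process
        y[t] = 0.5 * y[t - 1] + 0.8 * x[t] + e[t] + 0.3 * e[t - 1]

    mean_fit = SARIMAX(y, exog=x, order=(1, 0, 1)).fit(disp=False)       # ARMAX mean equation
    resid = mean_fit.resid
    vol_fit = arch_model(resid, vol="GARCH", p=1, q=1).fit(disp="off")   # GARCH variance equation
    print(mean_fit.params)
    print(vol_fit.params)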