Tourism plays an important role in Malaysia's economic development, as it can boost business opportunities in the surrounding economy, so applying data mining to tourism data to predict areas of business opportunity is a promising approach. Data mining is the process of taking data as input and producing knowledge as output. The popularity of travel in Asian countries has increased in recent years, and many entrepreneurs have started their own businesses, but problems such as investing in the wrong business fields and poor service quality have hurt their business income. The objective of this paper is to use data mining technology to meet the business and customer needs of tourism enterprises and to identify the most effective data mining technique. To that end, four data mining classification techniques were implemented and evaluated for extracting important insights from a tourism data set, with the aim of finding the best-performing algorithm among those compared so as to improve business opportunities in tourism-related fields. The percentages of instances correctly classified by the four classifiers were JRip (84.09%), Random Tree (83.66%), J48 (85.50%), and REPTree (82.47%). All results are analyzed and discussed in this paper.
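The comparison reported above reduces to computing, for each classifier, the percentage of correctly classified instances and picking the best. A minimal sketch of that evaluation step is below; the label values and per-classifier predictions are invented for illustration and are not the paper's actual Weka output.

```python
# Hedged sketch: comparing classifiers by percentage of correctly
# classified instances, as in the paper's evaluation.
# The tiny label vectors below are made up for illustration only.

def accuracy(true_labels, predicted_labels):
    """Percentage of instances a classifier got right."""
    correct = sum(t == p for t, p in zip(true_labels, predicted_labels))
    return 100.0 * correct / len(true_labels)

# Hypothetical ground truth and per-classifier predictions.
y_true = ["stay", "go", "go", "stay", "go", "stay", "go", "go"]
predictions = {
    "JRip":       ["stay", "go", "go",   "stay", "go", "go",   "go", "go"],
    "RandomTree": ["stay", "go", "stay", "stay", "go", "go",   "go", "go"],
    "J48":        ["stay", "go", "go",   "stay", "go", "stay", "go", "go"],
    "REPTree":    ["go",   "go", "go",   "stay", "go", "go",   "go", "stay"],
}

scores = {name: accuracy(y_true, preds) for name, preds in predictions.items()}
best = max(scores, key=scores.get)
```

The same loop generalizes to any number of classifiers; in the paper, J48 came out on top by this criterion.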
The design of the future will remain a confusing and puzzling issue, arousing misgivings and worry, yet also prompting a spirit of adventure that drives progress toward revival, creativity, and modernism. Whether a particular culture or product prevails in design depends on the techniques available; the computer and its artistic techniques have become very important and vital for reinforcing the image in design. It is therefore necessary to link these techniques in a suitable way so as to reform the mentality by which design is shaped. From what has been said, the full range of modern graphic techniques available has not been utilized in the design proce…
Cohesion is well known as the study of the relationships, whether grammatical and/or lexical, between the different elements of a particular text through the use of what are commonly called 'cohesive devices'. These devices provide connectivity and bind a text together. Moreover, the nature and number of such cohesive devices usually affect the understanding of a text, in the sense of making it easier to comprehend. The present study is intended to examine the use of grammatical cohesive devices in relation to narrative techniques. The story of Joseph from the Holy Quran has been selected for examination using Halliday and Hasan's model of cohesion (1976, 1989). The aim of the study is to examine comparatively to what extent the type…
In this study, we briefly review the ARIMA(p, d, q), EWMA, and DLM (dynamic linear modelling) procedures in order to accommodate the autocorrelation structure of the data. We consider recursive estimation and prediction algorithms based on Bayesian and Kalman filtering (KF) techniques for correlated observations. We investigate the effect on the MSE of these procedures and compare them using generated data.
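The recursive estimation the abstract refers to can be illustrated with the simplest case: a scalar Kalman filter for a local-level model, where each observation updates the running state estimate via the Kalman gain. The noise variances and observation series below are invented for the sketch and are not the study's generated data.

```python
# Hedged sketch: a scalar Kalman filter for a local-level model
#   state:        x_t = x_{t-1} + w_t,  w_t ~ N(0, q)
#   observation:  y_t = x_t + v_t,      v_t ~ N(0, r)
# All numbers below are illustrative, not from the paper's data.

def kalman_filter(observations, q=0.01, r=1.0, x0=0.0, p0=10.0):
    """Return the filtered state estimate after each observation."""
    x, p = x0, p0
    estimates = []
    for y in observations:
        # Predict step: state carries over, uncertainty grows.
        p = p + q
        # Update step: blend prediction with the new observation.
        k = p / (p + r)          # Kalman gain
        x = x + k * (y - x)
        p = (1 - k) * p
        estimates.append(x)
    return estimates

# Noisy measurements of a constant level near 5.0 (made-up data).
ys = [4.8, 5.3, 4.9, 5.1, 5.2, 4.7, 5.0, 5.1]
est = kalman_filter(ys)
```

With a diffuse prior (large `p0`) the first estimate jumps close to the first observation, and later updates weight new data less as uncertainty shrinks, which is exactly the recursive behaviour compared against ARIMA and EWMA forecasts.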
Abstract:
Cointegration is one of the important concepts in applied macroeconomics. The idea is due to Granger (1981), and it was explained in detail by Engle and Granger in Econometrica (1987). The introduction of cointegration analysis into econometrics in the mid-1980s is one of the most important developments in empirical modelling methodology; its advantage is that it is simple to compute and to use, requiring only familiarity with ordinary least squares.
Cointegration captures long-run equilibrium relations among time series, even if all the sequences contain t…
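The Engle-Granger approach described above amounts to two steps: regress one series on the other by ordinary least squares, then check that the residuals are stationary (mean-reverting). The sketch below uses synthetic cointegrated series and a crude lag-1 autocorrelation check in place of a proper augmented Dickey-Fuller test; the data and thresholds are illustrative only.

```python
# Hedged sketch of the Engle-Granger two-step idea:
# 1) OLS regression of y on x; 2) check the residuals are stationary.
# Synthetic data and the crude lag-1 check are illustrative only.
import random

random.seed(42)

# x is a random walk (integrated of order one).
x = [0.0]
for _ in range(499):
    x.append(x[-1] + random.gauss(0, 1))

# y is cointegrated with x: y = 2x + stationary noise.
y = [2.0 * xi + random.gauss(0, 0.5) for xi in x]

# Step 1: ordinary least squares in closed form.
n = len(x)
mx, my = sum(x) / n, sum(y) / n
beta = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
        / sum((xi - mx) ** 2 for xi in x))
alpha = my - beta * mx

# Step 2: if x and y are cointegrated, the residuals are stationary.
resid = [yi - (alpha + beta * xi) for xi, yi in zip(x, y)]
mr = sum(resid) / n
lag1 = (sum((resid[t] - mr) * (resid[t - 1] - mr) for t in range(1, n))
        / sum((r - mr) ** 2 for r in resid))
```

Here `beta` recovers the long-run relation (near 2), and the residuals show little persistence, which is what distinguishes a genuine equilibrium relation from a spurious regression between two unrelated random walks.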
In the fields of image processing and computer vision, it is important to represent an image by its information. Image information comes from the image's features, which are extracted using feature detection/extraction techniques together with feature description. In computer vision, features define informative data. The human eye can readily extract information from a raw image, but a computer cannot recognize image information directly; this is why various feature extraction techniques have been presented and have progressed rapidly. This paper presents a general overview of the categories of feature extraction for images.
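As a minimal illustration of turning raw pixels into an informative feature vector, the sketch below computes a normalized intensity histogram, one of the simplest global descriptors in the families such a survey covers. The tiny grayscale "image" is fabricated for the example.

```python
# Hedged sketch: an intensity histogram as a simple global image feature.
# The 4x4 grayscale "image" below is made up for illustration.

def histogram_feature(image, bins=4, max_value=256):
    """Flatten a 2-D grayscale image into a normalized histogram vector."""
    counts = [0] * bins
    pixels = [p for row in image for p in row]
    bin_width = max_value / bins
    for p in pixels:
        counts[min(int(p / bin_width), bins - 1)] += 1
    total = len(pixels)
    return [c / total for c in counts]

image = [
    [10,  20, 200, 210],
    [15, 120, 130, 220],
    [12, 125, 135, 230],
    [11,  18, 205, 240],
]
feature = histogram_feature(image)
```

The resulting vector can be fed to any classifier or matcher; richer descriptors (edges, corners, texture) follow the same pattern of mapping pixels to a fixed-length vector.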
Image compression is a serious issue in computer storage and transmission; it simply makes efficient use of the redundancy embedded within an image itself and, in addition, may exploit the limitations of human vision or perception to discard imperceivable information. Polynomial coding is a modern image compression technique based on a modelling concept that effectively removes the spatial redundancy embedded within the image; it is composed of two parts, the mathematical model and the residual. In this paper, a two-stage technique is proposed: the first stage utilizes a lossy predictor model along with a multiresolution base and thresholding techniques, and the second incorporates near-lossless com…
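The model-plus-residual idea can be sketched with the simplest possible predictor: each sample is predicted from its left neighbor, and the residual is quantized against a threshold, which bounds the reconstruction error (the near-lossless property). This is not the paper's polynomial model; the left-neighbor predictor, sample row, and threshold are stand-ins for illustration.

```python
# Hedged sketch of predictive (model + residual) near-lossless coding:
# predict each sample from its left neighbor, quantize the residual so
# the reconstruction error never exceeds a chosen threshold.
# The sample row and threshold below are illustrative only.

def encode(row, threshold):
    """Return quantized residuals; quantizer step 2t+1 bounds error by t."""
    step = 2 * threshold + 1
    codes, prev = [], 0
    for p in row:
        residual = p - prev               # model: left-neighbor predictor
        q = (residual + threshold) // step
        codes.append(q)
        prev = prev + q * step            # track the decoder's reconstruction
    return codes

def decode(codes, threshold):
    step = 2 * threshold + 1
    row, prev = [], 0
    for q in codes:
        prev = prev + q * step
        row.append(prev)
    return row

row = [100, 102, 105, 130, 128, 90, 91]
t = 2
reconstructed = decode(encode(row, t), t)
```

Because the encoder tracks the decoder's reconstruction (`prev`), quantization errors never accumulate: every reconstructed sample stays within `t` of the original, while the residual stream itself is small and highly compressible.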
For the most reliable and reproducible results when calibrating or testing two immiscible liquids, such as water in engine oil, good emulsification is vital. This study explores the impact of emulsion quality on Fourier transform infrared (FT-IR) spectroscopy calibration standards for measuring water contamination in used or in-service engine oil, in an attempt to strengthen the specific sample-preparation guidelines of the ASTM International standards. Using different emulsification techniques and readily available laboratory equipment, this work attempts to establish the ideal sample preparation technique for reliability, repeatability, and reproducibility in FT-IR analysis, while still considering t…
In the last two decades, networks have changed in response to rapidly changing requirements. Current Data Center Networks (DCNs) have large numbers of hosts (tens of thousands) with special bandwidth needs, as cloud networking and multimedia content computing have grown. Conventional DCNs are strained by the increased number of users and bandwidth requirements, which in turn impose many implementation limitations. Current networking devices, with their coupled control and forwarding planes, result in network architectures that are not suitable for dynamic computing and storage needs. Software Defined Networking (SDN) was introduced to change this notion of traditional networks by decoupling control and…
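The control/forwarding decoupling that SDN introduces can be illustrated with a toy match-action flow table: a "controller" installs rules, and the "switch" forwarding plane only matches packets against them. The rule fields, addresses, and port names below are hypothetical, not any real OpenFlow API.

```python
# Hedged sketch of the SDN idea: the controller installs match-action
# rules; the switch's forwarding plane only performs lookups.
# The rule fields and packets below are hypothetical.

class Switch:
    def __init__(self):
        self.flow_table = []              # list of (match_dict, action)

    def install_rule(self, match, action):
        """Called by the controller (control plane)."""
        self.flow_table.append((match, action))

    def forward(self, packet):
        """Data plane: first matching rule wins; default action is drop."""
        for match, action in self.flow_table:
            if all(packet.get(k) == v for k, v in match.items()):
                return action
        return "drop"

sw = Switch()
# Controller pushes policy down into the switch's flow table.
sw.install_rule({"dst": "10.0.0.2"}, "out_port_1")
sw.install_rule({"dst": "10.0.0.3", "proto": "tcp"}, "out_port_2")

a1 = sw.forward({"src": "10.0.0.9", "dst": "10.0.0.2", "proto": "udp"})
a2 = sw.forward({"src": "10.0.0.9", "dst": "10.0.0.3", "proto": "tcp"})
a3 = sw.forward({"src": "10.0.0.9", "dst": "10.0.0.4", "proto": "tcp"})
```

Because policy lives in the controller rather than in each device's firmware, the same switch can be repurposed for new traffic patterns simply by installing different rules, which is the flexibility conventional DCNs lack.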
Abstract:
The research seeks to identify the role of the International Assurance Standard (3402) in the auditor's procedures. The importance of the research stems from the provision of assurance services over control tools through reports prepared according to this standard, which contribute to strengthening audit procedures via a proposed assurance program. Many conclusions were reached, the most important of which is that assurance engagements are considered operations with a special assignme…
Abstract
Intellectual capital is an important variable in the success equation of economic units seeking to achieve a competitive advantage, since it is real capital for the economic unit and of strategic importance as the main source of high profitability. This highlights the importance of presenting the main components of intellectual capital: human capital, structural capital, and relational capital. Through the effective and prominent role it plays within the economic unit in achieving a sustainable competitive advantage, intellectual capital contributes to attracting investors, whose investment decision is among the most important and difficult decisions ta…