Tourism plays an important role in Malaysia's economic development, as it boosts business opportunities in the surrounding economy, so applying data mining to tourism data is a promising way to predict areas of business opportunity. Data mining is the process of taking data as input and producing knowledge as output. Travel within Asian countries has grown in recent years, and many entrepreneurs have started their own businesses; however, problems such as investing in the wrong business fields and poor service quality have reduced their business income. The objective of this paper is to use data mining technology to meet the business and customer needs of tourism enterprises and to find the most effective data mining technique. To that end, four data mining classification techniques were applied to extract important insights from a tourism data set, with the aim of identifying the best-performing algorithm among those compared and thereby improving business opportunities in tourism-related fields. The percentages of correctly classified instances for the four classifiers were JRip (84.09%), Random Tree (83.66%), J48 (85.50%), and REP Tree (82.47%). All results are analyzed and discussed in this paper.
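A comparison of this kind can be sketched with scikit-learn. Note the caveats: JRip, J48, and REP Tree are Weka algorithms with no scikit-learn implementations, so the trees below are hypothetical analogues (an entropy-based tree standing in for J48, a randomized tree for Random Tree, a cost-complexity-pruned tree for REP Tree), and the data set is synthetic rather than the paper's tourism data:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier, ExtraTreeClassifier

# Synthetic stand-in for the tourism data set
X, y = make_classification(n_samples=500, n_features=10, random_state=0)

models = {
    "entropy tree (stand-in for J48)": DecisionTreeClassifier(
        criterion="entropy", random_state=0),
    "randomized tree (stand-in for Random Tree)": ExtraTreeClassifier(
        random_state=0),
    "pruned tree (stand-in for REP Tree)": DecisionTreeClassifier(
        ccp_alpha=0.01, random_state=0),
}

# 10-fold cross-validated accuracy, as is conventional in such comparisons
scores = {name: cross_val_score(m, X, y, cv=10).mean()
          for name, m in models.items()}
for name, acc in scores.items():
    print(f"{name}: {acc:.2%}")
```

The same loop structure carries over to any set of classifiers; the Weka originals could be run identically through the Weka Explorer's 10-fold cross-validation.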
The research aims at measuring the impact of visual media on the development of the tourism services sector through a field study of the public. It seeks to determine the impact of visual media on tourism sector development, to clarify the concept of visual media and its functions, and to study the role of the media in developing a tourism culture among the public. The sample consisted of (120) male and female employees of the University of Baghdad. A questionnaire of (21) questions was prepared for this purpose and distributed to the sample. The data were analyzed and the hypotheses tested using the statistical program SPSS to process the results and calculate frequencies, percentages, and correlations.
The current study aims to compare estimates of the Rasch model's parameters between missing and completed data under various ways of processing the missing data. To achieve this aim, the researcher followed these steps: preparing the Philip Carter test of spatial capacity, which consists of (20) items, for a group of (250) sixth scientific-stage students in the Baghdad Education directorates of Al-Rusafa (1st, 2nd and 3rd) for the academic year (2018-2019). The researcher then relied on a one-parameter model to analyze the data and used the BILOG-MG3 program to check the hypotheses and data and to match them with the model.
This paper introduces a relationship between the independence of the polynomials associated with the links of a network and the Jacobian determinant of these polynomials. It also presents a way to simplify a given communication network through an algorithm that splits the network into subnets and reintegrates them into a network that is a general representation, or model, of the studied network. This model is represented through a combination of polynomial equations and uses Groebner bases to reach a new simplified network equivalent to the given one, which may make studying the solvability of the network coding problem less expensive and much easier.
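The Jacobian criterion behind this relationship can be illustrated with SymPy: for n polynomials in n variables (over characteristic zero), algebraic independence is equivalent to the Jacobian determinant not being identically zero. The polynomials below are illustrative examples, not taken from the paper's network construction:

```python
import sympy as sp

x, y, z = sp.symbols("x y z")

def jacobian_det(polys, variables):
    """Determinant of the Jacobian matrix of polys w.r.t. variables."""
    J = sp.Matrix([[sp.diff(p, v) for v in variables] for p in polys])
    return sp.simplify(J.det())

# Independent set: determinant is a nonzero polynomial
det_indep = jacobian_det([x + y, x*y, z**2], (x, y, z))

# Dependent set: the third polynomial is the product of the first two,
# so the determinant vanishes identically
det_dep = jacobian_det([x + y, x - y, (x + y)*(x - y)], (x, y, z))

print(det_indep)   # a nonzero polynomial in x, y, z
print(det_dep)     # 0
```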
Steganography is a technique for concealing secret data within other everyday files of the same or a different type, and hiding data has become essential to digital information security. This work aims to design a stego method that can effectively hide a message inside the frames of a video file. A video steganography model is proposed by training a model to hide a video (or images) within another video using convolutional neural networks (CNNs). Using a CNN in this approach achieves a main goal of any steganographic method: increased security (difficulty of being observed and broken by steganalysis programs), which is achieved in this work because the weights and architecture are randomized.
This paper aims to prove an existence theorem for a Volterra-type equation in a generalized G-metric space, called the -metric space, where the fixed-point theorem in -metric spaces is discussed along with its application. First, a new contraction of Hardy-Rogers type is presented, and a fixed-point theorem is then established for these contractions in the setting of -metric spaces. As an application, an existence result for the Volterra integral equation is obtained.
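For reference, a Hardy-Rogers type contraction on an ordinary metric space $(X, d)$ with self-map $T$ is usually stated as follows; the paper's version adapts this condition to its generalized metric setting:

```latex
d(Tx, Ty) \le \alpha\, d(x, y) + \beta\, d(x, Tx) + \gamma\, d(y, Ty)
            + \delta\, d(x, Ty) + \eta\, d(y, Tx)
\quad \text{for all } x, y \in X,
```

with nonnegative constants satisfying $\alpha + \beta + \gamma + \delta + \eta < 1$; when $X$ is complete, $T$ then has a unique fixed point. Setting $\beta = \gamma = \delta = \eta = 0$ recovers the classical Banach contraction.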
This research aims to address the most recent international standard in the field of insurance contracts, International Financial Reporting Standard (IFRS 17): the theoretical framework of the standard, its most important characteristics, and the paragraphs of the modern standard, together with the challenges of its application in general. Using an (inputs - operations - outputs) approach, it presents the challenges of applying the standard in the Iraqi environment, specifically in the (government) companies of the Iraqi insurance sector. The research is based on the main premise that the identification of the requirements for the application of the International Fin
Abstract
The research compared two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the Downhill Simplex algorithm. Two data cases were considered: the first assumed the original (uncontaminated) data, while the second assumed data contamination. Simulation experiments were conducted for different sample sizes, different initial parameter values, and different levels of contamination. The Downhill Simplex algorithm was found to be the best method for estimating the parameters, the probability function, and the reliability function of the compound distribution in cases of both natural and contaminated data.
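The Downhill Simplex (Nelder-Mead) approach to maximum likelihood can be sketched with SciPy. As a hedged simplification, a plain two-parameter Weibull likelihood stands in for the four-parameter compound exponential Weibull-Poisson distribution, and the data are simulated rather than taken from the paper's experiments:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

rng = np.random.default_rng(42)

# Simulated sample with known (illustrative) true parameters
true_shape, true_scale = 1.5, 2.0
data = weibull_min.rvs(true_shape, scale=true_scale, size=2000,
                       random_state=rng)

def neg_log_lik(params):
    """Negative log-likelihood of a two-parameter Weibull."""
    shape, scale = params
    if shape <= 0 or scale <= 0:
        return np.inf          # keep the simplex inside the valid region
    return -np.sum(weibull_min.logpdf(data, shape, scale=scale))

# Downhill Simplex = Nelder-Mead: derivative-free, needs only a start point
res = minimize(neg_log_lik, x0=[1.0, 1.0], method="Nelder-Mead")
shape_hat, scale_hat = res.x
print(shape_hat, scale_hat)
```

Because Nelder-Mead uses no derivatives, the same loop works unchanged for the compound distribution's likelihood, which is harder to differentiate; only `neg_log_lik` would need replacing.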
This paper discusses estimating the two scale parameters of the Exponential-Rayleigh distribution for singly Type-I censored data, one of the most important right-censoring schemes, using the maximum likelihood estimation method (MLEM), one of the most popular and widely used classical methods, based on an iterative procedure such as Newton-Raphson to find estimates of these two scale parameters. Real COVID-19 data were taken from the Iraqi Ministry of Health and Environment, AL-Karkh General Hospital. The duration of the study was the interval 4/5/2020 until 31/8/2020, equivalent to 120 days, and the number of patients who entered the (study) hospital gives a sample size of (n = 785). The number o
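The Newton-Raphson procedure for censored maximum likelihood can be sketched in a simplified setting: a one-parameter exponential model (standing in for the two-parameter Exponential-Rayleigh distribution) under Type-I censoring at a fixed study end of 120 days, with a synthetic sample of n = 785 mirroring the study's size. The hazard rate and data below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)
true_rate = 0.03           # hypothetical hazard rate (per day)
c = 120.0                  # Type-I censoring point: fixed study end

# Latent event times; anything past day 120 is only known to exceed it
x = rng.exponential(1.0 / true_rate, size=785)
t = np.minimum(x, c)       # observed (possibly censored) times
d = int(np.sum(x <= c))    # number of uncensored events

# Log-likelihood: l(lam) = d*log(lam) - lam*sum(t)
# Newton-Raphson on the score function l'(lam) = d/lam - sum(t)
lam = 0.01                 # starting value
for _ in range(50):
    score = d / lam - t.sum()      # first derivative l'(lam)
    hess = -d / lam**2             # second derivative l''(lam)
    lam -= score / hess            # Newton-Raphson update

# For this simple model the MLE also has a closed form, d / sum(t),
# which lets us check that the iteration converged correctly
closed_form = d / t.sum()
print(lam, closed_form)
```

For the Exponential-Rayleigh likelihood the score and Hessian have no closed-form root, so the same iteration is run on a two-dimensional parameter vector instead.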
The research draws its importance from identifying the methods of earnings management used to mislead financial statements, which in turn affect the decisions of the parties that rely on those reports, and then from the models that help auditors detect those methods and their risks. The (margin of excess cash) index was used to detect earnings management practices in a group of (23) banks listed on the Iraqi market for securities, comprising (12) commercial banks and (11) Islamic banks, and the results for commercial banks were compared with those for Islamic banks. The research started from the hypothesis that the use of the (excess cash margin) model in the banking sector reveals the management
The aim of this research is to identify the extent to which conventional and Islamic banks are committed to implementing the requirements of corporate governance in their financial reports, in addition to their commitment to transparency and clarity in dealing with shareholders and stakeholders to protect their interests, and to determine the impact of commitment to corporate governance on assessing the financial performance of the conventional and Islamic banks listed on the Bahrain Stock Exchange.