During the last two decades, audio compression has become the subject of much research because of its direct impact on storage capacity and transmission requirements. The rapid development of the computer industry has increased the demand for high-quality audio data, which in turn makes the development of audio compression technologies increasingly important. Compression methods fall into two categories: lossy and lossless. This paper reviews lossy audio compression techniques and summarizes the importance and uses of each method.
Recommender systems are tools for making sense of the huge amount of data available on the internet. Collaborative filtering (CF) is one of the knowledge-discovery methods most successfully used in recommender systems. Memory-based collaborative filtering relies on the ratings of existing users to predict new items for the target user. Similarity measures are the core operations in collaborative filtering, and prediction accuracy depends largely on the similarity calculations. In this study, a combination of weighted parameters and traditional similarity measures is used to calculate the relationships among users over the MovieLens rating matrix. The advantages and disadvantages of each measure are identified. From the study, a …
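As an illustration of such similarity calculations, here is a minimal Python sketch of the cosine and Pearson user-user measures computed over co-rated items of a toy rating matrix; the data and functions are placeholders, not the weighting scheme used in the study.

```python
import numpy as np

# Toy user-item rating matrix (rows: users, cols: items); 0 = unrated.
# Illustrative data only, not the MovieLens matrix used in the study.
R = np.array([
    [5, 3, 0, 1],
    [4, 0, 0, 1],
    [1, 1, 0, 5],
    [0, 1, 5, 4],
], dtype=float)

def cosine_sim(u, v):
    """Cosine similarity over items rated by both users."""
    mask = (u > 0) & (v > 0)
    if not mask.any():
        return 0.0
    a, b = u[mask], v[mask]
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def pearson_sim(u, v):
    """Pearson correlation over co-rated items (mean-centred cosine)."""
    mask = (u > 0) & (v > 0)
    if mask.sum() < 2:
        return 0.0
    a, b = u[mask] - u[mask].mean(), v[mask] - v[mask].mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom else 0.0

print(cosine_sim(R[0], R[1]), pearson_sim(R[0], R[1]))
```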
The rapid development of telemedicine services and the requirements for exchanging medical information between physicians, consultants, and health institutions have made the protection of patients' information an important priority for any future e-health system. The protection of medical information, including the cover (i.e., the medical image), has requirements that differ slightly from those for protecting other kinds of information. The cover must be preserved with high fidelity because of its importance on the receiving side, where medical staff use it to provide a diagnosis and save a patient's life. If the cover is tampered with, the goal of telemedicine fails. Therefore, this work provides an in…
Administrative procedures in various organizations produce numerous crucial records and data. These records and data are also used in other processes, such as customer relationship management and accounting operations. It is extremely challenging to extract valuable and meaningful information from these data and records because they are frequently enormous and continuously growing in size and complexity. Data mining is the act of sorting through large data sets to find patterns and relationships that might aid in resolving business issues through data analysis. Using data mining techniques, enterprises can forecast future trends and make better business decisions. The Apriori algorithm has been …
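For readers unfamiliar with Apriori, a minimal, self-contained sketch of its frequent-itemset mining loop is shown below; the basket data are hypothetical, and the join step omits the full subset-pruning refinement of the classical algorithm.

```python
from itertools import combinations

def apriori(transactions, min_support=0.5):
    """Minimal Apriori: find all itemsets whose support >= min_support."""
    n = len(transactions)
    transactions = [frozenset(t) for t in transactions]
    # L1 candidates: all single items seen in the data.
    items = {i for t in transactions for i in t}
    current = {frozenset([i]) for i in items}
    frequent = {}
    while current:
        # Count support of each candidate in one pass over the data.
        counts = {c: sum(c <= t for t in transactions) for c in current}
        survivors = {c: v / n for c, v in counts.items() if v / n >= min_support}
        frequent.update(survivors)
        # Join step: build (k+1)-candidates from surviving k-itemsets.
        keys = list(survivors)
        current = {a | b for a, b in combinations(keys, 2)
                   if len(a | b) == len(a) + 1}
    return frequent

baskets = [{"bread", "milk"}, {"bread", "eggs"},
           {"milk", "eggs", "bread"}, {"milk"}]
for itemset, support in apriori(baskets, 0.5).items():
    print(set(itemset), round(support, 2))
```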
Support vector machines (SVMs) are supervised learning models used to analyze data sets in order to classify or predict dependent variables. An SVM is typically used for classification by determining the best separating hyperplane between two classes. However, working with huge datasets raises a number of problems, including time-consuming and inefficient training. This research updates the SVM by employing a stochastic gradient descent method. The new approach, the extended stochastic gradient descent SVM (ESGD-SVM), was tested on two simulated datasets. The proposed method was compared with other classification approaches such as logistic regression, a naive model, k-nearest neighbors, and random forest. The results show that the ESGD-SVM has a …
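As a rough illustration of training a linear SVM with stochastic gradient descent, consider the following sketch; it is a generic SGD-SVM on the regularized hinge loss, not the paper's ESGD-SVM, and the learning rate, regularization strength, and toy data are assumptions.

```python
import numpy as np

def sgd_svm(X, y, lr=0.01, lam=0.01, epochs=50, seed=0):
    """Linear SVM trained by stochastic gradient descent on the
    regularized hinge loss (generic SGD-SVM, not the paper's ESGD-SVM)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        for i in rng.permutation(n):
            margin = y[i] * (X[i] @ w + b)
            if margin < 1:                     # point violates the margin
                w -= lr * (lam * w - y[i] * X[i])
                b += lr * y[i]
            else:                              # only the regularizer acts
                w -= lr * lam * w
    return w, b

# Toy linearly separable data with labels in {-1, +1}.
X = np.array([[2.0, 2.0], [1.5, 1.8], [-1.0, -1.2], [-2.0, -1.5]])
y = np.array([1, 1, -1, -1])
w, b = sgd_svm(X, y)
print(np.sign(X @ w + b))   # expected: [ 1.  1. -1. -1.]
```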
Summary
In this research, we examined factorial experiments and studied the significance of the main effects, factor interactions, and simple effects using the F test (ANOVA) to analyze the data of a factorial experiment. The analysis of variance requires several assumptions to hold; therefore, when one of these conditions is violated, the data are transformed in order to satisfy the conditions of the analysis of variance. However, it was noted that these transformations do not always produce accurate results, so we resort to nonparametric tests or methods that serve as a solution or alternative to the parametric tests. These method…
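As a small illustration of the parametric test alongside one common rank-based alternative (the abstract does not specify which nonparametric methods the study used; the Kruskal-Wallis test is shown here only as a typical example), the following sketch contrasts the two on hypothetical groups.

```python
from scipy import stats

# Three illustrative treatment groups (hypothetical data).
g1 = [12.1, 13.4, 11.8, 12.9]
g2 = [14.2, 15.1, 13.8, 14.7]
g3 = [12.5, 12.0, 13.1, 12.7]

# Parametric one-way ANOVA (assumes normality and equal variances).
f_stat, p_anova = stats.f_oneway(g1, g2, g3)

# Rank-based Kruskal-Wallis test: no normality assumption needed.
h_stat, p_kw = stats.kruskal(g1, g2, g3)

print(f"ANOVA p={p_anova:.4f}, Kruskal-Wallis p={p_kw:.4f}")
```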
This study evaluates the suitability of three interpolation methods in terms of their accuracy for climate data in some provinces of southern Iraq. Two data sets of maximum and minimum temperature for February 2008 were collected from nine meteorological stations located in the south of Iraq and interpolated with the three methods. ArcGIS is used to produce the spatially distributed temperature data using IDW, ordinary kriging, and spline. Four statistical measures are applied to analyze the results obtained from the three interpolation methods: RMSE, RMSE as a percentage of the mean, model efficiency (E), and bias. These showed that ordinary kriging is the best method for this data, based on the results that have been …
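As an illustration of the simplest of the three methods, here is a minimal inverse-distance-weighting sketch; the station coordinates, temperatures, and power parameter are hypothetical placeholders, not the study's data.

```python
import numpy as np

def idw(stations, values, targets, power=2.0):
    """Inverse-distance-weighted interpolation: each target value is a
    weighted mean of station values, with weights 1 / distance**power."""
    stations, values, targets = map(np.asarray, (stations, values, targets))
    out = np.empty(len(targets))
    for k, p in enumerate(targets):
        d = np.linalg.norm(stations - p, axis=1)
        if np.any(d == 0):                 # target coincides with a station
            out[k] = values[d == 0][0]
            continue
        w = 1.0 / d**power
        out[k] = np.sum(w * values) / np.sum(w)
    return out

# Hypothetical station coordinates (x, y) and February temperatures (deg C).
pts  = [(0, 0), (10, 0), (0, 10), (10, 10)]
temp = [21.0, 23.5, 19.8, 22.4]
print(idw(pts, temp, [(5, 5), (2, 8)]))
```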
Permeability is an essential parameter in reservoir characterization because it determines hydrocarbon flow patterns and volumes; for this reason, accurate and inexpensive methods for predicting permeability are important, and predictive models of permeability become more attractive as a result.
A Mishrif reservoir in southeastern Iraq has been chosen, and the study is based on data from four wells that penetrate the Mishrif formation. This study discusses several methods for predicting permeability. The conventional approach of developing a relationship between permeability and porosity is one of these strategies. The second technique uses flow units and the flow zone indicator (FZI) to predict the permeability of a rock mass u…
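For reference, here is a minimal sketch of the standard FZI relations (RQI = 0.0314·sqrt(k/φ), φz = φ/(1−φ), FZI = RQI/φz, with k in mD and φ as a fraction); the core-plug values are hypothetical, not the study's Mishrif data.

```python
import numpy as np

def flow_zone_indicator(k_md, phi):
    """Standard FZI relations: RQI = 0.0314*sqrt(k/phi),
    phi_z = phi/(1-phi), FZI = RQI/phi_z (k in mD, phi a fraction)."""
    k_md, phi = np.asarray(k_md, float), np.asarray(phi, float)
    rqi = 0.0314 * np.sqrt(k_md / phi)     # reservoir quality index (um)
    phi_z = phi / (1.0 - phi)              # normalized porosity
    return rqi / phi_z

# Hypothetical core plugs: permeability (mD) and porosity (fraction).
k   = [120.0, 5.0, 300.0]
phi = [0.22, 0.12, 0.25]
print(flow_zone_indicator(k, phi))
```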
Abstract
The multiple linear regression model is one of the most important regression models used for analysis in different fields of science, such as business, economics, medicine, and the social sciences. Multicollinearity is a major problem in multiple linear regression; in its simplest form, it causes the estimated model parameters to lose their desirable statistical properties. Another important problem in regression analysis is the presence of high-leverage points in the data, which have undesirable effects on the results of the analysis. In this research, we present some of …
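As an illustration of how multicollinearity is commonly diagnosed (the abstract does not name a specific diagnostic; variance inflation factors are shown here only as a typical choice), consider this sketch on hypothetical, nearly collinear predictors.

```python
import numpy as np
import pandas as pd
from statsmodels.stats.outliers_influence import variance_inflation_factor

# Hypothetical design matrix with two nearly collinear predictors.
rng = np.random.default_rng(1)
x1 = rng.normal(size=100)
x2 = x1 * 0.95 + rng.normal(scale=0.1, size=100)   # nearly collinear with x1
x3 = rng.normal(size=100)
X = pd.DataFrame({"const": 1.0, "x1": x1, "x2": x2, "x3": x3})

# VIF_j = 1 / (1 - R_j^2); values well above 10 flag multicollinearity.
# The constant column is included only so each R_j^2 is computed correctly.
for j, name in list(enumerate(X.columns))[1:]:
    print(name, variance_inflation_factor(X.values, j))
```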
In this research, we studied the non-homogeneous Poisson process, one of the most important statistical topics and one with a role in scientific development, since it relates to events that occur in reality and are modeled as Poisson processes whose occurrence depends on time, whether time-varying or stable. Our research clarifies the non-homogeneous Poisson process and uses one of its models, the exponentiated Weibull model with three parameters (α, β, σ), as a function to estimate the rate of occurrence of earthquakes over time in Erbil Governorate, as the governorate is adjacent to two countries …
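For reference, one standard parameterization of the exponentiated Weibull distribution with parameters (α, β, σ) is given below; the paper's exact NHPP intensity specification may differ.

```latex
F(t;\alpha,\beta,\sigma) = \left[1 - e^{-(t/\sigma)^{\beta}}\right]^{\alpha},
\qquad t > 0,
\qquad
f(t) = \frac{\alpha\beta}{\sigma}\left(\frac{t}{\sigma}\right)^{\beta-1}
       e^{-(t/\sigma)^{\beta}}
       \left[1 - e^{-(t/\sigma)^{\beta}}\right]^{\alpha-1},
\qquad
h(t) = \frac{f(t)}{1 - F(t)}.
```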
Measuring the level of communicative competence in news headlines and the level of stylistic and semantic processing in their formulation requires creating a quantitative scale grounded in the principles of scale construction and their standards. The scientific character of journalism studies lies in the possibility of quantifying journalistic knowledge, i.e., the ability of this knowledge to shift from qualitative language to its equivalent in the language of numbers.
News headlines and their editorial processing are among the forms of journalistic knowledge that should be studied and analyzed stylistically and semantically, with their conclusions drawn and expressed in numbers. Press knowledge is divided into two types: …