Compression is the reduction in size of data in order to save storage space or transmission time. For data transmission, compression can be performed on just the data content or on the entire transmission unit (including header data), depending on a number of factors. In this study, we consider an audio compression method based on text coding, in which an audio file is converted into a text file to reduce the time needed to transfer the data over a communication channel. Approach: we propose two coding methods and optimize the solution using a CFG. Results: we first tested the application with a 4-bit coding algorithm; since its results were not satisfactory, we propose a new approach that converts the audio file into a text file and then compresses the new text file with common compression techniques, namely a 6-bit coding algorithm used to convert the digitized audio file into a text file. This approach achieves a good compression ratio of between 15-35%.
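A minimal sketch of the audio-to-text idea, assuming a base64-style 6-bit mapping and zlib as the "common compression technique" (the paper's exact character table and compressor are not given, and the input file name is hypothetical):

```python
import base64
import zlib

def audio_to_text_6bit(audio_bytes: bytes) -> str:
    # Map every 6 bits of the raw audio to one printable character.
    # base64 is one such 6-bit-per-character code; the paper's own
    # character table may differ.
    return base64.b64encode(audio_bytes).decode("ascii")

def compress_text(text: str) -> bytes:
    # Apply an ordinary text compressor to the encoded file.
    return zlib.compress(text.encode("ascii"))

with open("speech.wav", "rb") as f:     # hypothetical input file
    raw = f.read()

packed = compress_text(audio_to_text_6bit(raw))
print(f"compression ratio: {100 * (1 - len(packed) / len(raw)):.1f}%")
```

Note that the 6-bit expansion (four output characters per three input bytes) must be more than recovered by the text compressor for the overall ratio to come out positive.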
This research deals with the most famous existing definitions of public relations, in an attempt to arrive at a definition that can be added to the existing and widespread ones, especially since major developments have taken place in the concept of public relations and its idiomatic use.
In addition, many definitions of public relations have been restricted to a few limited descriptions; consequently, many of the descriptions given to public relations serve both as definitions and as characteristics of the field.
This research aims at setting and formulating the definitions of public relations; it also examines their credibility in achieving their significance in order to reach a new definition that
This paper deals with a central issue in the field of human communication: monitoring the discourse of incitement, hate speech, and violence in the media, together with its language and its methods. The researcher seeks to provide a scientific framework for the nature of the discourse of incitement, hate speech, and violence, and for the role the media can play in resolving conflicts in their different dimensions, building community peace, and preventing the emergence of conflicts among different parties and in different environments. In this paper, the following themes are discussed:
The roots of the discourse of hatred and incitement
The nature and dimensions of the discourse of incitement and hate speech
The n
The unstable and uncertain nature of natural rubber prices makes them highly volatile and prone to outliers, which can have a significant impact on both modeling and forecasting. To tackle this issue, the author recommends a hybrid model that combines the autoregressive (AR) and generalized autoregressive conditional heteroscedasticity (GARCH) models. The model utilizes the Huber weighting function to ensure the forecast values of rubber prices remain sustainable even in the presence of outliers. The study aims to develop a sustainable model and forecast daily prices over a 12-day period by analyzing 2683 daily price observations for Standard Malaysian Rubber Grade 20 (SMR 20) in Malaysia. The analysis incorporates two dispersion measurements (I
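A minimal sketch of the hybrid idea, assuming an AR(1) mean with a GARCH(1,1) variance fitted with the arch package and a hand-rolled Huber weight on the residuals (the paper's exact estimation procedure, tuning constant, and data file are assumptions):

```python
import numpy as np
from arch import arch_model   # pip install arch

def huber_weights(residuals: np.ndarray, k: float = 1.345) -> np.ndarray:
    # Classic Huber weight: 1 inside the threshold, k/|r| outside,
    # so outliers are down-weighted rather than discarded.
    scale = np.median(np.abs(residuals)) / 0.6745   # robust MAD scale
    r = residuals / scale
    return np.where(np.abs(r) <= k, 1.0, k / np.abs(r))

prices = np.loadtxt("smr20_daily.txt")        # hypothetical SMR 20 series
returns = 100 * np.diff(np.log(prices))       # daily log-returns (%)

# AR(1) mean equation with a GARCH(1,1) conditional variance.
model = arch_model(returns, mean="AR", lags=1, vol="GARCH", p=1, q=1)
fit = model.fit(disp="off")

w = huber_weights(fit.resid[~np.isnan(fit.resid)])
print(f"{(w < 1).sum()} of {w.size} residuals down-weighted as outliers")

# 12-step-ahead forecast, matching the paper's 12-day horizon.
print(fit.forecast(horizon=12).mean.iloc[-1])
```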
The research summarizes the knowledge of the dimensions and denotations of TV advertisement and its constituents, built through a semiotic approach to an ad sample represented by the advertisement of the Zain Kuwait Telecom Company carrying the title "Mr. President". Using Roland Barthes's approach, the analysis starts with the denotative, connotative, and linguistic readings to reach the narrative features and their denotations, treating television advertising as a semiotic and pragmatic discourse in view of the still and moving image, with its efficiency and strength to inform and communicate, its aesthetic and artistic elements, and its informational and effective power in influencing recipients by focusing on narratives and a
A resume is the first impression between you and a potential employer; therefore, its importance can never be underestimated. Selecting the right candidates for a job within a company can be a daunting task for recruiters when they have to review hundreds of resumes. To reduce time and effort, we can use NLTK and Natural Language Processing (NLP) techniques to extract essential data from a resume. NLTK is a free, open-source, community-driven project and the leading platform for building Python programs that work with human language data. To select the best resume according to the company's requirements, an algorithm such as KNN is used. To be selected from hundreds of resumes, your resume must be one of the best. Theref
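A minimal sketch of the pipeline described above, assuming TF-IDF features with NLTK's stop-word list and scikit-learn's KNN classifier; the toy resumes and labels are hypothetical:

```python
import nltk
from nltk.corpus import stopwords
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neighbors import KNeighborsClassifier

nltk.download("stopwords", quiet=True)   # NLTK's English stop-word list

# Hypothetical toy corpus; in practice these would be parsed resume files.
resumes = [
    "python developer with pandas and machine learning experience",
    "accountant skilled in payroll, financial reporting and excel",
    "java spring backend developer building microservices",
    "data scientist with python, sql and deep learning background",
]
labels = ["engineering", "finance", "engineering", "engineering"]

# TF-IDF turns each resume into a sparse feature vector.
vectorizer = TfidfVectorizer(stop_words=stopwords.words("english"))
X = vectorizer.fit_transform(resumes)

# KNN ranks a new resume by its nearest labelled neighbours.
knn = KNeighborsClassifier(n_neighbors=3)
knn.fit(X, labels)

query = vectorizer.transform(["machine learning engineer, python and sql"])
print(knn.predict(query))   # -> ['engineering']
```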
Proposing nonlinear models is one of the most important methods in time series analysis, with wide potential for predicting various phenomena, including physical, engineering, and economic ones, by studying the characteristics of the random disturbances in order to arrive at accurate predictions.
In this study, the autoregressive model with an exogenous variable was built using a threshold as the first method, with two proposed approaches used to determine the best cut-point [out-of-sample predictability (forecasting) and within-sample predictability (prediction), through the threshold point indicator]. Box-Jenkins (B-J) seasonal models are used as a second method based on the principle of the two proposed approaches in dete
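A minimal sketch of a two-regime threshold autoregressive model with one exogenous variable, hand-rolled with least squares and a grid search over candidate thresholds; the simulated data and the grid are illustrative, not the paper's procedure:

```python
import numpy as np

def fit_tarx(y, x, candidates):
    """Two-regime TARX(1): separate AR(1)+exogenous fit on each side of
    a threshold on y[t-1]; keep the threshold with the lowest total SSE."""
    best = (np.inf, None, None, None)
    y_lag, y_cur, x_cur = y[:-1], y[1:], x[1:]
    for r in candidates:
        low, high = y_lag <= r, y_lag > r
        if low.sum() < 10 or high.sum() < 10:   # keep both regimes estimable
            continue
        sse, coefs = 0.0, []
        for mask in (low, high):
            X = np.column_stack([np.ones(mask.sum()), y_lag[mask], x_cur[mask]])
            beta, *_ = np.linalg.lstsq(X, y_cur[mask], rcond=None)
            sse += np.sum((y_cur[mask] - X @ beta) ** 2)
            coefs.append(beta)
        if sse < best[0]:
            best = (sse, r, coefs[0], coefs[1])
    return best

# Simulate a series whose AR coefficient switches at a threshold of 0.
rng = np.random.default_rng(0)
x = rng.normal(size=500)
y = np.zeros(500)
for t in range(1, 500):
    phi = 0.8 if y[t - 1] <= 0 else 0.2
    y[t] = phi * y[t - 1] + 0.5 * x[t] + rng.normal(scale=0.5)

sse, r, lo, hi = fit_tarx(y, x, np.quantile(y, np.linspace(0.15, 0.85, 29)))
print(f"threshold={r:.3f}  low-regime beta={lo}  high-regime beta={hi}")
```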
In recent years, Bitcoin has become the most widely used blockchain platform in business and finance. The goal of this work is to find a viable prediction model that incorporates, and perhaps improves on, a combination of available models. Among the techniques utilized in this paper are exponential smoothing, ARIMA, artificial neural network (ANN) models, and prediction combination models. The study's most obvious finding is that artificial intelligence models improve the results of compound prediction models. The second key finding is that a strong combination forecasting model should be used, one that responds to the multiple fluctuations in the Bitcoin time series and improves the error. Based on the results, the prediction acc
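A minimal sketch of the forecast-combination idea, assuming statsmodels for ARIMA and exponential smoothing, scikit-learn's MLP as the ANN, and inverse-MSE weights over a validation window (the paper's exact models, orders, and weighting scheme are not stated; the data file is hypothetical):

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.holtwinters import ExponentialSmoothing
from sklearn.neural_network import MLPRegressor

def lagged(series, p=5):
    # Rows of p consecutive values predicting the next one.
    X = np.column_stack([series[i:len(series) - p + i] for i in range(p)])
    return X, series[p:]

prices = np.loadtxt("btc_daily.txt")      # hypothetical Bitcoin closes
train, valid = prices[:-30], prices[-30:]

# Individual models.
arima = ARIMA(train, order=(1, 1, 1)).fit()
ets = ExponentialSmoothing(train, trend="add").fit()
X, y = lagged(train)
ann = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                   random_state=0).fit(X, y)

# Forecasts over the validation window (simplified: the ANN repeats its
# one-step-ahead prediction instead of iterating recursively).
f_arima = arima.forecast(len(valid))
f_ets = ets.forecast(len(valid))
f_ann = np.full(len(valid), ann.predict(train[-5:].reshape(1, -1))[0])

# Inverse-MSE weights: more accurate models get more say in the combination.
forecasts = np.vstack([f_arima, f_ets, f_ann])
mse = ((forecasts - valid) ** 2).mean(axis=1)
w = (1 / mse) / (1 / mse).sum()
combined = w @ forecasts
print("combination weights:", w.round(3))
```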
Recently, the spread of fake news and misinformation in most fields has resonated widely across societies. Combating this phenomenon and detecting misleading information manually is tedious, time-consuming, and impractical; it is therefore necessary to rely on artificial intelligence to solve this problem. As such, this study aims to use deep learning techniques to detect Arabic fake news based on an Arabic dataset called the AraNews dataset, which contains news articles covering multiple fields such as politics, economy, culture, and sports. A Hybrid Deep Neural Network has been proposed to improve accuracy. This network focuses on the properties of both the Text-Convolution Neural
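The visible abstract names only the Text-CNN component of the hybrid network, so the following is an illustrative hybrid text classifier in Keras that pairs a convolutional branch with a recurrent one; all hyperparameters are assumptions, not the paper's architecture:

```python
from tensorflow.keras import layers, Model

VOCAB, MAXLEN, EMB = 20000, 200, 128      # illustrative hyperparameters

inp = layers.Input(shape=(MAXLEN,), dtype="int32")
emb = layers.Embedding(VOCAB, EMB)(inp)

# Convolutional branch: n-gram-style local features (Text-CNN).
conv = layers.Conv1D(128, 5, activation="relu")(emb)
conv = layers.GlobalMaxPooling1D()(conv)

# Recurrent branch: longer-range context across the article.
rnn = layers.Bidirectional(layers.LSTM(64))(emb)

# Merge both views and classify real vs. fake.
merged = layers.concatenate([conv, rnn])
hidden = layers.Dense(64, activation="relu")(merged)
out = layers.Dense(1, activation="sigmoid")(hidden)

model = Model(inp, out)
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
model.summary()
```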