Cryptocurrency has become an important participant in financial markets, attracting large investments and broad interest. In this setting, the proposed cryptocurrency price prediction tool offers direction to both enthusiasts and investors in a market shaped by the many complexities of digital currency. Combining feature selection with three forecasting techniques, ARIMA, LSTM, and linear regression, the tool lets users analyze data with artificial intelligence and obtain forecasts for the cryptocurrency market in real time. Users are offered a broad selection of widely traded cryptocurrencies to choose from. By analyzing historical price data with machine learning, the tool produces predictions that equip users with choices and information for their financial decisions. The numerical results also support the effectiveness of the tool, notably a low RMSE of 150.96 for ETH and a minimized normalized RMSE. These quantitative results underline the tool's ability to give precise predictions and improve user interaction in the world of cryptocurrency investment.
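The evaluation metrics cited above can be reproduced directly. A minimal sketch of RMSE and one common convention for normalized RMSE (scaling by the range of the actual series; the abstract does not state which normalization it uses, so this choice is an assumption):

```python
import math

def rmse(actual, predicted):
    """Root-mean-square error between two equal-length series."""
    n = len(actual)
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / n)

def normalized_rmse(actual, predicted):
    """RMSE scaled by the range of the actual series (one common
    convention; other definitions divide by the mean or std instead)."""
    return rmse(actual, predicted) / (max(actual) - min(actual))
```

Scaling by the range makes errors comparable across assets with very different price levels, which matters when the same tool reports accuracy for both BTC-scale and ETH-scale series.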
Food comes after air and water in importance for human survival. It is also the foundation of health and strength: if it is lost or spoiled, people sicken or die and become a heavy burden on themselves and their society. Like other sources of life, food is exposed to various risks, and spoilage arises from countless sources. Some of these dangers result from spontaneity, lack of knowledge, or the interaction of factors beyond the will of producer and consumer, such as pollution of water, air, and the environment and its reflection in the food people consume. However, we cannot deny that some causes of spoilage are intentional and result from advance planning.
This article aims to provide a bibliometric analysis of intellectual capital research published in the Scopus database from 1956 to 2020, tracing the development of scientific activity in the field and shedding light on gaps that can pave the way for future studies. The analysis covers 638 intellectual capital-related papers published in the Scopus database over six decades, drawing upon a bibliometric analysis using VOSviewer. This paper highlights the mainstream of current research in the intellectual capital field by presenting a detailed bibliometric analysis of the trend and development of intellectual capital research over that period, including journals, authors, countries, and institutions.
The current research aims to identify:
- The effect of using active learning on the mathematics achievement of third-grade intermediate students.
- The effect of using active learning on third-grade intermediate students' attitude toward studying mathematics.
To achieve the goals of the research, the researcher formulated the following two null hypotheses:
- There is no statistically significant difference at the (0.05) level of significance between the two mean achievement scores
The objective of the current research is to: 1) prepare a scale measuring the level of e-learning applications, and 2) identify the relationship between e-learning applications and scientific values among students of the Department of Chemistry at the Faculty of Education for Pure Sciences / Ibn Al-Haytham, University of Baghdad. To achieve the research objectives, the researcher used the descriptive approach because of its suitability to the nature of the study objectives. The researcher built a scale for e-learning applications consisting of (40) items on a five-point Likert scale (strongly agree, agree, neutral, disagree, strongly disagree). He also adopted a scale of scientific values, which likewise consists of (40) items on a five-point scale as well
Shadow removal is crucial for robot and machine vision, as the accuracy of object detection is greatly influenced by the uncertainty and ambiguity of the visual scene. In this paper, we introduce a new algorithm for shadow detection and removal based on Gaussian functions of different shapes, orientations, and spatial extents. Here, the contrast information of the visual scene is utilized for shadow detection and removal through five consecutive processing stages. In the first stage, contrast filtering is performed to obtain the contrast information of the image. The second stage involves a normalization process that suppresses noise and generates a balanced intensity at a specific position compared to the neighboring intensities.
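The first two stages can be illustrated with a simplified sketch. This assumes a local mean-difference contrast measure and a global divisive normalization, which are stand-ins for the paper's Gaussian-based formulation, not the authors' exact filters:

```python
def local_contrast(img, k=1):
    """Stage 1 (simplified): contrast map as the absolute difference of
    each pixel from the mean of its (2k+1)x(2k+1) neighborhood,
    clipped at the image borders."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[ny][nx]
                    for ny in range(max(0, y - k), min(h, y + k + 1))
                    for nx in range(max(0, x - k), min(w, x + k + 1))]
            out[y][x] = abs(img[y][x] - sum(vals) / len(vals))
    return out

def normalize(cmap, eps=1e-6):
    """Stage 2 (simplified): divisive normalization by the global maximum,
    balancing intensities and suppressing small noise responses."""
    m = max(max(row) for row in cmap)
    return [[v / (m + eps) for v in row] for row in cmap]
```

A flat region yields zero contrast everywhere, while shadow boundaries (large intensity steps) produce strong responses that survive normalization.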
A new time series model is derived, and the methodology is given in detail. The model is constructed based on measurement criteria, namely the Akaike and Bayesian information criteria. For the new time series model, a new algorithm has been generated, and the forecasting process, one and two steps ahead, is discussed in detail. Some exploratory data analysis is given at the beginning. The best model is selected based on these criteria and is compared with some naïve models. The modified model is applied to a monthly chemical sales dataset (January 1992 to December 2019), downloaded from the United States census (www.census.gov). Ultimately, the forecasted sales
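Selection by information criteria, as described above, can be sketched for least-squares fits. The formulas below are the standard Gaussian-likelihood forms; the candidate values in the usage are hypothetical, not the paper's fitted models:

```python
import math

def aic(n, rss, k):
    """Akaike information criterion for a least-squares fit with
    n observations, residual sum of squares rss, and k parameters."""
    return n * math.log(rss / n) + 2 * k

def bic(n, rss, k):
    """Bayesian information criterion; its log(n) penalty punishes
    extra parameters more heavily than AIC for n > 7."""
    return n * math.log(rss / n) + k * math.log(n)

def select_best(candidates):
    """candidates: list of (name, n, rss, k) tuples.
    Returns the name of the model with the lowest AIC."""
    return min(candidates, key=lambda c: aic(c[1], c[2], c[3]))[0]
```

Note how a model with slightly better fit (lower RSS) but many more parameters can still lose: the penalty term encodes the parsimony the selection procedure is after.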
A multivariate multisite hydrological forecasting model was derived and checked using a case study. The philosophy is to use simultaneously the cross-variable correlations, the cross-site correlations, and the time-lag correlations. The case study involves two variables and three sites: the variables are monthly rainfall and evaporation, and the sites are Sulaimania, Dokan, and Darbandikhan. The model form is similar to a first-order autoregressive model, but in matrix form. A matrix for the different relative correlations mentioned above and another for their relative residuals were derived and used as the model parameters, and a mathematical filter was applied to both matrices to obtain their elements. The application of this model indicates i
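The matrix form of a first-order autoregressive recursion, x_t = A x_{t-1} (plus a residual term, omitted here), can be sketched as follows. The parameter matrix in the usage is hypothetical; in the paper it would hold the filtered cross-variable, cross-site, and lag correlations, and the state vector stacks both variables at all three sites:

```python
def matvec(A, x):
    """Multiply matrix A (given as a list of rows) by vector x."""
    return [sum(a * v for a, v in zip(row, x)) for row in A]

def forecast_ar1(A, x_last, steps=1):
    """Iterate the deterministic part of the matrix AR(1) recursion
    x_t = A @ x_{t-1} for the requested number of steps ahead."""
    x = x_last
    for _ in range(steps):
        x = matvec(A, x)
    return x
```

Because A couples all sites and variables, a single step propagates information across the whole network, which is exactly what a site-by-site univariate model cannot do.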
The purpose of this paper is to model and forecast white oil prices over the period (2012-2019) using GARCH-class volatility models. After showing that the squared returns of white oil exhibit significant long memory in volatility, fractional GARCH models are estimated for the return series and used to forecast the mean and volatility by quasi-maximum likelihood (QML) as the traditional method, while the competing approach is machine learning using Support Vector Regression (SVR). Results show which model is most appropriate for forecasting the volatility, judged by the lowest values of the Akaike and Schwarz information criteria together with the significance of the parameters. In addition, the residuals
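The paper's models are fractional GARCH variants estimated by QML; as a minimal illustration of the volatility recursion underlying this whole model family, here is a plain GARCH(1,1) variance filter. The parameter values in the usage are hypothetical, not estimates from the white oil data:

```python
def garch11_variance(returns, omega, alpha, beta):
    """Conditional variances under GARCH(1,1):
        sigma2_t = omega + alpha * r_{t-1}**2 + beta * sigma2_{t-1}
    The recursion starts from the unconditional variance
    omega / (1 - alpha - beta), which requires alpha + beta < 1."""
    sigma2 = [omega / (1 - alpha - beta)]
    for r in returns:
        sigma2.append(omega + alpha * r * r + beta * sigma2[-1])
    return sigma2
```

Fractional variants replace the short-memory (alpha, beta) recursion with a slowly decaying lag structure, which is what lets them capture the long memory found in the squared returns.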
This article presents the results of an experimental investigation of using carbon fiber-reinforced polymer sheets to enhance the behavior of reinforced concrete deep beams with large web openings in shear spans. A set of 18 specimens was fabricated and tested up to failure to evaluate structural performance in terms of cracking, deformation, and load-carrying capacity. All tested specimens were 1500 mm long, 500 mm deep, and 150 mm wide. The parameters studied were opening size, opening location, and the strengthening factor. Two deep beams were implemented as control specimens without openings and without strengthening. Eight deep beams were fabricated with openings but without strengthening, while