Prediction of daily rainfall is important for flood forecasting, reservoir operation, and many other hydrological applications. Artificial intelligence (AI) algorithms are generally used for stochastic rainfall forecasting but are not capable of simulating the unseen extreme rainfall events that have become common due to climate change. A new model is developed in this study for the prediction of daily rainfall at different lead times based on sea level pressure (SLP), which is physically related to rainfall on land and thus able to predict unseen rainfall events. Daily rainfall of the east coast of Peninsular Malaysia (PM) was predicted using SLP data over the climate domain. Five advanced AI algorithms, namely extreme learning machine (ELM), Bayesian regularized neural networks (BRNNs), Bayesian additive regression trees (BART), extreme gradient boosting (xgBoost), and hybrid neural fuzzy inference system (HNFIS), were used considering the complex relationship of rainfall with sea level pressure. Principal components of the SLP domain correlated with daily rainfall were used as predictors. The results revealed the efficacy of the AI models in predicting daily rainfall one day ahead. The relative performance of the models revealed the higher performance of BRNN, with a normalized root mean square error (NRMSE) of 0.678, compared with HNFIS (NRMSE = 0.708), BART (NRMSE = 0.784), xgBoost (NRMSE = 0.803), and ELM (NRMSE = 0.915). Visual inspection of predicted rainfall during model validation using density-scatter plots and other novel forms of visual comparison confirmed the ability of BRNN to predict daily rainfall one day ahead reliably.
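As a concrete illustration of the skill metric used above, NRMSE can be computed from paired observed and predicted series. The sketch below assumes the common convention of normalizing RMSE by the standard deviation of the observations; the abstract does not state which normalization the study used.

```python
import numpy as np

def nrmse(observed, predicted):
    """Normalized root mean square error: RMSE divided by the standard
    deviation of the observations (one common convention)."""
    observed = np.asarray(observed, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    rmse = np.sqrt(np.mean((observed - predicted) ** 2))
    return rmse / np.std(observed)
```

A perfect prediction gives NRMSE = 0, and values near 1 indicate skill comparable to predicting the observed mean.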
In this paper we investigate some heuristic methods to solve the travelling salesman problem. The discussed methods are the Minimizing Distance Method (MDM), the Branch and Bound Method (BABM), the Tree Type Heuristic Method (TTHM), and the Greedy Method (GRM).
The weak points of MDM are addressed in this paper. The Improved MDM (IMDM) gives better results than the classical MDM and the other discussed methods, while the GRM gives the best time for 5 ≤ n ≤ 500, where n is the number of visited cities.
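The GRM is not detailed in the abstract; a minimal sketch of a typical greedy TSP heuristic (nearest-neighbour tour construction, with hypothetical city coordinates) is:

```python
import math

def greedy_tour(cities):
    """Nearest-neighbour greedy tour: from the current city, always visit
    the closest unvisited city. A common greedy TSP heuristic; the
    paper's GRM may differ in detail."""
    unvisited = set(range(1, len(cities)))
    tour = [0]  # start from an arbitrary first city
    while unvisited:
        last = cities[tour[-1]]
        nxt = min(unvisited, key=lambda i: math.dist(last, cities[i]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour
```

This runs in O(n²) time, which is consistent with the GRM's strong timing for n up to a few hundred.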
The main aim of image compression is to reduce image size so that images can be transmitted and stored; therefore, many methods have appeared to compress images. One of these methods is the Multilayer Perceptron (MLP), an artificial neural network based on the back-propagation algorithm for compressing images. If the algorithm depends only on the number of neurons in the hidden layer, that alone will not be enough to reach the desired results; we also have to take into consideration the standards on which the compression process depends to get the best results. In our research we trained a group of TIFF images of size 256×256 and compressed them by using MLP for each
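A minimal sketch of the bottleneck idea behind MLP image compression, with hypothetical layer sizes (the paper's actual architecture is not stated): an encoder squeezes each image patch through a small hidden layer, and back-propagation trains the network to reconstruct the patch from that compressed code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 8x8 patches of a 256x256 image (64 inputs) pass
# through a 16-neuron hidden layer, a nominal 4:1 compression.
n_in, n_hidden = 64, 16
W1 = rng.normal(0, 0.1, (n_hidden, n_in))   # encoder weights
W2 = rng.normal(0, 0.1, (n_in, n_hidden))   # decoder weights

def encode(patch):
    return np.tanh(W1 @ patch)              # compressed code (16 values)

def decode(code):
    return W2 @ code                        # reconstructed patch (64 values)

# Back-propagation on random stand-in patches (real TIFF patches would
# replace these, normalized to [0, 1]).
lr = 0.05
for _ in range(500):
    x = rng.random(n_in)
    h = np.tanh(W1 @ x)
    err = W2 @ h - x                        # reconstruction error
    W2 -= lr * np.outer(err, h)
    dh = (W2.T @ err) * (1 - h ** 2)
    W1 -= lr * np.outer(dh, x)
```

Only the 16-value codes (plus the decoder weights, shared across patches) need to be stored or transmitted.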
The objective of this study is to apply an artificial neural network (ANN) to the heat transfer analysis of shell-and-tube heat exchangers widely used in power plants and refineries. Practical data were obtained from an industrial heat exchanger operating in the power generation department of the Dura refinery. The commonly used back-propagation (BP) algorithm was used to train and test the networks, with the data divided into three samples (training, validation, and testing data) to bring the model closer to the actual case. Inputs of the neural network include inlet water temperature, inlet air temperature, and mass flow rate of air. Two outputs (exit water temperature to the cooling tower and exit air temperature to the second stage of the air compressor) were taken in the ANN.
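A minimal sketch of the described workflow, with synthetic stand-in data (the refinery measurements are not reproduced here): a three-way split of the samples and a one-hidden-layer network with three inputs and two outputs trained by plain back-propagation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins for the three inputs (inlet water temperature,
# inlet air temperature, air mass flow rate) and the two exit
# temperatures; real plant data would replace these.
n = 300
X = rng.random((n, 3))
Y = np.column_stack([X @ [0.5, 0.3, 0.2], X @ [0.2, 0.6, 0.1]])

# Training / validation / testing split (70 / 15 / 15, an assumption;
# the study does not state its split ratios)
idx = rng.permutation(n)
n_tr, n_va = int(0.7 * n), int(0.15 * n)
tr, va, te = idx[:n_tr], idx[n_tr:n_tr + n_va], idx[n_tr + n_va:]

# One hidden layer of 8 tanh units, trained with batch back-propagation
W1 = rng.normal(0, 0.5, (3, 8))
W2 = rng.normal(0, 0.5, (8, 2))
lr = 0.05
for _ in range(500):
    H = np.tanh(X[tr] @ W1)
    E = H @ W2 - Y[tr]                      # prediction error
    W2 -= lr * H.T @ E / len(tr)
    dH = (E @ W2.T) * (1 - H ** 2)
    W1 -= lr * X[tr].T @ dH / len(tr)

val_mse = np.mean((np.tanh(X[va] @ W1) @ W2 - Y[va]) ** 2)
```

The held-out validation and testing samples guard against overfitting to the training data, which is the point of the three-way split described in the abstract.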
Research on the automated extraction of essential data from an electrocardiography (ECG) recording has been a significant topic for a long time. The main focus of digital processing is to measure the fiducial points that determine the beginning and end of the P, QRS, and T waves based on their waveform properties. The presence of unavoidable noise during ECG data collection and inherent physiological differences among individuals make it challenging to accurately identify these reference points, resulting in suboptimal performance. This is done through several primary stages that rely on preliminary processing of the ECG electrical signal through a set of steps (preparing raw data and converting them into files tha
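As a hedged illustration of fiducial-point measurement, the toy detector below locates R peaks by simple amplitude thresholding with a refractory period. Real pipelines (e.g. Pan-Tompkins) add band-pass filtering, differentiation, squaring, and adaptive thresholds; the paper's exact stages are not reproduced here.

```python
import numpy as np

def detect_r_peaks(signal, fs, threshold_ratio=0.6, refractory_s=0.2):
    """Return sample indices of R-peak candidates: local maxima above a
    fixed fraction of the signal maximum, separated by at least a
    refractory period (0.2 s is a common physiological lower bound)."""
    threshold = threshold_ratio * np.max(signal)
    refractory = int(refractory_s * fs)
    peaks, last = [], -refractory
    for i in range(1, len(signal) - 1):
        if (signal[i] > threshold and signal[i] >= signal[i - 1]
                and signal[i] > signal[i + 1] and i - last >= refractory):
            peaks.append(i)
            last = i
    return peaks
```

Once R peaks are fixed, the onsets and offsets of the P, QRS, and T waves are typically searched in windows anchored to each R peak.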
The investigation of signature validation is crucial to the field of personal authenticity. Biometrics-based systems have been developed to support some information security features. A person's signature, an essential biometric trait of a human being, can be used to verify their identity. In this study, a mechanism for automatically verifying signatures is suggested. The offline properties of handwritten signatures are highlighted in this study, which aims to verify whether handwritten signatures are genuine or forged using computer-based machine learning techniques. The main goal of developing such systems is to verify people through the validity of their signatures. In this research, images of a group o
Sewer sediment deposition is an important aspect as it relates to several operational and environmental problems. It concerns municipalities as it affects the sewer system and contributes to sewer failure, which has a catastrophic effect if it happens in trunks or interceptors. Sewer rehabilitation is a costly process and complex in terms of choosing the method of rehabilitation and the individual sewers to be rehabilitated. For such a complex process, inspection techniques assist in the decision-making process, though they may add to the total expenditure of the project as they require special tools and trained personnel. For developing countries, inspection costs could prohibit the rehabilitation from proceeding. In this study, the researchers propos
This study was conducted in the College of Science, Computer Science Department, University of Baghdad to compare automatic sorting with manual sorting to determine which is more efficient and accurate, as well as the use of artificial intelligence in automatic sorting, which included an artificial neural network, image processing, the study of external characteristics, defects and impurities, and physical characteristics: grading and sorting speed, and fruit weight. The results show the values of impurities and defects. The highest value of the regression is 0.40, the error-approximation algorithm recorded the value 06-1, and the heaviest fruit recorded 138.20 g. Gradin
Three-dimensional (3D) reconstruction from images is one of the most beneficial methods of object regeneration, using a photo-realistic approach that can be applied in many fields. In industrial fields, it can be used to visualize cracks within alloys or walls. In medical fields, it has been used as a 3D scanner to reconstruct human organs, such as the internal nose for plastic surgery or the ear canal for fabricating a hearing aid device, among others. These applications need highly accurate details and measurements, which represent the main issue to be taken into consideration; the other issues are cost, movability, and ease of use. This work has presented an approach for the design and construc
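Image-based 3D reconstruction ultimately reduces to triangulating points seen in two or more calibrated views; a minimal linear (DLT) triangulation sketch, with hypothetical camera matrices standing in for a real calibration, is:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two views.
    P1, P2: 3x4 camera projection matrices; x1, x2: (u, v) image
    coordinates of the same point in each view. The homogeneous
    solution is the right singular vector of the smallest singular
    value of the stacked constraint matrix."""
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]          # dehomogenize to (x, y, z)
```

Accuracy of the recovered point depends directly on calibration quality and pixel localization, which is why the measurement accuracy mentioned above is the central design issue.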
In this paper, reliable computational methods (RCMs) based on monomial standard polynomials have been executed to solve the problem of Jeffery-Hamel flow (JHF). In addition, convenient base functions, namely Bernoulli, Euler, and Laguerre polynomials, have been used to enhance the reliability of the computational methods. Using such functions turns the problem into a solvable set of nonlinear algebraic equations that Mathematica® 12 can solve. The JHF problem has been solved with the help of Improved Reliable Computational Methods (I-RCMs), and a review of the methods has been given. Also, published results are used to make comparisons. As further evidence of the accuracy and dependability of the proposed methods, the maximum error remainder
Scheduling timetables for courses in large university departments is a very hard problem and has often been addressed in previous works, although the results are only partially optimal. This work implements the principle of an evolutionary algorithm, using genetic theory to solve the timetabling problem and obtain a random, fully optimal timetable with the ability to generate a multi-solution timetable for each stage in the college. The major idea is to generate course timetables automatically while discovering the area of constraints to get an optimal and flexible schedule with no redundancy through the change of a viable course timetable. The main contribution of this work is indicated by increasing the flexibility of generating opti
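A minimal sketch of the evolutionary idea on a toy timetabling model (the courses, slots, and conflict pairs below are hypothetical; the paper's chromosome encoding and constraint set are not reproduced):

```python
import random

random.seed(0)

# Toy model: assign each of N_COURSES courses to one of N_SLOTS
# timeslots; fitness penalizes conflicting course pairs sharing a slot.
N_COURSES, N_SLOTS = 8, 4
CONFLICTS = [(0, 1), (1, 2), (3, 4), (5, 6), (6, 7)]

def fitness(tt):
    return -sum(1 for a, b in CONFLICTS if tt[a] == tt[b])

def crossover(p1, p2):
    cut = random.randrange(1, N_COURSES)     # single-point crossover
    return p1[:cut] + p2[cut:]

def mutate(tt, rate=0.1):
    return [random.randrange(N_SLOTS) if random.random() < rate else s
            for s in tt]

# Elitist generational loop: keep the best 10, breed 20 offspring
pop = [[random.randrange(N_SLOTS) for _ in range(N_COURSES)]
       for _ in range(30)]
for _ in range(100):
    pop.sort(key=fitness, reverse=True)
    elite = pop[:10]
    pop = elite + [mutate(crossover(*random.sample(elite, 2)))
                   for _ in range(20)]
best = max(pop, key=fitness)
```

Because the population holds many distinct high-fitness chromosomes, the same run naturally yields the multi-solution timetables mentioned in the abstract.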