Currently, one of the topical areas of application of machine learning methods is the prediction of material characteristics. The aim of this work is to develop machine learning models for determining the rheological properties of polymers from experimental stress relaxation curves. The paper presents an overview of the main directions of metaheuristic approaches (local search, evolutionary algorithms) to solving combinatorial optimization problems. Metaheuristic algorithms for several important combinatorial optimization problems are described, with special emphasis on the construction of decision trees. A comparative analysis of algorithms for solving the regression problem with the CatBoost Regressor has been carried out. The object of the study is the generated data sets obtained on the basis of theoretical stress relaxation curves. Tables of initial data for training the models are presented for all samples, and a statistical analysis of the characteristics of the initial data sets is carried out. The total number of numerical experiments for all samples was 346,020 variations. When developing the models, the CatBoost machine learning algorithm was used; regularization methods (Weight Decay, Decoupled Weight Decay Regularization, Augmentation) were applied to improve model accuracy, and the Z-score method was used to normalize the data. As a result of the study, intelligent models were developed to determine the rheological parameters of polymers included in the generalized nonlinear Maxwell-Gurevich equation (initial relaxation viscosity, velocity modulus), using generated data sets for the EDT-10 epoxy binder as an example. Based on the results of testing, the quality of the models was assessed, and graphs of predictions for the training and test samples, as well as graphs of prediction errors, were plotted. The intelligent models are based on the CatBoost algorithm and implemented in the Jupyter Notebook environment in Python. The constructed models passed the quality assessment according to the following metrics: MAE, MSE, RMSE, MAPE. The maximum prediction error was 0.86 for the MAPE metric, and the minimum prediction error was 0.001 for the MSE metric. The model performance estimates obtained during testing are valid.
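As a rough illustration of the pipeline described above (Z-score normalization, a CatBoost regressor, and evaluation with MAE/MSE/RMSE/MAPE), the following is a minimal sketch, not the authors' code; the file path and column names ("time", "stress", "viscosity") are placeholders.

```python
# Minimal sketch: CatBoost regression with Z-score normalization and the
# four metrics named in the abstract. Data columns are hypothetical.
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import (mean_absolute_error, mean_squared_error,
                             mean_absolute_percentage_error)
from catboost import CatBoostRegressor

df = pd.read_csv("relaxation_curves.csv")      # generated data set (placeholder path)
X = df[["time", "stress"]].values              # features from the relaxation curves
y = df["viscosity"].values                     # target rheological parameter

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

scaler = StandardScaler()                      # Z-score normalization
X_train = scaler.fit_transform(X_train)
X_test = scaler.transform(X_test)

model = CatBoostRegressor(iterations=1000, learning_rate=0.05,
                          l2_leaf_reg=3.0, verbose=False)   # L2 term plays the weight-decay role
model.fit(X_train, y_train)

pred = model.predict(X_test)
mse = mean_squared_error(y_test, pred)
print("MAE :", mean_absolute_error(y_test, pred))
print("MSE :", mse)
print("RMSE:", np.sqrt(mse))
print("MAPE:", mean_absolute_percentage_error(y_test, pred))
```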
The Dirichlet process is an important fundamental object in nonparametric Bayesian modelling, applied to a wide range of problems in machine learning, statistics, and bioinformatics, among other fields. This flexible stochastic process models rich data structures with an unknown or evolving number of clusters. It is a valuable tool for encoding the true complexity of real-world data in computer models. Our results show that the Dirichlet process improves, both in distribution density and in signal-to-noise ratio, with larger sample size; achieves a slow decay rate to its base distribution; has improved convergence and stability; and performs much better with a Gaussian base distribution than with a Gamma base distribution. The performance depen
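For readers unfamiliar with how a Dirichlet process generates clusters from a base distribution, the following is an illustrative sketch only (a truncated stick-breaking draw with a Gaussian base, not the study's model or code); the parameter values are arbitrary.

```python
# Truncated stick-breaking draw from DP(alpha, G0) with a Gaussian base G0 = N(0, 1).
# Ties among the sampled atoms correspond to shared clusters.
import numpy as np

def sample_dp(alpha=1.0, truncation=200, rng=np.random.default_rng(0)):
    betas = rng.beta(1.0, alpha, size=truncation)                 # stick-breaking proportions
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - betas[:-1])))
    weights = betas * remaining                                   # mixture weights (sum close to 1)
    atoms = rng.normal(0.0, 1.0, size=truncation)                 # atoms drawn from the Gaussian base
    return weights, atoms

weights, atoms = sample_dp(alpha=2.0)
idx = np.random.default_rng(1).choice(len(weights), size=10, p=weights / weights.sum())
print(atoms[idx])   # 10 observations; repeated values indicate shared clusters
```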
This research aims to study dimension-reduction methods that overcome the curse of dimensionality, which arises when traditional methods fail to provide a good estimation of the parameters, so the problem must be dealt with directly. Two approaches were used to handle high-dimensional data: the first is the non-classical Sliced Inverse Regression (SIR) method together with the proposed Weighted Standard SIR (WSIR) method, and the second is principal component analysis (PCA), the general method used for reducing dimensions. Both SIR and PCA are based on forming linear combinations of a subset of the original explanatory variables, which may suffer from the problem of heterogeneity and the problem of linear
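To make the contrast between the two families of methods concrete, here is a minimal, assumed SIR implementation (not the paper's code or its WSIR variant) next to PCA: SIR uses the response y to form slice means, while PCA ignores y entirely.

```python
# Sketch of classical Sliced Inverse Regression versus PCA on synthetic data.
import numpy as np
from sklearn.decomposition import PCA

def sir_directions(X, y, n_slices=10, n_directions=2):
    n, p = X.shape
    # Whiten X with the inverse square root of its covariance matrix.
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ inv_sqrt
    # Slice the sorted response and average the whitened X within each slice.
    order = np.argsort(y)
    M = np.zeros((p, p))
    for idx in np.array_split(order, n_slices):
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Leading eigenvectors of M give the effective dimension-reduction directions.
    w, v = np.linalg.eigh(M)
    return inv_sqrt @ v[:, ::-1][:, :n_directions]    # back to the original X scale

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))
y = np.sin(X @ np.array([1.0, -1.0, 0.5, 0, 0, 0])) + 0.1 * rng.normal(size=500)
print("SIR directions:\n", sir_directions(X, y))
print("PCA components:\n", PCA(n_components=2).fit(X).components_.T)
```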
In this paper, a comparison is made between the tree regression model and the negative binomial regression model. These models represent two types of statistical methods: the first is a non-parametric method, tree regression, which aims to divide the data set into subgroups; the second is a parametric method, negative binomial regression, which is usually used when dealing with medical data, especially with large sample sizes. The methods are compared according to the mean squared error (MSE), using a simulation experiment and taking different sample
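The comparison described could be set up along the following lines; this is an assumed sketch (not the paper's simulation design), scoring a regression tree and a negative binomial GLM by MSE on simulated overdispersed count data.

```python
# Non-parametric regression tree versus parametric negative binomial GLM,
# compared by out-of-sample MSE on synthetic count data.
import numpy as np
import statsmodels.api as sm
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
n = 1000
X = rng.normal(size=(n, 3))
mu = np.exp(0.5 + X @ np.array([0.4, -0.3, 0.2]))        # log-linear mean
y = rng.negative_binomial(n=5, p=5 / (5 + mu))           # overdispersed counts with mean mu

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=1)

tree = DecisionTreeRegressor(max_depth=4).fit(X_train, y_train)
mse_tree = mean_squared_error(y_test, tree.predict(X_test))

nb = sm.GLM(y_train, sm.add_constant(X_train),
            family=sm.families.NegativeBinomial()).fit()
mse_nb = mean_squared_error(y_test, nb.predict(sm.add_constant(X_test)))

print(f"MSE (regression tree):   {mse_tree:.3f}")
print(f"MSE (negative binomial): {mse_nb:.3f}")
```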
A Stereomicroscopic Evaluation of Four Endodontic Sealers Penetration into Artificial Lateral Canals Using Gutta-Percha Single Cone Obturation Technique, Omar Jihad Banawi*, Raghad
The sports sector in Iraq suffers from poor results in individual and team games in various Asian and international competitions, for many reasons, including the failure to exploit modern, accurate, and flexible technologies and means, especially in the field of information technology and, in particular, artificial neural network technology. The main goal of this study is to build an intelligent mathematical model to predict sporting achievement in men's pole vaulting. The research methodology included the use of five variables as inputs to the neural network, namely the average speed (m/sec) in the 5 meters before the last and in the last 5 meters, the maximum speed achieved in t
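A model of the kind described might look like the following sketch; the feature values and network size are placeholders, not the study's actual variables, data, or architecture.

```python
# Small feed-forward network with five kinematic inputs predicting a pole-vault
# result, trained on synthetic placeholder data for illustration only.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                    # five hypothetical speed-related inputs
y = 4.5 + 0.3 * X[:, 0] + 0.4 * X[:, 1] + 0.1 * rng.normal(size=200)   # cleared height (m), synthetic

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(8, 4), max_iter=5000, random_state=0))
model.fit(X, y)
print("Predicted height for one new athlete:", model.predict(X[:1])[0])
```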
Mass transfer was studied using a rotating cylinder electrode with legs of different lengths acting as turbulence promoters. Two types of rotating cylinder, made of brass, were examined: enhanced cylinder one, with four rectangular extensions 10 mm long, 10 mm wide, and 1 mm thick, and enhanced cylinder two, with four longitudinal extensions 30 mm long, 10 mm wide, and 1 mm thick. The best performance was obtained with enhanced cylinder two at low rotation speeds, while enhanced cylinder one performed best at high rotation speeds. The mass transfer enhancement, compared with a normal rotating cylinder electrode devoid of promoters, is 53% or 58% higher, respectively. The enhancement percentage decreased as rotation speeds increased further, since, seemingly, ful
Rationing is a commonly used solution for shortages of resources and goods that are vital for the citizens of a country. This paper identifies some common approaches and policies used in rationing, as well as the associated risks, in order to suggest a system for rationing fuel that can work efficiently, subsequently addressing all possible security risks and their solutions. The system should theoretically be applicable in emergency situations, requiring less than three months to implement, at a low cost and with minimal changes to infrastructure.
With the increased development in digital media and communication, the need for protection and security methods has become a very important factor, since the exchange and transmission of data over communication channels has led to efforts to protect these data from unauthenticated access.
This paper presents a new method to protect color images from unauthenticated access using watermarking. The watermarking algorithm hides the encoded mark image in the frequency domain using the Discrete Cosine Transform. The main principle of the algorithm is to encode a repeated mark in the cover color image. The watermark image bits are spread by repeating the mark and arranging it in an encoded manner, which gives the algorithm more robustness and security. The propos
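As a rough illustration of DCT-domain embedding with a repeated mark (a sketch only, not the paper's algorithm; block size, coefficient choice, and strength are assumptions):

```python
# Embed a binary mark by nudging one mid-frequency DCT coefficient of each
# 8x8 block of a color channel; repeating the mark across blocks spreads it.
import numpy as np
from scipy.fft import dctn, idctn

def embed_bit(block, bit, strength=8.0):
    coeffs = dctn(block, norm="ortho")
    coeffs[3, 4] += strength if bit else -strength      # mid-frequency coefficient (assumed choice)
    return idctn(coeffs, norm="ortho")

def embed_watermark(channel, bits, strength=8.0):
    out = channel.astype(float).copy()
    h, w = out.shape
    k = 0
    for i in range(0, h - 7, 8):
        for j in range(0, w - 7, 8):
            bit = bits[k % len(bits)]                   # repeat the mark across blocks
            out[i:i+8, j:j+8] = embed_bit(out[i:i+8, j:j+8], bit, strength)
            k += 1
    return np.clip(out, 0, 255).astype(np.uint8)

# Example on a synthetic 64x64 channel with an 8-bit mark.
channel = np.random.default_rng(0).integers(0, 256, (64, 64), dtype=np.uint8)
marked = embed_watermark(channel, bits=[1, 0, 1, 1, 0, 0, 1, 0])
print("Mean absolute change:", np.abs(marked.astype(int) - channel.astype(int)).mean())
```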
The present paper describes and analyses three proposed cogeneration plants, comprising a back-pressure steam-turbine system, a gas-turbine system, and a diesel-engine system, alongside the present Dura refinery plant. Selected actual operating data are employed for the analysis. The same amount of electrical and thermal product output is considered for all systems to facilitate comparison. The theoretical analysis was carried out according to the first and second laws of thermodynamics. The results demonstrate that exergy analysis is a useful tool in the performance analysis of cogeneration systems and permits meaningful comparisons of different cogeneration systems based on their merits; the results also show that the back-pressure steam turbine is more efficient than the other pro
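For context, the first-law and second-law figures of merit typically used in such comparisons are the standard textbook definitions below (not the paper's specific equations):

```latex
% Utilization factor (first law) and exergetic efficiency (second law) of a
% cogeneration plant: \dot{W}_{net} is net power, \dot{Q}_p the process heat
% delivered at temperature T_p, T_0 the dead-state temperature, and
% \dot{m}_f\,LHV (or the fuel exergy \dot{E}x_f) the fuel input.
\[
  \varepsilon_u \;=\; \frac{\dot{W}_{net} + \dot{Q}_p}{\dot{m}_f\,LHV},
  \qquad
  \psi \;=\; \frac{\dot{W}_{net} + \dot{Q}_p\!\left(1 - \dfrac{T_0}{T_p}\right)}{\dot{E}x_f}.
\]
```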