The prediction of material characteristics is currently one of the topical areas of application of machine learning. The aim of this work is to develop machine learning models for determining the rheological properties of polymers from experimental stress relaxation curves. The paper gives an overview of the main metaheuristic approaches (local search, evolutionary algorithms) to solving combinatorial optimization problems; metaheuristic algorithms for several important combinatorial optimization problems are described, with special emphasis on the construction of decision trees. A comparative analysis of algorithms for solving the regression problem with the CatBoost Regressor has been carried out. The object of the study is generated data sets obtained from theoretical stress relaxation curves. Tables of the initial data used to train the models for all samples are presented, and a statistical analysis of the characteristics of the initial data sets is carried out. The total number of numerical experiments for all samples was 346,020 variations. When developing the models, CatBoost methods were used; regularization techniques (Weight Decay, Decoupled Weight Decay Regularization, Augmentation) were applied to improve model accuracy, and the Z-Score method was used to normalize the data. As a result of the study, intelligent models were developed to determine the rheological parameters of polymers that enter the generalized nonlinear Maxwell-Gurevich equation (initial relaxation viscosity, velocity modulus), using generated data sets for the EDT-10 epoxy binder as an example. Based on the testing results, the quality of the models was assessed, and graphs of predictions for the training and test samples, as well as graphs of prediction errors, were plotted. The intelligent models are based on the CatBoost algorithm and implemented in the Jupyter Notebook environment in Python. The constructed models passed the quality assessment according to the following metrics: MAE, MSE, RMSE, and MAPE. The maximum prediction error was 0.86 for the MAPE metric, and the minimum was 0.001 for the MSE metric. The model performance estimates obtained during testing are valid.
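A minimal sketch of this kind of pipeline, assuming the CatBoost and scikit-learn Python packages and using synthetic placeholder arrays in place of the paper's generated stress relaxation data sets:

```python
# Minimal sketch: CatBoost regression with Z-score normalization and the listed metrics.
# The feature matrix X and target y are synthetic placeholders, not the paper's data.
import numpy as np
from catboost import CatBoostRegressor
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import (mean_absolute_error, mean_squared_error,
                             mean_absolute_percentage_error)

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))                              # placeholder features (e.g. points of a relaxation curve)
y = X @ rng.normal(size=5) + 0.1 * rng.normal(size=1000)    # placeholder target (e.g. initial relaxation viscosity)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# Z-score normalization, fitted on the training split only
scaler = StandardScaler()
X_train = scaler.fit_transform(X_train)
X_test = scaler.transform(X_test)

model = CatBoostRegressor(iterations=500, learning_rate=0.05, depth=6, verbose=0)
model.fit(X_train, y_train)
pred = model.predict(X_test)

mae = mean_absolute_error(y_test, pred)
mse = mean_squared_error(y_test, pred)
rmse = np.sqrt(mse)
mape = mean_absolute_percentage_error(y_test, pred)
print(f"MAE={mae:.4f} MSE={mse:.4f} RMSE={rmse:.4f} MAPE={mape:.4f}")
```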
Permeability data is of major importance and must be handled carefully in all reservoir simulation studies. Its importance increases in mature oil and gas fields because of its sensitivity to the requirements of specific improved recovery methods. However, the industry holds a huge volume of air permeability measurements against only a small number of liquid permeability values, owing to the relatively high cost of special core analysis.
The current study suggests a correlation to convert the air permeability data that are conventionally measured during laboratory core analysis into liquid permeability. This correlation provides a feasible estimate in cases of loose and poorly consolidated formations, or in cas
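The paper's proposed correlation is not reproduced here; as a point of reference only, the classical Klinkenberg relationship between gas-measured and liquid (slip-free) permeability, k_air = k_liquid(1 + b/p_m), can be rearranged into a conversion of the following form, where the slippage factor b and the example values are illustrative assumptions:

```python
# Sketch of the classical Klinkenberg correction (not the paper's proposed correlation):
# k_air = k_liquid * (1 + b / p_m), where b is the gas slippage factor and
# p_m is the mean test pressure, so k_liquid = k_air / (1 + b / p_m).
def liquid_permeability(k_air_md: float, b_psi: float, p_mean_psi: float) -> float:
    """Estimate liquid (slip-free) permeability in mD from an air measurement."""
    return k_air_md / (1.0 + b_psi / p_mean_psi)

# Example with illustrative numbers only
print(liquid_permeability(k_air_md=15.0, b_psi=8.0, p_mean_psi=40.0))  # ~12.5 mD
```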
The growing use of digital technologies across various sectors and daily activities has made handwriting recognition a popular research topic. Despite the continued relevance of handwriting, people still require the conversion of handwritten copies into digital versions that can be stored and shared electronically. Handwriting recognition involves a computer's ability to identify and understand legible handwritten input data from various sources, including documents, photographs and other media. Handwriting recognition poses a complex challenge due to the diversity of handwriting styles among individuals, especially in real-time applications. In this paper, an automatic system was designed for handwriting recognition
In this paper, we implement and examine a Simulink model with electroencephalography (EEG) to control multiple actuators based on brain waves. Such a system will be in great demand, since it is useful for individuals who are unable to access control units that require direct physical contact. Initially, ten volunteers spanning a wide age range (20-66 years) participated in this study, and statistical measurements were first calculated for all eight channels. The number of channels was then reduced by half according to the activation of brain regions within the utilized protocol, which also decreased the processing time. Consequently, four of the participants (three males and one female) were chosen to examine the Simulink model during di
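A minimal sketch of the per-channel statistics step, assuming NumPy and a placeholder EEG array; the variance-based channel-selection rule used here is an assumption, not the protocol-specific criterion applied in the paper:

```python
# Sketch: compute simple statistics for 8 EEG channels and keep the 4 most active ones.
# The recording array and the variance-based selection rule are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
eeg = rng.normal(size=(8, 5000))            # 8 channels x 5000 samples (placeholder data)

means = eeg.mean(axis=1)
stds = eeg.std(axis=1)
variances = eeg.var(axis=1)

# Reduce the channel count by half, keeping the channels with the largest variance
keep = np.sort(np.argsort(variances)[-4:])
reduced = eeg[keep, :]
print("kept channels:", keep, "reduced shape:", reduced.shape)
```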
Data security is an important component of data communication and transmission systems. Its main role is to keep sensitive information safe and intact from the sender to the receiver. The proposed system aims to secure text messages through two security principles: encryption and steganography. The system introduces a novel encryption method based on graph theory properties; it forms a graph from a password to generate an encryption key as the weight matrix of that graph, and it employs the Least Significant Bit (LSB) method to hide the encrypted message in the green component of a colored image. Practical experiments on perceptibility, capacity, and robustness were evaluated using similarity measures such as PSNR, MSE, and
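A minimal sketch of the LSB-embedding step alone, assuming NumPy and an 8-bit RGB image array; the graph-based key generation is omitted, and the row-major, 8-bits-per-byte layout is an illustrative assumption:

```python
# Sketch: hide a byte string in the least significant bits of an image's green channel.
import numpy as np

def embed_lsb_green(image: np.ndarray, payload: bytes) -> np.ndarray:
    """Embed payload bits into the LSB of the green channel (channel index 1)."""
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    stego = image.copy()
    green = stego[:, :, 1].ravel()
    if bits.size > green.size:
        raise ValueError("payload too large for this image")
    green[:bits.size] = (green[:bits.size] & 0xFE) | bits
    stego[:, :, 1] = green.reshape(stego[:, :, 1].shape)
    return stego

def extract_lsb_green(stego: np.ndarray, n_bytes: int) -> bytes:
    """Recover n_bytes from the green-channel LSBs."""
    bits = stego[:, :, 1].ravel()[:n_bytes * 8] & 1
    return np.packbits(bits).tobytes()

cover = np.random.randint(0, 256, size=(64, 64, 3), dtype=np.uint8)
message = b"encrypted message"                 # placeholder for the encrypted text
stego = embed_lsb_green(cover, message)
print(extract_lsb_green(stego, len(message)))
```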
Medical image segmentation has been one of the most actively studied fields over the past few decades; with the development of modern imaging modalities such as magnetic resonance imaging (MRI) and computed tomography (CT), physicians and technicians nowadays have to process an increasing number and size of medical images. Therefore, efficient and accurate computational segmentation algorithms are necessary to extract the desired information from these large data sets. Moreover, sophisticated segmentation algorithms can help physicians better delineate the anatomical structures present in the input images, enhance the accuracy of medical diagnosis, and facilitate optimal treatment planning. Many of the proposed algorithms could perform w
The paper subjects neural synchronization to intensive study in order to address the challenges preventing its adoption as an alternative key exchange algorithm. The results obtained from implementing neural synchronization with the proposed system address two challenges: verifying that synchronization has been established between the two neural networks, and publicly initializing the input vector for each party. Solutions are presented and a mathematical model is developed; since the proposed system focuses on a stream cipher, a system of LFSRs (linear feedback shift registers) with a balanced memory has been used to generate the key. The initializations of these LFSRs are the neural weights after achiev
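A minimal sketch of a single Fibonacci LFSR keystream generator, using an illustrative 16-bit tap polynomial and seed; the seeding from synchronized neural weights and the multi-register combiner with balanced memory are not reproduced here:

```python
# Sketch: a single 16-bit Fibonacci LFSR producing a keystream.
# The taps (x^16 + x^14 + x^13 + x^11 + 1) and the seed are illustrative choices.
def lfsr_keystream(seed: int, taps=(16, 14, 13, 11), length: int = 32, width: int = 16):
    state = seed & ((1 << width) - 1)
    if state == 0:
        raise ValueError("LFSR seed must be non-zero")
    out = []
    for _ in range(length):
        out.append(state & 1)                       # output the least significant bit
        fb = 0
        for t in taps:                              # XOR of the tapped bits
            fb ^= (state >> (width - t)) & 1
        state = (state >> 1) | (fb << (width - 1))  # shift right, insert feedback at the top
    return out

print(lfsr_keystream(seed=0xACE1))
```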
The adsorption isotherms and kinetic uptakes of carbon dioxide (CO2) on fabricated electrospun nonwoven activated carbon nanofiber sheets were investigated at two temperatures, 308 K and 343 K, over a pressure range of 1 to 7 bar. The activated carbon nanofibers, based on a polymer (PAN) precursor, were fabricated via an electrospinning technique followed by thermal treatment to obtain the carbonaceous nanofibers. The obtained CO2 adsorption isotherm data were fitted to various models, including Langmuir, Freundlich, and Temkin. Based on the correlation coefficients, the Langmuir isotherm model gave the best fit to the experimental CO2 adsorption isotherm data. Raising the equ
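A minimal sketch of fitting the Langmuir isotherm q = q_m·K·p / (1 + K·p) with SciPy, using synthetic placeholder points rather than the paper's measurements:

```python
# Sketch: fit the Langmuir isotherm q = q_m * K * p / (1 + K * p) to pressure/uptake data.
# The data points below are synthetic placeholders, not the paper's measurements.
import numpy as np
from scipy.optimize import curve_fit

def langmuir(p, q_m, K):
    return q_m * K * p / (1.0 + K * p)

p = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0])    # pressure, bar
q = np.array([0.9, 1.5, 2.0, 2.3, 2.6, 2.8, 2.9])    # CO2 uptake, mmol/g (illustrative)

(q_m, K), _ = curve_fit(langmuir, p, q, p0=(3.0, 0.5))
residuals = q - langmuir(p, q_m, K)
r_squared = 1.0 - np.sum(residuals**2) / np.sum((q - q.mean())**2)
print(f"q_m={q_m:.3f} mmol/g, K={K:.3f} 1/bar, R^2={r_squared:.4f}")
```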
Individuals across different industries, including but not limited to agriculture, drones, pharmaceuticals and manufacturing, are increasingly using thermal cameras to achieve various safety and security goals. This widespread adoption is made possible by advancements in thermal imaging sensor technology. The current literature provides an in-depth exploration of thermography camera applications for detecting faults in sectors such as fire protection, manufacturing, aerospace, automotive, non-destructive testing and structural materials industries. The current discussion builds on previous studies, emphasising the effectiveness of thermography cameras in detecting defects that are undetectable by the human eye. Various methods for defect
In this paper, we deal with the problem of games with fuzzy payoffs under uncertainty in the data. We use the trapezoidal membership function to transform the data into fuzzy numbers and apply three different ranking function algorithms. We then compare these three ranking algorithms using trapezoidal fuzzy numbers, so that the decision maker can obtain the best gains.
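A minimal sketch of one common ranking function for a trapezoidal fuzzy number (a, b, c, d), namely the simple average-based ranking R = (a + b + c + d)/4; this is one standard choice and is not necessarily any of the three ranking algorithms compared in the paper:

```python
# Sketch: rank trapezoidal fuzzy numbers (a, b, c, d) with the average-based
# ranking function R = (a + b + c + d) / 4, then pick the best fuzzy payoff.
from typing import Tuple

TrapezoidalFuzzyNumber = Tuple[float, float, float, float]

def rank(tfn: TrapezoidalFuzzyNumber) -> float:
    a, b, c, d = tfn
    return (a + b + c + d) / 4.0

# Illustrative fuzzy payoffs for two hypothetical strategies
payoffs = {"strategy A": (2.0, 3.0, 4.0, 6.0), "strategy B": (1.0, 4.0, 5.0, 5.5)}
best = max(payoffs, key=lambda s: rank(payoffs[s]))
print({s: rank(t) for s, t in payoffs.items()}, "-> best:", best)
```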