The present work aims to validate the experimental results of a new test rig, built from scratch to evaluate the thermal behavior of a brake system, against the numerical results of the transient thermal problem. The work was divided into two parts. In the first part, a three-dimensional finite-element solution of the transient thermal problem was obtained using a newly developed 3D model of the brake system of the selected vehicle, a SAIPA 131. In the second part, the experimental test rig was built to carry out the tests needed to determine the temperature distribution of the brake system during braking. High agreement was obtained between the results of the new test rig and the numerical results based on the developed model of the brake system. In some cases, local zones of extreme heat were found on the contacting surfaces due to the non-uniformity of the contact pressure during braking; this phenomenon can lead to increased thermal stresses. The most significant factor affecting the level of generated temperatures (heat generation) was found to be the vehicle's initial velocity. Furthermore, the maximum difference between the experimental and numerical results did not exceed 6%.
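The finding that initial velocity dominates heat generation follows from the energy balance of a stop: essentially all of the vehicle's kinetic energy is converted to heat at the friction surfaces, so the heat load scales with the square of the initial speed. A minimal sketch of that relation, using illustrative vehicle figures that are not taken from the paper:

```python
# Illustrative sketch (not from the paper): the heat generated in a complete
# stop equals the dissipated kinetic energy, E = 1/2 * m * v0**2, so doubling
# the initial velocity quadruples the heat load on the brakes.

def braking_heat_joules(mass_kg: float, v0_ms: float) -> float:
    """Kinetic energy converted to heat in a complete stop (drag neglected)."""
    return 0.5 * mass_kg * v0_ms ** 2

# Hypothetical figures for a small passenger car (not measured data):
for v0 in (10.0, 20.0, 30.0):  # m/s
    print(f"v0 = {v0:4.1f} m/s -> heat = {braking_heat_joules(1000.0, v0) / 1e3:7.1f} kJ")
```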
In practical engineering problems, uncertainty exists not only in external excitations but also in structural parameters. This study investigates the influence of uncertainty in structural geometry, elastic modulus, mass density, and section dimensions on the stochastic earthquake response of portal frames subjected to random ground motions. The North-South component of the 1940 El Centro earthquake in California is selected as the ground excitation. Using the power spectral density function, the two-dimensional finite element model of the portal frame’s base motion is modified to account for random ground motions. A probabilistic study of the portal frame structure is then carried out using stochastic finite elements with Monte Carlo simulation.
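The Monte Carlo idea behind such a study can be sketched with a scalar stand-in: sample the uncertain parameter, push each sample through a deterministic response model, and collect statistics of the response. The cantilever formula and all numbers below are hypothetical illustrations, not the paper's 2D frame model:

```python
import random
import statistics

# Minimal Monte Carlo sketch (a scalar stand-in, not the paper's frame model):
# propagate uncertainty in the elastic modulus E through a deterministic
# response function and collect response statistics.

def tip_deflection(E: float, P: float = 1e4, L: float = 3.0, I: float = 8e-6) -> float:
    """Cantilever tip deflection delta = P*L**3 / (3*E*I)."""
    return P * L ** 3 / (3.0 * E * I)

random.seed(0)
E_mean, E_cov = 200e9, 0.10          # hypothetical mean and coefficient of variation
samples = [tip_deflection(random.gauss(E_mean, E_cov * E_mean)) for _ in range(10_000)]

print(f"mean deflection = {statistics.mean(samples) * 1e3:.2f} mm")
print(f"std  deflection = {statistics.stdev(samples) * 1e3:.2f} mm")
```

A stochastic finite element study does the same thing, except each sample evaluates the full frame model under the random ground motion instead of a closed-form deflection.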
In this paper, two local search algorithms, the genetic algorithm and particle swarm optimization, are used to schedule a number of products (n jobs) on a single machine to minimize a multi-objective function composed of total completion time, total tardiness, total earliness, and total late work. A branch and bound (BAB) method is used to compare the results for n jobs ranging from 5 to 18. The results show that the two algorithms find optimal and near-optimal solutions in reasonable time.
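The objective the two metaheuristics minimize can be written as a single evaluator over a job sequence. A hedged sketch, with made-up processing times and due dates; a GA or PSO would search over permutations of the sequence passed to this function:

```python
# Sketch of the multi-objective cost of one job sequence on a single machine:
# total completion time + total tardiness + total earliness + total late work.
# Processing times and due dates are hypothetical illustrative values.

def schedule_cost(sequence, p, d):
    """Sum of completion times, tardiness, earliness and late work for one order."""
    t = 0
    total_c = tardiness = earliness = late_work = 0
    for j in sequence:
        t += p[j]                                 # completion time of job j
        total_c += t
        tardiness += max(0, t - d[j])
        earliness += max(0, d[j] - t)
        late_work += min(p[j], max(0, t - d[j]))  # part of j done after its due date
    return total_c + tardiness + earliness + late_work

p = {0: 3, 1: 2, 2: 4}   # processing times (hypothetical)
d = {0: 4, 1: 2, 2: 10}  # due dates (hypothetical)
print(schedule_cost([1, 0, 2], p, d))  # → 19
```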
In this paper, three weighted residual methods, the Collocation Method (CM), the Least Squares Method (LSM), and the Galerkin Method (GM), are used to solve the thin film flow (TFF) equation. The weighted residual methods were implemented to obtain an approximate solution to the TFF equation. The accuracy of the obtained results is checked by calculating the maximum error remainder functions (MER). Moreover, the outcomes were compared with the fourth-order Runge-Kutta method (RK4), and good agreement was achieved. All evaluations were carried out using Mathematica® 10.
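Since the TFF equation itself is not reproduced in the abstract, the collocation idea can be illustrated on a toy problem instead: pick a polynomial trial function that satisfies the initial condition, and force the residual of the ODE to vanish at chosen collocation points. A minimal sketch for y' + y = 0, y(0) = 1:

```python
import math

# Illustrative collocation sketch (not the paper's TFF equation): trial
# function y(x) = 1 + a*x + b*x**2, residual R(x) = y' + y forced to zero at
# two collocation points on [0, 1].

x1, x2 = 1.0 / 3.0, 2.0 / 3.0        # collocation points (a common choice)

# R(x) = 1 + a*(1 + x) + b*(2*x + x**2) = 0 at x1 and x2 -> 2x2 linear system
a11, a12, r1 = 1 + x1, 2 * x1 + x1 ** 2, -1.0
a21, a22, r2 = 1 + x2, 2 * x2 + x2 ** 2, -1.0
det = a11 * a22 - a12 * a21
a = (r1 * a22 - a12 * r2) / det      # Cramer's rule
b = (a11 * r2 - r1 * a21) / det

y = lambda x: 1 + a * x + b * x ** 2
print(f"y(1) = {y(1):.4f}  (exact exp(-1) = {math.exp(-1):.4f})")
```

The MER-style accuracy check in the paper corresponds to evaluating the maximum of |R(x)| over the interval for the fitted coefficients.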
String matching is seen as one of the essential problems in computer science, and a variety of computer applications provide string matching services to their end users. The remarkable growth in the amount of data created and stored by modern computational devices drives researchers to develop even more powerful methods for coping with this problem. In this research, the Quick Search string matching algorithm is adapted to run in a multi-core environment using OpenMP directives, which can be employed to reduce the overall execution time of the program. English text, protein, and DNA data types are used to examine the effect of parallelizing the Quick Search string matching algorithm on multi-core systems.
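For reference, the sequential Quick Search algorithm shifts the search window using the character just past the window, looking up how far that character is from the end of the pattern. A Python sketch of the core algorithm (the paper's version is in C with OpenMP; the parallel variant would split the text into overlapping chunks and run this search per chunk):

```python
# Sequential sketch of the Quick Search string matching algorithm.

def quick_search(text: str, pattern: str):
    """Return all starting indices of `pattern` in `text`."""
    m, n = len(pattern), len(text)
    # Bad-character shift table: distance from the rightmost occurrence of
    # each pattern character to the position just past the window
    # (characters absent from the pattern get the maximum shift, m + 1).
    shift = {c: m - i for i, c in enumerate(pattern)}
    hits, j = [], 0
    while j <= n - m:
        if text[j:j + m] == pattern:
            hits.append(j)
        if j + m >= n:
            break
        j += shift.get(text[j + m], m + 1)  # shift on the char after the window
    return hits

print(quick_search("GCATCGCAGAGAGTATACAGTACG", "GCAGAGAG"))  # → [5]
```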
This paper provides a four-stage, fourth-order Trigonometrically Fitted Improved Runge-Kutta (TFIRK4) method for solving oscillatory problems, whose solutions contain an oscillatory character. Compared to the traditional Runge-Kutta method, the Improved Runge-Kutta (IRK) method is a natural two-step method requiring fewer steps. The suggested method extends the fourth-order Improved Runge-Kutta (IRK4) method with trigonometric fitting, and is intended for initial value problems (IVPs) whose solutions can be represented by trigonometric set functions. To improve the method's accuracy, the primary frequency of the problem is used. The novel method is more accurate than the conventional Runge-Kutta method.
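The conventional fourth-order Runge-Kutta baseline that such methods are compared against can be sketched as follows; this is classical RK4 applied to an oscillatory IVP, not the TFIRK4 scheme itself, whose coefficients are not given in the abstract:

```python
import math

# Classical fourth-order Runge-Kutta step for a first-order system
# (the comparison baseline, not the trigonometrically fitted method).

def rk4_step(f, t, y, h):
    k1 = f(t, y)
    k2 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k1)])
    k3 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k2)])
    k4 = f(t + h, [yi + h * ki for yi, ki in zip(y, k3)])
    return [yi + h / 6 * (a + 2 * b + 2 * c + d)
            for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]

# Oscillatory IVP y'' = -w**2 * y recast as the system [y, y'], y(0)=1, y'(0)=0
w = 2.0
f = lambda t, y: [y[1], -w * w * y[0]]
y, t, h = [1.0, 0.0], 0.0, 0.01
for _ in range(100):                 # integrate to t = 1
    y = rk4_step(f, t, y, h)
    t += h
print(f"y(1) = {y[0]:.6f}  (exact cos(2) = {math.cos(2.0):.6f})")
```

A trigonometrically fitted scheme exploits knowledge of the frequency w to integrate such problems exactly (or nearly so) for solutions of the form sin(wt), cos(wt).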
The rapid and enormous growth of the Internet of Things, as well as its widespread adoption, has resulted in massive quantities of data that must be processed and sent to the cloud. The delay in processing these data and the time needed to send them to the cloud led to the emergence of fog computing, a new generation of cloud in which the fog serves as an extension of cloud services at the edge of the network, reducing latency and traffic. The distribution of computational resources to minimize makespan and running costs is one of the challenges of fog computing. This paper provides a new approach for improving the task scheduling problem in a Cloud-Fog environment.
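A common baseline for the makespan objective mentioned here is greedy list scheduling: repeatedly assign the longest remaining task to the least-loaded node. This is a hedged sketch of that baseline, not the paper's proposed algorithm, and the task durations are made up:

```python
import heapq

# Hedged baseline sketch (not the paper's method): longest-processing-time
# (LPT) list scheduling, assigning each task to the currently least-loaded
# fog/cloud node to keep the makespan low.

def lpt_schedule(task_times, n_nodes):
    """Return (makespan, task -> node assignment) for LPT list scheduling."""
    loads = [(0.0, i) for i in range(n_nodes)]   # (current load, node id)
    heapq.heapify(loads)
    assignment = {}
    for task, dur in sorted(enumerate(task_times), key=lambda kv: -kv[1]):
        load, node = heapq.heappop(loads)        # least-loaded node
        assignment[task] = node
        heapq.heappush(loads, (load + dur, node))
    return max(load for load, _ in loads), assignment

makespan, assignment = lpt_schedule([7, 5, 4, 3, 2, 2], 2)
print(makespan)  # → 12
```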
The operation and management of water resources projects have direct and significant effects on the optimal use of water. Artificial intelligence techniques are a new tool that helps in making optimized decisions, based on knowledge bases, in the planning, implementation, operation, and management of projects, as well as in controlling flowing water quantities to prevent flooding and in storing excess water for use during droughts.
In this research, an Expert System was designed for operating and managing the system of AthTharthar Lake (ESSTAR). It was applied to all expected flow conditions, including drought, normal flow, and floods, as well as to cases of hypothetical operation.
This research aims to choose the appropriate probability distribution for the reliability analysis of an item, using data collected on the operating and stoppage times of the case study. The appropriate probability distribution is the one for which the data lie on, or close to, the fitted line of the probability plot and pass the goodness-of-fit test. Minitab 17 software was used for this purpose after the collected data were arranged and entered into the program.
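As a minimal stand-in for the distribution-fitting step (the paper uses Minitab), the sketch below fits an exponential model to hypothetical times-between-failure by maximum likelihood and evaluates the resulting reliability function; all data values are invented for illustration:

```python
import math

# Stand-in for the fitting step: exponential model fitted by maximum
# likelihood (the MLE of the mean is simply the sample mean), with the
# reliability function R(t) = exp(-t / MTBF).

tbf_hours = [120, 95, 210, 160, 75, 140]   # made-up times between failures
mtbf = sum(tbf_hours) / len(tbf_hours)     # MLE of the exponential mean

def reliability(t: float) -> float:
    """Probability the item survives beyond t hours under the fitted model."""
    return math.exp(-t / mtbf)

print(f"MTBF = {mtbf:.1f} h, R(100 h) = {reliability(100):.3f}")
```

A probability plot (as in Minitab) would check this choice by verifying the ordered data fall close to a straight line on exponential probability paper.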
In this study, a different design of a passive air Solar Chimney (SC) was tested by installing it in the south wall of an insulated test room in Baghdad city. The SC was designed from vertical and inclined parts connected serially. The vertical SC (first part) has a single pass and a Thermal Energy Storage Box Collector (TESB, refined paraffin wax as Phase Change Material (PCM) with a Copper Foam Matrix (CFM)), while the inclined SC was designed with a single pass, double passes, and a double pass with TESB (semi-refined paraffin wax with copper foam matrix) at selected working angles (30°, 45°, and 60°). A computational model was employed and solved by the Finite Volume Method (FVM) to simulate the air in the SC.
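The finite-volume approach used for such thermal models can be illustrated on a drastically simplified problem: explicit FVM time-marching of 1D transient conduction through a wall slab. All material properties and boundary temperatures below are hypothetical, and the sketch omits the air flow and PCM physics of the actual model:

```python
# Highly simplified 1D stand-in for the finite-volume simulation (the paper
# solves the full air/PCM chimney model): explicit FVM for transient
# conduction dT/dt = alpha * d2T/dx2 with fixed surface temperatures.

alpha, L, N = 1e-6, 0.05, 20            # diffusivity (m^2/s), thickness (m), cells
dx = L / N
dt = 0.4 * dx * dx / alpha              # stable explicit step (< 0.5*dx^2/alpha)
T = [20.0] * N                          # initial temperature (deg C)
T_left, T_right = 60.0, 20.0            # hot outer surface, room side

for _ in range(2000):                   # march in time
    Tn = T[:]
    for i in range(N):
        west = T_left if i == 0 else T[i - 1]
        east = T_right if i == N - 1 else T[i + 1]
        Tn[i] = T[i] + alpha * dt / dx ** 2 * (west - 2 * T[i] + east)
    T = Tn

print(f"mid-wall temperature after {2000 * dt:.0f} s: {T[N // 2]:.1f} deg C")
```

After enough steps the profile approaches the linear steady-state conduction solution between the two surface temperatures.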