In this paper, the effective computational method (ECM) based on standard monomial polynomials has been implemented to solve the nonlinear Jeffery-Hamel flow problem. Moreover, novel effective computational methods have been developed in this study by using suitable base functions, namely the Chebyshev, Bernstein, Legendre, and Hermite polynomials. The use of these base functions converts the nonlinear problem into a nonlinear algebraic system of equations, which is then solved using the Mathematica®12 program. The developed effective computational methods (D-ECM) have been applied to solve the nonlinear Jeffery-Hamel flow problem, and a comparison between the methods is presented. Furthermore, the maximum error remainder has been calculated to demonstrate the reliability of the suggested methods. The results show that the ECM and D-ECM are accurate, effective, and reliable for obtaining approximate solutions to the problem.
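As an illustration of the basic idea only, the following is a minimal Python sketch standing in for the Mathematica®12 implementation: the standard reduced Jeffery-Hamel equation F''' + 2αRe F F' + 4α²F' = 0 with F(0) = 1, F'(0) = 0, F(1) = 0 is assumed, the solution is expanded in Chebyshev polynomials, and collocation reduces the problem to a nonlinear algebraic system. The half-angle, Reynolds number, and basis size are illustrative; the paper's exact ECM/D-ECM construction is not reproduced.

    # Sketch: Chebyshev-basis collocation for the (assumed) reduced Jeffery-Hamel equation
    #   F''' + 2*alpha*Re*F*F' + 4*alpha**2*F' = 0,  F(0)=1, F'(0)=0, F(1)=0
    import numpy as np
    from numpy.polynomial import chebyshev as C
    from scipy.optimize import fsolve

    alpha = np.deg2rad(5.0)   # channel half-angle (illustrative value)
    Re = 50.0                 # Reynolds number (illustrative value)
    n = 10                    # number of Chebyshev coefficients

    eta_c = np.linspace(0.0, 1.0, n - 1)[1:-1]   # n - 3 interior collocation points
    x_c = 2.0 * eta_c - 1.0                      # map eta in [0, 1] to x in [-1, 1]

    def residuals(c):
        d1, d3 = C.chebder(c, 1), C.chebder(c, 3)
        F = C.chebval(x_c, c)
        Fp = C.chebval(x_c, d1) * 2.0            # chain rule: d/d_eta = 2 d/dx
        Fppp = C.chebval(x_c, d3) * 8.0
        ode = Fppp + 2.0 * alpha * Re * F * Fp + 4.0 * alpha**2 * Fp
        bc = [C.chebval(-1.0, c) - 1.0,          # F(0) = 1
              C.chebval(-1.0, d1) * 2.0,         # F'(0) = 0
              C.chebval(1.0, c)]                 # F(1) = 0
        return np.concatenate([ode, bc])

    c0 = np.zeros(n)
    c0[:3] = [0.625, -0.5, -0.125]               # initial guess F ~ 1 - eta**2
    coeffs = fsolve(residuals, c0)
    print("F(eta=0.5) ~", C.chebval(0.0, coeffs))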
The inverse kinematic equation of a robot is very important for controlling the robot's motion and position. Solving this equation is complex for a rigid robot because it depends on the joint configuration and the structure of the robot links. For lightweight robot arms, where flexibility exists, the problem is more complicated than for rigid-link robots because the deformation variables (elongation and bending) appear in the forward kinematic equation. Finding the inverse kinematic equation requires obtaining the relation between the joint angles and both the end-effector position and the deformation variables. In this work, a neural network has been proposed to solve the inverse kinematics problem.
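The Python sketch below illustrates the idea under stated assumptions: a hypothetical planar two-link arm with a single tip-deflection variable stands in for the flexible manipulator, and a small multilayer perceptron (scikit-learn's MLPRegressor) learns the map from end-effector position and deformation variable to joint angles. The link lengths, deflection model, and network size are illustrative choices, not the paper's.

    # Sketch: neural-network inverse kinematics for a hypothetical flexible two-link arm
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    L1, L2 = 1.0, 0.8                       # illustrative link lengths [m]

    def forward(q1, q2, delta):
        """Forward kinematics with a crude bending deflection added along the last link's normal."""
        x = L1*np.cos(q1) + L2*np.cos(q1 + q2) - delta*np.sin(q1 + q2)
        y = L1*np.sin(q1) + L2*np.sin(q1 + q2) + delta*np.cos(q1 + q2)
        return x, y

    # training set: sample joint angles and deformations, compute tip positions
    q1 = rng.uniform(0.0, np.pi/2, 5000)
    q2 = rng.uniform(0.0, np.pi/2, 5000)
    delta = rng.uniform(-0.02, 0.02, 5000)  # small elastic deflection [m]
    x, y = forward(q1, q2, delta)

    X = np.column_stack([x, y, delta])      # inputs: tip pose + deformation variable
    Y = np.column_stack([q1, q2])           # targets: joint angles

    net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
    net.fit(X, Y)

    # query the learned inverse map and check it against the forward model
    q_est = net.predict(np.array([[1.2, 0.9, 0.01]]))[0]
    print("estimated joints:", q_est, "-> tip:", forward(q_est[0], q_est[1], 0.01))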
The purpose of this study was to investigate the effect of a cognitive-behavioral training program in reducing problem-solving difficulties among a sample of university College of Education students. The study sample consisted of (50) students who were randomly assigned to two groups, experimental and control, with (25) students per group. The results of the ANOVA revealed significant differences at (p < 0.05) between the experimental and control groups in problem-solving level, while there were also significant differences between the two groups in achievement. The researchers recommended further studies on other variables after training students in problem-solving methods and stress-reduction techniques.
The flexible job-shop scheduling problem (FJSP) is one of the problem instances arising in flexible manufacturing systems and is considered very complex to control; hence, generating a control system for this problem domain is difficult. FJSP inherits the characteristics of the job-shop scheduling problem and has an additional decision level beyond sequencing, which allows each operation to be processed on any machine among a set of available machines at a facility. In this article, we present an Artificial Fish Swarm Algorithm with Harmony Search for solving the flexible job-shop scheduling problem. It is based on a new harmony improvised from the results obtained by the artificial fish swarm algorithm. This improvised solution is then compared with the overall best solution.
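A compact Python sketch of the harmony-improvisation step on a toy FJSP instance is shown below; in the paper's hybrid the harmony memory would be seeded with solutions produced by the artificial fish swarm algorithm, whereas here it is seeded randomly, and the instance, encoding, and parameters (HMS, HMCR, PAR) are illustrative assumptions.

    # Sketch: Harmony Search improvisation over a tiny flexible job-shop instance
    import random
    random.seed(1)

    # jobs[j][o] = list of (machine, processing_time) alternatives for operation o of job j
    jobs = [[[(0, 3), (1, 5)], [(1, 2), (2, 4)]],
            [[(1, 4), (2, 3)], [(0, 2), (2, 2)]],
            [[(0, 5), (2, 6)], [(0, 3), (1, 3)]]]
    n_machines = 3

    def random_solution():
        seq = [j for j, ops in enumerate(jobs) for _ in ops]
        random.shuffle(seq)                       # operation sequence (job-repetition coding)
        assign = [[random.randrange(len(alts)) for alts in ops] for ops in jobs]
        return seq, assign

    def makespan(seq, assign):
        mach_free = [0] * n_machines
        job_free = [0] * len(jobs)
        next_op = [0] * len(jobs)
        for j in seq:                             # greedy semi-active decoding
            o = next_op[j]; next_op[j] += 1
            m, p = jobs[j][o][assign[j][o]]
            start = max(mach_free[m], job_free[j])
            mach_free[m] = job_free[j] = start + p
        return max(mach_free)

    HMS, HMCR, PAR, iters = 6, 0.9, 0.3, 500
    memory = [random_solution() for _ in range(HMS)]  # would be AFSA output in the hybrid

    for _ in range(iters):
        donor_seq, donor_assign = random.choice(memory)
        seq = list(donor_seq) if random.random() < HMCR else random_solution()[0]
        if random.random() < PAR:                 # pitch adjustment: swap two sequence positions
            i, k = random.sample(range(len(seq)), 2)
            seq[i], seq[k] = seq[k], seq[i]
        assign = [[ai if random.random() < HMCR else random.randrange(len(jobs[j][o]))
                   for o, ai in enumerate(row)] for j, row in enumerate(donor_assign)]
        worst = max(range(HMS), key=lambda i: makespan(*memory[i]))
        if makespan(seq, assign) < makespan(*memory[worst]):
            memory[worst] = (seq, assign)         # replace the worst harmony if improved

    best = min(memory, key=lambda s: makespan(*s))
    print("best makespan found:", makespan(*best))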
The exponential distribution is one of the most common distributions in studies and scientific research, with wide application in the fields of reliability, engineering, and survival analysis; therefore, the researcher has carried out extended studies of the characteristics of this distribution.
In this research, the survival function of the truncated exponential distribution is estimated by the maximum likelihood method, the first and second Bayes methods, the least squares method, and the jackknife method, which depends primarily on the maximum likelihood method and then on the first Bayes method. The estimators are then compared using simulation; to accomplish this task, different sample sizes have been adopted by the researcher.
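As a small illustration, the Python sketch below estimates the survival function of a right-truncated exponential distribution by numerical maximum likelihood, adds a jackknife bias correction based on the MLE, and checks both against the true survival function in a simulation. The truncation point, true rate, and sample size are illustrative, and the Bayes and least-squares estimators of the study are not reproduced.

    # Sketch: MLE and jackknife estimation of the survival function of a
    # right-truncated exponential distribution, checked by simulation
    import numpy as np
    from scipy.optimize import minimize_scalar

    rng = np.random.default_rng(0)
    theta_true, T, n = 1.5, 2.0, 100          # rate, truncation point, sample size

    def sample(size):                          # inverse-CDF sampling on (0, T)
        u = rng.uniform(size=size)
        return -np.log(1.0 - u * (1.0 - np.exp(-theta_true * T))) / theta_true

    def mle(x):                                # numerical MLE of the rate theta
        nll = lambda th: -(len(x)*np.log(th) - th*x.sum()
                           - len(x)*np.log1p(-np.exp(-th*T)))
        return minimize_scalar(nll, bounds=(1e-6, 50.0), method="bounded").x

    def survival(t, th):                       # S(t) within the truncated support
        return (np.exp(-th*t) - np.exp(-th*T)) / (1.0 - np.exp(-th*T))

    x = sample(n)
    th_mle = mle(x)
    loo = np.array([mle(np.delete(x, i)) for i in range(n)])
    th_jack = n*th_mle - (n - 1)*loo.mean()    # jackknife bias-corrected rate

    t = 1.0
    print("true S(1):      ", survival(t, theta_true))
    print("MLE S(1):       ", survival(t, th_mle))
    print("jackknife S(1): ", survival(t, th_jack))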
A new method based on the Touchard polynomials (TPs) was presented for the numerical solution of the linear Fredholm integro-differential equation (FIDE) of the first order and second kind with a given condition. The derivatives and integrals of the TPs are obtained simply. The convergence analysis of the presented method was given, and its applicability was proved by some numerical examples. The results obtained with this method are compared with other known results.
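A minimal Python sketch of the idea is given below for an assumed test problem: the first-order FIDE u'(x) = exp(x) - x + ∫_0^1 x t u(t) dt with u(0) = 1 (exact solution u(x) = exp(x)), solved by expanding u in Touchard polynomials built from Stirling numbers of the second kind and collocating. The test equation, degree, and collocation points are illustrative choices, not the paper's examples.

    # Sketch: Touchard-polynomial collocation for a test first-order Fredholm IDE
    #   u'(x) = exp(x) - x + \int_0^1 x*t*u(t) dt,  u(0) = 1,  exact u(x) = exp(x)
    import numpy as np

    N = 8                                        # highest Touchard polynomial degree

    # Stirling numbers of the second kind S(n, k) by the usual recurrence
    S = np.zeros((N + 1, N + 1))
    S[0, 0] = 1.0
    for n in range(1, N + 1):
        for k in range(1, n + 1):
            S[n, k] = k * S[n - 1, k] + S[n - 1, k - 1]

    touchard = [S[n, :n + 1] for n in range(N + 1)]        # power-series coefficients of T_n
    deriv = [np.polynomial.polynomial.polyder(c) for c in touchard]
    moment = [sum(a / (j + 2) for j, a in enumerate(c))    # \int_0^1 t*T_n(t) dt
              for c in touchard]

    xs = np.linspace(0.0, 1.0, N)                          # collocation points
    A = np.zeros((N + 1, N + 1))
    b = np.zeros(N + 1)
    for i, x in enumerate(xs):                             # FIDE residual at each point
        for k in range(N + 1):
            A[i, k] = np.polynomial.polynomial.polyval(x, deriv[k]) - x * moment[k]
        b[i] = np.exp(x) - x
    A[N, 0] = 1.0                                          # condition u(0) = 1 (T_0 = 1, T_k(0) = 0)
    b[N] = 1.0

    c = np.linalg.solve(A, b)                              # linear algebraic system
    u = lambda x: sum(ck * np.polynomial.polynomial.polyval(x, tk)
                      for ck, tk in zip(c, touchard))
    print("max error on [0,1]:", max(abs(u(x) - np.exp(x)) for x in np.linspace(0, 1, 50)))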
The concept of meaning is one of the most important topics that have occupied the minds of recipients and critics in all the arts, especially the plastic arts. Contemporary plastic art, and sculpture in particular, admits multiple readings, and many critics have differed in their readings and views of the same artistic work.
This research identifies the different works of contemporary Iraqi sculptors by presenting and studying their works, as well as studying the problem of meaning in the artistic achievements of the sculptors in particular. The various parties interested in Iraqi sculpture have not addressed the problem of the objective and subjective meaning of contemporary Iraqi sculpture.
In this paper, the process of compounding two distributions is discussed using a new compounding procedure that connects a number of lifetime (continuous) distributions, where the number of these distributions is a random variable following one of the discrete distributions. Based on this procedure, the zero-truncated Poisson distribution has been compounded with the Weibull distribution to produce a new lifetime distribution with three parameters, whose advantage is that its failure rate function covers many cases (increasing, decreasing, unimodal, bathtub). The properties of the resulting distribution are studied, such as the expectation, variance, cumulative distribution function, reliability function, and failure rate function.
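A Monte Carlo sketch in Python is given below under an explicit assumption about the construction: the compound lifetime is taken to be the minimum of N independent Weibull variables with N following a zero-truncated Poisson distribution, in which case the reliability function has the closed form S(x) = (exp(λ e^{-(x/β)^α}) - 1)/(e^λ - 1). The parameter values are illustrative and the paper's exact compounding may differ.

    # Sketch: compound lifetime T = min(Y_1, ..., Y_N), N ~ zero-truncated Poisson(lam),
    # Y_i ~ Weibull(shape a, scale b); simulation vs. the closed-form reliability function
    import numpy as np

    rng = np.random.default_rng(0)
    lam, a, b = 2.0, 1.5, 1.0                 # ZTP rate, Weibull shape and scale (illustrative)

    def zt_poisson(size):                     # rejection sampling of N >= 1
        n = rng.poisson(lam, size)
        while (n == 0).any():
            n[n == 0] = rng.poisson(lam, (n == 0).sum())
        return n

    N = zt_poisson(100_000)
    T = np.array([rng.weibull(a, k).min() * b for k in N])   # compound lifetimes

    def S_closed(x):                          # closed-form reliability under the assumption above
        return (np.exp(lam * np.exp(-(x / b) ** a)) - 1.0) / (np.exp(lam) - 1.0)

    for x in (0.25, 0.5, 1.0, 1.5):
        print(f"x={x}: simulated S={np.mean(T > x):.4f}  closed form S={S_closed(x):.4f}")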
In this work, results from an optical technique (the laser speckle technique) for measuring surface roughness were obtained using the statistical properties of the speckle pattern from the point of view of computer image texture analysis. Four calibration relationships were used to cover a wide range of measurement with the same laser speckle technique. The first is based on the intensity contrast of the speckle, the second on analysis of the speckle binary image, the third on the size of the speckle pattern spots, and the fourth on the energy feature of the gray-level co-occurrence matrices of the speckle pattern. With these calibration relationships, the surface roughness of an object can be evaluated within the covered measurement range.
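The Python sketch below illustrates two of the four texture measures mentioned (the intensity contrast of the speckle and the energy feature of the gray-level co-occurrence matrix), computed on a synthetic speckle-like image; the experimental calibration relationships that map these measures to roughness values are specific to the setup and are not reproduced.

    # Sketch: speckle contrast and GLCM energy from a synthetic speckle-like image
    import numpy as np

    rng = np.random.default_rng(0)

    # synthetic speckle intensity pattern: |low-pass filtered complex Gaussian field|^2
    field = rng.normal(size=(256, 256)) + 1j * rng.normal(size=(256, 256))
    fy, fx = np.meshgrid(np.fft.fftfreq(256), np.fft.fftfreq(256), indexing="ij")
    lowpass = (np.hypot(fx, fy) < 0.08).astype(float)           # controls speckle grain size
    I = np.abs(np.fft.ifft2(np.fft.fft2(field) * lowpass)) ** 2

    # 1) speckle (intensity) contrast: standard deviation over mean of the intensity
    contrast = I.std() / I.mean()

    # 2) GLCM energy for a horizontal pixel offset, after 16-level quantization
    levels = 16
    q = np.minimum((I / I.max() * levels).astype(int), levels - 1)
    glcm = np.zeros((levels, levels))
    np.add.at(glcm, (q[:, :-1].ravel(), q[:, 1:].ravel()), 1)   # co-occurrence counts
    p = glcm / glcm.sum()
    energy = (p ** 2).sum()

    print(f"speckle contrast: {contrast:.3f}   GLCM energy: {energy:.4f}")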