The Ant System Algorithm (ASA) is a member of the ant colony family of swarm intelligence methods (part of the Artificial Intelligence field), which is inspired by the behavior of ants in a colony seeking a path to a source of food. The aim of this algorithm is to search for an optimal solution to combinatorial optimization problems (COP) for which it is extremely difficult to find solutions using classical methods such as linear and non-linear programming.
The Ant System Algorithm was applied in the field of water resources management in Iraq, specifically to the Haditha Dam, one of the most important dams in Iraq. The target is to find an efficient management system for
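The construct-tours-then-reinforce loop at the heart of the Ant System can be sketched as follows. This is a minimal, generic version applied to a toy travelling-salesman instance; the TSP framing, parameter values (`alpha`, `beta`, `rho`, ant and iteration counts), and function name are illustrative assumptions, not the dam-management formulation used in the study:

```python
import random

def ant_system_tsp(dist, n_ants=10, n_iters=50, alpha=1.0, beta=2.0, rho=0.5, seed=0):
    """Minimal Ant System sketch for a symmetric TSP distance matrix."""
    rng = random.Random(seed)
    n = len(dist)
    tau = [[1.0] * n for _ in range(n)]  # pheromone on every edge
    best_tour, best_len = None, float("inf")
    for _ in range(n_iters):
        tours = []
        for _ in range(n_ants):
            start = rng.randrange(n)
            tour, visited = [start], {start}
            while len(tour) < n:
                i = tour[-1]
                cand = [j for j in range(n) if j not in visited]
                # transition weight: pheromone^alpha * (1/distance)^beta
                w = [(tau[i][j] ** alpha) * ((1.0 / dist[i][j]) ** beta) for j in cand]
                j = rng.choices(cand, weights=w)[0]
                tour.append(j)
                visited.add(j)
            length = sum(dist[tour[k]][tour[(k + 1) % n]] for k in range(n))
            tours.append((tour, length))
            if length < best_len:
                best_tour, best_len = tour, length
        # evaporate, then deposit pheromone inversely proportional to tour length
        for i in range(n):
            for j in range(n):
                tau[i][j] *= (1.0 - rho)
        for tour, length in tours:
            for k in range(n):
                a, b = tour[k], tour[(k + 1) % n]
                tau[a][b] += 1.0 / length
                tau[b][a] += 1.0 / length
    return best_tour, best_len
```

On a small instance such as four cities on a unit square, the pheromone reinforcement quickly concentrates the ants on the shortest cycle.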
Background: One of the major problems in endodontics is micro-leakage of root canal fillings, which might contribute to the failure of endodontic treatment. To avoid this problem, a variety of sealers have been tested. The objective of this in vitro study was to evaluate the shear bond strength to dentin of four resin-based sealers (AH Plus, silver-free AH26, RealSeal SE and Perma Evolution permanent root canal filling material). Materials and Methods: Forty non-carious extracted lower premolars were used. The occlusal 2 mm of each tooth was sectioned to expose the dentin surface. The exposed dentin surfaces of the teeth were washed with 5 ml of 2.5% NaOCl solution followed by 5 ml of 17% EDTA, then rinsed with deionized water to remove
The research aims to build a list of digital citizenship axes, together with the standards and indicators emanating from them, that should be included in the content of the computer textbook prescribed for second-grade intermediate students in Iraq, and to analyze the above-mentioned book according to the same list using the descriptive analytical method (content analysis). The research community and its sample consisted of the content of the computer textbook prescribed for second-year intermediate students for the academic year 2018-2019. The research tool was built in its initial form after consulting a set of specialized literature and previous studies that dealt with topics related to digital citizenship, and the authenticity
The study aims to analyze the content of computer textbooks for the preparatory stage according to logical thinking. The researcher followed the descriptive analytical research approach (content analysis) and adopted the explicit idea as the unit of analysis. A content analysis tool, designed on the basis of the mental processes employed during logical thinking, was used to obtain the study results. The findings revealed that logical thinking skills constituted 52% of the fourth preparatory textbook and 47% of the fifth preparatory textbook.
The objective of this research is to analyze the content of the science textbooks at the elementary level according to the dimensions of sustainable development for the academic year 2015-2016. To achieve this goal, a list of the dimensions of sustainable development to be included in elementary-school science textbooks was built after reviewing a collection of literature, research, and studies. A list covering the three dimensions of sustainable development (social, economic, and environmental) was reached; in its initial form it consisted of (63) sub-issues distributed across the three dimensions. The list was then presented to a group of arbitrators and specialists in curriculum and teaching methods, and thus the list consisted
Several stress-strain models have been used to predict the strengths of steel fiber reinforced concrete, which are distinctive of the material. However, insufficient research has been done on the influence of hybrid fiber combinations (comprising two or more distinct fibers) on the characteristics of concrete. For this reason, the researchers conducted an experimental program to determine the stress-strain relationship of 30 concrete samples reinforced with two distinct fibers (a hybrid of polyvinyl alcohol and steel fibers), with compressive strengths ranging from 40 to 120 MPa. A total of 80% of the experimental results were used to develop a new empirical stress-strain model, which was accomplished through the application of the parti
Entropy, defined as a measure of uncertainty, has been transformed by using the cumulative distribution function and the reliability function of the Burr Type-XII distribution. This applies to data that suffer from volatility, for which a probability distribution model is built for each failure in a sample once the conditions of a probability distribution function are satisfied. The formula of the probability distribution of the new transformed entropy, applied to the continuous Burr Type-XII distribution, has been derived; the new function was tested and found to satisfy the conditions of a probability function. The mean and the cumulative distribution function were also derived so that they could be adopted for generating data for the purpose of running the simulation
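For concreteness, the Burr Type-XII functions the derivation relies on can be written out directly. The sketch below uses the standard two-parameter form F(x) = 1 − (1 + x^c)^(−k); the parameter names `c` and `k` are the conventional ones and are not necessarily the paper's notation:

```python
def burr12_pdf(x, c, k):
    """Burr Type-XII density: f(x) = c*k*x^(c-1) * (1 + x^c)^(-(k+1)), for x > 0."""
    return c * k * x ** (c - 1) * (1 + x ** c) ** (-(k + 1))

def burr12_cdf(x, c, k):
    """Cumulative distribution function: F(x) = 1 - (1 + x^c)^(-k)."""
    return 1.0 - (1.0 + x ** c) ** (-k)

def burr12_reliability(x, c, k):
    """Reliability (survival) function: R(x) = 1 - F(x) = (1 + x^c)^(-k)."""
    return (1.0 + x ** c) ** (-k)

def burr12_quantile(u, c, k):
    """Inverse CDF: solve F(x) = u for x; the usual route for simulating data."""
    return ((1.0 - u) ** (-1.0 / k) - 1.0) ** (1.0 / c)
```

Feeding uniform random numbers through `burr12_quantile` is the standard inverse-transform way to generate Burr Type-XII samples for a simulation study of the kind the abstract describes.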
This research aims to present a proposed model for disclosure and documentation when performing an audit according to the joint audit method, using the questions and principles of the collective intelligence system. This leads to improving and enhancing the efficiency of the joint audit, and thus enhancing the confidence of the parties concerned in the outputs of the audit process. The research problem can be formulated through the following question: "Does the proposed model for disclosure of the role of the collective intelligence system contribute to improving joint auditing?"
The proposed model is designed for the disclosure of joint auditing and the role
Abstract
The research compared two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the Downhill Simplex algorithm. Two data cases were considered: the first assumed the original (uncontaminated) data, while the second assumed data contamination. Simulation experiments were conducted for different sample sizes and initial parameter values, and under different levels of contamination. The Downhill Simplex algorithm was found to be the best method for estimating the parameters, the probability function, and the reliability function of the compound distribution in the cases of both natural and contaminated data.
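As an illustration of the second estimator, a bare-bones Downhill Simplex (Nelder-Mead) minimizer can be pointed at a negative log-likelihood. To keep the example self-contained, the sketch below fits the single rate parameter of an exponential sample rather than the four-parameter compound distribution; the simplified inside-only contraction step and all tuning constants are illustrative assumptions:

```python
import math

def nelder_mead(f, x0, step=0.5, tol=1e-10, max_iter=1000):
    """Minimal Downhill Simplex minimizer with standard reflect/expand/contract moves."""
    n = len(x0)
    # initial simplex: x0 plus one point perturbed along each coordinate
    simplex = [list(x0)] + [
        [x0[j] + (step if j == i else 0.0) for j in range(n)] for i in range(n)
    ]
    for _ in range(max_iter):
        simplex.sort(key=f)
        best, worst = simplex[0], simplex[-1]
        if abs(f(worst) - f(best)) < tol:
            break
        # centroid of all vertices except the worst
        cen = [sum(p[j] for p in simplex[:-1]) / n for j in range(n)]
        refl = [cen[j] + (cen[j] - worst[j]) for j in range(n)]           # reflection
        if f(refl) < f(best):
            exp = [cen[j] + 2.0 * (cen[j] - worst[j]) for j in range(n)]  # expansion
            simplex[-1] = exp if f(exp) < f(refl) else refl
        elif f(refl) < f(simplex[-2]):
            simplex[-1] = refl
        else:
            con = [cen[j] + 0.5 * (worst[j] - cen[j]) for j in range(n)]  # contraction
            if f(con) < f(worst):
                simplex[-1] = con
            else:  # shrink every vertex toward the best one
                simplex = [best] + [
                    [best[j] + 0.5 * (p[j] - best[j]) for j in range(n)]
                    for p in simplex[1:]
                ]
    simplex.sort(key=f)
    return simplex[0]

# Hypothetical usage: maximum-likelihood fit of an exponential rate parameter,
# where the closed-form answer 1/mean(data) is available as a check.
data = [0.5, 1.2, 0.3, 2.0, 0.7]

def neg_log_lik(params):
    lam = params[0]
    if lam <= 0:
        return float("inf")  # keep the search inside the valid parameter region
    return -sum(math.log(lam) - lam * x for x in data)

lam_hat = nelder_mead(neg_log_lik, [1.0])[0]
```

The derivative-free nature of the simplex search is what makes it attractive for likelihoods like the compound exponential Weibull-Poisson, whose score equations are awkward to solve analytically.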