Forest fires continue to increase during the dry season and are difficult to stop. High temperatures during the dry season raise the drought index, so the forest can potentially burn at any time, and the government must therefore conduct surveillance throughout the season. Continuous surveillance without a focus on particular periods is ineffective and inefficient, because preventive measures are carried out without knowledge of the actual fire risk. In the Keetch-Byram Drought Index (KBDI), the Drought Factor formulation only calculates today's drought from current weather conditions and yesterday's drought index; to obtain the drought factor for the following day, weather data for that day are needed. Therefore, an algorithm is needed that can predict the drought factor, so that the periods of greatest fire potential during the dry season can be anticipated. Moreover, a daily prediction is needed so that the best action can be taken each day and effective preventive measures carried out. The method used in this study is the backpropagation algorithm, which is used to calculate, train, and test the drought factor. Using empirical data, part of the data was trained and then tested until 100% of the training data was recognized correctly; other data tested without training yielded a 60% match. In general, the algorithm shows promising results and can be extended with additional supporting variables.
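As a rough illustration of the approach described above, the following sketch trains a one-hidden-layer network with backpropagation to map weather inputs to a next-day drought factor. The feature names, network size, learning rate, and random data are all assumptions for demonstration, not the study's actual configuration.

```python
# Minimal sketch (not the authors' code): a one-hidden-layer network trained with
# backpropagation to map weather features to a next-day drought factor.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training data: [max_temperature, rainfall, yesterday_kbdi] -> drought factor
X = rng.random((100, 3))          # placeholder inputs, scaled to [0, 1]
y = rng.random((100, 1))          # placeholder targets, scaled to [0, 1]

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Network: 3 inputs -> 5 hidden units -> 1 output (sizes are assumptions)
W1 = rng.normal(0, 0.5, (3, 5)); b1 = np.zeros(5)
W2 = rng.normal(0, 0.5, (5, 1)); b2 = np.zeros(1)
lr = 0.1

for epoch in range(5000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass (mean squared error, sigmoid derivatives)
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Gradient-descent updates
    W2 -= lr * h.T @ d_out / len(X); b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * X.T @ d_h / len(X);   b1 -= lr * d_h.mean(axis=0)

# Predict the drought factor for one (hypothetical) weather observation
print(sigmoid(sigmoid(np.array([[0.8, 0.1, 0.6]]) @ W1 + b1) @ W2 + b2))
```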
The first successful implementation of Artificial Neural Networks (ANNs) was published a little over a decade ago, and it is time to review the progress made in this research area. This paper provides a taxonomy for classifying Field Programmable Gate Array (FPGA) implementations of ANNs. Different implementation techniques and design issues are discussed, such as the trade-off between a suitable activation function and the numerical truncation technique, and improvements to the learning algorithm that reduce the cost of a neuron and, in turn, the total cost and speed of the complete ANN. Finally, the implementation of a complete, very fast circuit for the English digit-number pattern NN is presented; the network has four layers of 70 nodes (neurons).
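To make the activation-function and truncation trade-off concrete, the sketch below compares an exact sigmoid with a piecewise-linear approximation truncated to a fixed-point format, the kind of simplification hardware implementations typically evaluate. The breakpoints and bit widths are illustrative assumptions, not values from the paper.

```python
# Illustrative sketch: a piecewise-linear sigmoid approximation with fixed-point
# truncation, showing how precision choices affect the activation accuracy.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def pwl_sigmoid(x):
    # Simple 3-segment piecewise-linear approximation (assumed saturation at +/-4)
    return np.clip(0.5 + 0.125 * x, 0.0, 1.0)

def quantize(x, frac_bits=8):
    # Truncate to a fixed-point representation with `frac_bits` fractional bits
    scale = 1 << frac_bits
    return np.floor(x * scale) / scale

x = np.linspace(-8, 8, 1000)
for bits in (4, 8, 12):
    err = np.max(np.abs(sigmoid(x) - quantize(pwl_sigmoid(x), bits)))
    print(f"{bits} fractional bits: max absolute error = {err:.4f}")
```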
The origin of this technique lies in the analysis of François Quesnay (1694-1774), the leader of the Physiocratic ("naturalist") school, presented in his Tableau Économique. The method was later developed by Karl Marx in his analysis of the relationships between the departments of production and the nature of these relations in his reproduction models. The current form of this type of economic analysis is credited to the Russian economist Wassily Leontief. This analytical model is commonly used in developing economic plans in developing countries (1, p. 86). There are several types of input-output models, such as the static model, the dynamic model, regional models, and so on. However, this research is confined to the open model, which has found wide practical application.
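A minimal worked example of the open static input-output model: given a technical-coefficients matrix A and a final-demand vector d, total sectoral output solves x = (I - A)^-1 d. The two-sector numbers below are invented purely for illustration.

```python
# Hedged illustration of the open static input-output (Leontief) model:
# x = (I - A)^(-1) d, with assumed coefficients and final demand.
import numpy as np

A = np.array([[0.2, 0.3],     # assumed technical coefficients (inputs per unit of output)
              [0.4, 0.1]])
d = np.array([100.0, 150.0])  # assumed final demand for each sector

x = np.linalg.solve(np.eye(2) - A, d)  # total output required from each sector
print(x)
```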
This research discusses the application of Artificial Neural Network (ANN) and Geographical Information System (GIS) models to the water quality of the Diyala River using the Water Quality Index (WQI). Fourteen water parameters were used for estimating the WQI: pH, temperature, dissolved oxygen, orthophosphate, nitrate, calcium, magnesium, total hardness, sodium, sulphate, chloride, total dissolved solids, electrical conductivity, and total alkalinity. These parameters were provided by the Ministry of Water Resources from seven stations along the river for the period 2011 to 2016. The results of the WQI analysis revealed that the Diyala River ranges from good to poor in the north of Diyala province, while it is poor to very polluted south of Baghdad City. The selected parameters were …
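As a hedged sketch of how a weighted water quality index of this kind can be computed, the snippet below applies a weighted-arithmetic WQI to three of the fourteen parameters. The weights, permissible standards, and measured values are placeholders; the exact formulation used in the study may differ.

```python
# Weighted-arithmetic WQI sketch: q_i = 100 * measured_i / standard_i,
# WQI = sum(w_i * q_i) / sum(w_i). All numbers are placeholders, not the paper's data.
measured = {"pH": 7.6, "TDS": 650.0, "Nitrate": 12.0}   # hypothetical observations (subset)
standard = {"pH": 8.5, "TDS": 500.0, "Nitrate": 45.0}   # assumed permissible limits
weight   = {"pH": 0.2, "TDS": 0.5,   "Nitrate": 0.3}    # assumed relative weights

wqi = sum(weight[p] * 100.0 * measured[p] / standard[p] for p in measured) / sum(weight.values())
print(f"WQI = {wqi:.1f}")   # higher values indicate poorer quality in this formulation
```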
Achieving an accurate and optimal rate of penetration (ROP) is critical for a cost-effective and safe drilling operation. Different techniques have been used to achieve this goal, but each approach has limitations, prompting researchers to seek better solutions. The objective of this study is to combine the Bourgoyne and Young (BYM) ROP equations with Bagging Tree regression in a southern Iraqi field. Although the BYM equations are widely used to estimate drilling rates, they require specific drilling parameters and do not capture all the complexities of ROP. The Bagging Tree algorithm, a random-forest variant, addresses these limitations by blending domain knowledge …
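The following sketch shows the kind of bagged-tree regression named above applied to synthetic ROP data. The feature set (depth, weight on bit, rotary speed, mud weight) mirrors typical BYM inputs, and all values are assumptions rather than the field data used in the study.

```python
# Illustrative sketch (not the study's code): predicting ROP with a bagged ensemble
# of decision trees on synthetic drilling data.
import numpy as np
from sklearn.ensemble import BaggingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.random((500, 4))                                      # [depth, WOB, RPM, mud_weight], synthetic and scaled
y = 5 + 3 * X[:, 1] + 2 * X[:, 2] + rng.normal(0, 0.2, 500)   # synthetic ROP values

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = BaggingRegressor(n_estimators=100, random_state=0)    # default base learner is a decision tree
model.fit(X_train, y_train)
print("R^2 on held-out data:", round(model.score(X_test, y_test), 3))
```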
In this research, we study the Non-Homogeneous Poisson process, one of the important statistical topics with a role in scientific development, as it concerns events that occur in reality and are modelled as Poisson processes, because the occurrence of such events depends on time, whether time changes or remains stable. The research clarifies the Non-Homogeneous Poisson process and uses one of its models, the exponentiated Weibull model with three parameters (α, β, σ), as a function to estimate the time-dependent rate of occurrence of earthquakes in Erbil Governorate, as the governorate is adjacent to two countries …
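A minimal sketch of the exponentiated Weibull model with parameters (α, β, σ), here taking the event intensity as the distribution's hazard function; the parameter values and time points are illustrative assumptions, not the study's estimates for Erbil.

```python
# Exponentiated-Weibull sketch: intensity taken as the hazard lambda(t) = f(t) / (1 - F(t)).
# Parameter values and times are placeholders.
import numpy as np

def exp_weibull_pdf(t, alpha, beta, sigma):
    z = (t / sigma) ** beta
    return alpha * (beta / sigma) * (t / sigma) ** (beta - 1) * np.exp(-z) * (1 - np.exp(-z)) ** (alpha - 1)

def exp_weibull_cdf(t, alpha, beta, sigma):
    return (1 - np.exp(-(t / sigma) ** beta)) ** alpha

def intensity(t, alpha, beta, sigma):
    # hazard function of the exponentiated Weibull distribution
    return exp_weibull_pdf(t, alpha, beta, sigma) / (1 - exp_weibull_cdf(t, alpha, beta, sigma))

t = np.array([10.0, 50.0, 100.0])                        # evaluation times (e.g., days), placeholders
print(intensity(t, alpha=1.5, beta=0.8, sigma=60.0))     # assumed parameter values
```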
This study aimed at investigating the effect of using the computer on the efficiency of a training programme for science teachers in Ajloun District in Jordan. It addressed the following questions:
1- What is the effect of using the computer program for the two groups (the experimental group and the control group)?
2- Are there any statistically significant differences in the effect of using the computer program between the two groups?
3- Are there any statistically significant differences in the effect of using the computer program attributable to sex (male or female)?
The population of the study consisted of all the science students in the educational directorate of Ajloun district for the academic year 2009-2010, totalling (120) male and female students. The sample of the study …
Variable selection is an essential and necessary task in statistical modelling. Several studies have tried to develop and standardize the process of variable selection, but it is difficult to do so. The first question a researcher needs to ask is which variables are most significant for describing a given dataset's response. In this paper, a new method for variable selection using Gibbs sampler techniques has been developed. First, the model is defined and the posterior distributions of all the parameters are derived. The new variable selection method is tested using four simulated datasets and compared with some existing techniques: Ordinary Least Squares (OLS) and the Least Absolute Shrinkage and Selection Operator (Lasso) …
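As an illustration of Gibbs-sampler variable selection of the kind described, the sketch below implements a simple spike-and-slab (George-McCulloch style) sampler with a known noise variance on simulated data. The priors, hyperparameters, and data are assumptions and may differ from the paper's exact formulation.

```python
# Spike-and-slab Gibbs sampler sketch for variable selection (known noise variance).
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 6
X = rng.normal(size=(n, p))
true_beta = np.array([2.0, 0.0, -1.5, 0.0, 0.0, 1.0])
y = X @ true_beta + rng.normal(scale=1.0, size=n)

sigma2, tau0, tau1, prior_pi = 1.0, 0.01, 10.0, 0.5   # assumed hyperparameters (variances tau0 < tau1)
beta = np.zeros(p)
gamma = np.ones(p, dtype=int)
inclusion_counts = np.zeros(p)

n_iter, burn_in = 3000, 1000
for it in range(n_iter):
    for j in range(p):
        # Conditional posterior of beta_j given the other coefficients
        r = y - X @ beta + X[:, j] * beta[j]           # partial residual
        v_prior = tau1 if gamma[j] else tau0
        prec = X[:, j] @ X[:, j] / sigma2 + 1.0 / v_prior
        mean = (X[:, j] @ r / sigma2) / prec
        beta[j] = rng.normal(mean, np.sqrt(1.0 / prec))

        # Conditional posterior of the inclusion indicator gamma_j
        slab = prior_pi * np.exp(-beta[j] ** 2 / (2 * tau1)) / np.sqrt(tau1)
        spike = (1 - prior_pi) * np.exp(-beta[j] ** 2 / (2 * tau0)) / np.sqrt(tau0)
        gamma[j] = rng.random() < slab / (slab + spike)
    if it >= burn_in:
        inclusion_counts += gamma

print("posterior inclusion probabilities:", inclusion_counts / (n_iter - burn_in))
```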
This paper addresses the problem of online autonomous mobile robot path planning, which consists of finding optimal paths or trajectories for an autonomous mobile robot from a starting point to a destination across a flat map of a terrain, represented by a 2-D workspace. An enhanced algorithm for solving the path-planning problem using the Bacterial Foraging Optimization algorithm is presented. This nature-inspired metaheuristic algorithm, which imitates the foraging behaviour of E. coli bacteria, is used to find the optimal path from a starting point to a target point. The proposed algorithm is demonstrated by simulations in different static and dynamic environments. A comparative study was carried out between the developed algorithm …
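A compact sketch of the chemotaxis (tumble-and-swim) step at the heart of Bacterial Foraging Optimization, applied to a toy 2-D waypoint cost with one circular obstacle. Population size, step length, and the cost function are illustrative, and the full algorithm's swarming, reproduction, and elimination-dispersal phases are omitted here.

```python
# BFO chemotaxis sketch on a toy 2-D cost (distance to goal plus an obstacle penalty).
import numpy as np

rng = np.random.default_rng(2)
goal = np.array([9.0, 9.0])
obstacle, radius = np.array([5.0, 5.0]), 1.5      # assumed circular obstacle

def cost(p):
    # Distance to goal plus a large penalty for points inside the obstacle
    penalty = 100.0 if np.linalg.norm(p - obstacle) < radius else 0.0
    return np.linalg.norm(p - goal) + penalty

n_bacteria, n_chemotactic, n_swim, step = 20, 50, 4, 0.3
positions = rng.uniform(0.0, 10.0, size=(n_bacteria, 2))

for _ in range(n_chemotactic):
    for i in range(n_bacteria):
        j_last = cost(positions[i])
        direction = rng.normal(size=2)
        direction /= np.linalg.norm(direction)    # random tumble direction
        for _ in range(n_swim):                   # keep swimming while the cost improves
            candidate = positions[i] + step * direction
            if cost(candidate) < j_last:
                positions[i], j_last = candidate, cost(candidate)
            else:
                break

best = min(positions, key=cost)
print("best point found:", best, "cost:", cost(best))
```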
Abstract
The purpose of this research is to identify the main factors behind the phenomenon of dropping out of primary school, which has increased in the last period in Iraq, and to find solutions to this problem.
To achieve this aim, the research selected a systematic random sample of school records for students in some primary schools in Karkh and Rusafa for the years of study (2010-2015), with a size of (40) records, comprising (16) variables, collected in a form prepared by the researcher as a means of analyzing the data.
The results summarize the (6) main components that push a student to drop out of primary schools in the province of Baghdad, arranged as follows:
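Assuming the (6) main components were extracted from the (16) recorded variables by a principal-component style analysis (the abstract does not name the exact extraction method), a minimal sketch on placeholder data might look like this:

```python
# Illustrative PCA sketch: 40 school records x 16 dropout-related variables (synthetic).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
records = rng.normal(size=(40, 16))          # placeholder for the 40 records x 16 variables

scaled = StandardScaler().fit_transform(records)
pca = PCA(n_components=6)                    # retain the six leading components
scores = pca.fit_transform(scaled)

print("explained variance ratio:", pca.explained_variance_ratio_.round(3))
```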