Different multilayer perceptron (MLP) architectures have been trained by backpropagation (BP) and used to analyze Landsat TM images. Two training approaches have been applied: an ordinary approach (one hidden layer, M-H1-L, and two hidden layers, M-H1-H2-L) and a one-against-all strategy (one hidden layer, (M-H1-1)xL, and two hidden layers, (M-H1-H2-1)xL). Classification accuracy of up to 90% has been achieved using the one-against-all strategy with the two-hidden-layer architecture. The performance of the one-against-all approach is slightly better than that of the ordinary approach.
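The one-against-all scheme described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it trains one small M-H1-1 network per class by plain backpropagation on synthetic four-band "pixels", where the data, layer sizes, learning rate and epoch count are all hypothetical stand-ins for the Landsat TM setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "pixels": 3 well-separated classes in 4 bands (stand-in for TM bands).
centers = np.array([[0, 0, 0, 0], [3, 3, 0, 0], [0, 0, 3, 3]], float)
X = np.vstack([c + 0.3 * rng.standard_normal((60, 4)) for c in centers])
y = np.repeat(np.arange(3), 60)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_binary_mlp(X, t, hidden=8, lr=0.5, epochs=800):
    """Train one M-H1-1 network by plain backpropagation (full-batch GD)."""
    n, m = X.shape
    W1 = 0.5 * rng.standard_normal((m, hidden))
    b1 = np.zeros(hidden)
    w2 = 0.5 * rng.standard_normal(hidden)
    b2 = 0.0
    for _ in range(epochs):
        h = sigmoid(X @ W1 + b1)            # hidden-layer activations
        p = sigmoid(h @ w2 + b2)            # P(class | pixel)
        d2 = (p - t) / n                    # cross-entropy output delta
        d1 = np.outer(d2, w2) * h * (1 - h) # backpropagated hidden delta
        w2 -= lr * h.T @ d2; b2 -= lr * d2.sum()
        W1 -= lr * X.T @ d1; b1 -= lr * d1.sum(axis=0)
    return W1, b1, w2, b2

# One-against-all: one binary net per class, predict by the highest score.
nets = [train_binary_mlp(X, (y == k).astype(float)) for k in range(3)]

def predict(X):
    scores = [sigmoid(sigmoid(X @ W1 + b1) @ w2 + b2) for W1, b1, w2, b2 in nets]
    return np.argmax(np.column_stack(scores), axis=1)

print(f"training accuracy: {(predict(X) == y).mean():.2f}")
```

The ordinary approach would instead train a single M-H1-L network with L output units; the one-against-all variant lets each binary net specialise on one class boundary.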
Within the framework of big data, energy issues are highly significant. Despite this significance, theoretical studies focusing primarily on energy issues in big data analytics in relation to computational intelligence algorithms are scarce. The purpose of this study is to explore the theoretical aspects of energy issues in big data analytics in relation to computational intelligence algorithms, since this is critical for exploring the empirical aspects of big data. In this chapter, we present a theoretical study of energy issues related to applications of computational intelligence algorithms in big data analytics. This work highlights that big data analytics using computational intelligence algorithms generates a very high amo
Shear and compressional wave velocities, coupled with other petrophysical data, are vital for determining the magnitudes of the dynamic moduli in geomechanical studies and hydrocarbon reservoir characterization. However, due to field practices and high running costs, shear wave velocity may not be available in all wells. In this paper, a statistical multivariate regression method is presented to predict the shear wave velocity for the Khasib formation in the Amara oil field, located in south-east Iraq, using well-log compressional wave velocity, neutron porosity, and density. The accuracy of the proposed correlation has been compared to that of other correlations. The results show that the presented model provides accurate
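A minimal sketch of the multivariate regression step, assuming a linear form Vs = c0 + c1·Vp + c2·φN + c3·ρ. The synthetic well-log values and "true" coefficients below are invented for illustration and are not the paper's data or correlation.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
vp = rng.uniform(2500, 4500, n)   # compressional velocity, m/s (synthetic)
phi = rng.uniform(0.05, 0.30, n)  # neutron porosity, fraction (synthetic)
rho = rng.uniform(2.2, 2.7, n)    # bulk density, g/cc (synthetic)
# Hypothetical "true" relation plus noise, for illustration only.
vs = 0.55 * vp - 900 * phi + 300 * rho - 400 + 30 * rng.standard_normal(n)

# Multivariate linear regression: Vs = c0 + c1*Vp + c2*phiN + c3*rho.
A = np.column_stack([np.ones(n), vp, phi, rho])
coef, *_ = np.linalg.lstsq(A, vs, rcond=None)
pred = A @ coef
r2 = 1 - np.sum((vs - pred) ** 2) / np.sum((vs - vs.mean()) ** 2)
print("coefficients:", np.round(coef, 3))
print(f"R^2 = {r2:.3f}")
```

In practice the fitted coefficients and R² would be reported against core-measured Vs, and compared with published Vs-Vp correlations as the abstract describes.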
Automatic Programming Assessment (APA) has been gaining a lot of attention among researchers, mainly to support the systematic automated grading and marking of students' programming assignments or exercises. APA is commonly identified as a method that can enhance accuracy, efficiency and consistency, as well as provide instant feedback on students' programming solutions. In achieving APA, the test data generation process is very important in order to perform dynamic testing on students' assignments. In the software testing field, many studies focusing on test data generation have demonstrated the successful adoption of Meta-Heuristic Search Techniques (MHST) to enhance the procedure of deriving adequate test data for efficient t
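As an illustration of meta-heuristic test data generation, the sketch below uses a simple hill climber with a Korel-style branch-distance fitness to derive a test input reaching a hard branch of a toy program under test. The program, fitness function and search parameters are hypothetical, not taken from the surveyed work.

```python
import random

def triangle_type(a, b, c):
    """Toy program under test (a student-style solution)."""
    if a == b == c:
        return "equilateral"
    if a == b or b == c or a == c:
        return "isosceles"
    return "scalene"

def branch_distance(x):
    """Korel-style distance to the hard-to-reach branch a == b == c."""
    a, b, c = x
    return abs(a - b) + abs(b - c)

def hill_climb(seed=42, steps=5000):
    """Local search: mutate one input at a time, keep strict improvements."""
    rng = random.Random(seed)
    x = [rng.randint(1, 100) for _ in range(3)]
    best = branch_distance(x)
    for _ in range(steps):
        cand = list(x)
        cand[rng.randrange(3)] += rng.choice([-1, 1])
        d = branch_distance(cand)
        if d < best:
            x, best = cand, d
        if best == 0:          # target branch is now reachable
            break
    return x, best

test_input, dist = hill_climb()
print("generated test input:", test_input, "distance:", dist)
```

MHST approaches replace the naive mutation here with genetic operators, simulated annealing schedules and similar strategies, but the fitness-guided search loop is the same idea.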
Survival analysis is widely applied to data describing the lifetime of an item until the occurrence of an event of interest, such as death or another event under study. The purpose of this paper is to use a dynamic approach in the deep learning neural network method, in which a dynamic neural network is built that suits the nature of discrete survival data and time-varying effects. This neural network is based on the Levenberg-Marquardt (L-M) algorithm for training, and the method is called the Proposed Dynamic Artificial Neural Network (PDANN). A comparison was then made with another method that depends entirely on Bayesian methodology, called the Maximum A Posteriori (MAP) method. This method was carried out using numerical algorithms re
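A minimal sketch of Levenberg-Marquardt training for a tiny 1-H-1 network on a synthetic discrete-time hazard curve. It uses SciPy's `least_squares(method="lm")` as the L-M solver and invented data, so it illustrates the training idea rather than the PDANN method itself.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(2)
# Discrete-time survival: hazard h(t) rises with the follow-up interval t.
t = np.linspace(0, 1, 40)
h_true = 1 / (1 + np.exp(-6 * (t - 0.5)))   # synthetic hazard curve
h_obs = np.clip(h_true + 0.03 * rng.standard_normal(t.size), 0, 1)

H = 5  # hidden units of the 1-H-1 network

def unpack(theta):
    return theta[:H], theta[H:2*H], theta[2*H:3*H], theta[-1]

def net(theta, t):
    w1, b1, w2, b2 = unpack(theta)
    hidden = np.tanh(np.outer(t, w1) + b1)
    return 1 / (1 + np.exp(-(hidden @ w2 + b2)))   # hazard kept in (0, 1)

def residuals(theta):
    return net(theta, t) - h_obs

theta0 = 0.5 * rng.standard_normal(3 * H + 1)
fit = least_squares(residuals, theta0, method="lm")  # Levenberg-Marquardt
rmse = np.sqrt(np.mean(fit.fun ** 2))
print(f"RMSE of fitted hazard: {rmse:.4f}")
```

L-M interpolates between Gauss-Newton and gradient descent, which is why it is popular for small-to-medium network training problems like this one.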
Since the beginning of the last century, competition for water resources has intensified dramatically, especially between countries that have no agreements in place for the water resources they share. Such is the situation with the Euphrates River, which flows through three countries (Turkey, Syria, and Iraq) and represents the main water resource for these countries. Therefore, the comprehensive hydrologic investigation needed to derive optimal operations requires reliable forecasts. This study aims to analyse and create a forecasting model for data generation from the Turkish perspective, using the recorded inflow data of the Ataturk reservoir for the period Oct. 1961 - Sep. 2009. Based on 49 years of real inflow data
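The kind of lag-1 synthetic-generation model such inflow studies often use can be sketched as follows. The inflow series below is simulated (a seasonal mean plus AR(1) noise), not the Ataturk record, and the coefficients are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(3)
# Synthetic monthly inflow: seasonal mean plus lag-1 autocorrelated noise.
months = np.arange(49 * 12)
season = 600 + 250 * np.sin(2 * np.pi * months / 12)
noise = np.zeros(months.size)
for i in range(1, months.size):
    noise[i] = 0.7 * noise[i - 1] + 40 * rng.standard_normal()
inflow = season + noise

# Thomas-Fiering-style lag-1 model on the deseasonalised series.
# (With real data, `season` would be estimated from the monthly means.)
z = inflow - season
phi = (z[:-1] @ z[1:]) / (z[:-1] @ z[:-1])   # lag-1 regression coefficient
forecast = season[1:] + phi * z[:-1]          # one-step-ahead forecast
rmse = np.sqrt(np.mean((inflow[1:] - forecast) ** 2))
print(f"estimated lag-1 coefficient: {phi:.2f}, one-step RMSE: {rmse:.1f}")
```

Once calibrated, the same recursion driven by random innovations generates synthetic inflow sequences for reservoir-operation studies.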
In the current paradigms of information technology, cloud computing is the most essential kind of computing service. It satisfies the needs of high-volume customers, provides flexible computing capabilities for a range of applications such as database archiving and business analytics, and supplies the extra computing resources that deliver financial value for cloud providers. The purpose of this investigation is to assess the viability of performing data audits remotely within a cloud computing setting. The theory behind cloud computing and distributed storage systems is discussed, as well as the method of remote data auditing. This research addresses how to safeguard the data that is outsourced and stored in cloud serv
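A toy version of remote data auditing by random spot-checks can be sketched with per-block hash tags. Real remote-auditing schemes use homomorphic authenticators so the verifier need not hold the data, so this stdlib-only sketch is a conceptual illustration, not a scheme from the literature.

```python
import hashlib
import random

def chunk_tags(data, block=256):
    """Owner side: split the file and keep per-block SHA-256 tags locally."""
    blocks = [data[i:i + block] for i in range(0, len(data), block)]
    tags = [hashlib.sha256(b).hexdigest() for b in blocks]
    return blocks, tags

def audit(stored_blocks, tags, rng):
    """Verifier side: challenge one random block; cloud returns its hash."""
    i = rng.randrange(len(tags))
    proof = hashlib.sha256(stored_blocks[i]).hexdigest()
    return proof == tags[i]

rng = random.Random(0)
data = bytes(rng.getrandbits(8) for _ in range(4096))
blocks, tags = chunk_tags(data)          # 16 blocks of 256 bytes

honest = all(audit(blocks, tags, rng) for _ in range(20))
print("honest cloud passes audits:", honest)

tampered = list(blocks)
tampered[3] = b"x" * len(tampered[3])    # cloud silently corrupts one block
# A single spot-check may miss the bad block; repeating raises detection odds.
detected = any(not audit(tampered, tags, rng) for _ in range(100))
print("tampering detected over 100 challenges:", detected)
```

The design point: each challenge is cheap, and the probability of missing a corrupted block falls geometrically with the number of challenges.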
The expansion of water project implementation in Turkey and Syria has become a great concern for workers in the field of water resources management in Iraq. Such expansion, in the absence of a bilateral agreement between the three riparian countries of the Tigris and Euphrates Rivers (Turkey, Syria and Iraq), is expected to lead to a substantial reduction of water inflow into the territory of Iraq. Accordingly, this study consists of two parts. The first part aims to study the changes in water inflow into the territory of Iraq, at the Turkish and Syrian borders, from 1953 to 2009; the results indicate that the annual mean inflow of the Tigris River decreased from 677 m3/sec to 526 m3/sec after operating Turkey reserv
The aim of this study is to estimate the parameters and reliability function of the Kumaraswamy distribution with its two positive parameters (a, b > 0), a continuous probability distribution that shares many characteristics with the beta distribution while offering extra advantages.
The shape of the function for this distribution and its most important characteristics are explained, and the two parameters (a, b) and the reliability function are estimated using the maximum likelihood (MLE) and Bayes methods. Simulation experiments are conducted to explain the behaviour of the estimation methods for different sample sizes, based on the mean squared error criterion. The results show that the Bayes is bet
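The MLE step for the Kumaraswamy distribution can be sketched as follows, using its pdf f(x) = a·b·x^(a-1)·(1-x^a)^(b-1) on (0, 1), inverse-CDF sampling, and a numerical optimiser. The sample size and "true" parameters below are arbitrary choices for the sketch, and SciPy's Nelder-Mead stands in for whatever optimiser the paper used.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(4)
a_true, b_true = 2.0, 3.0

# Inverse-CDF sampling: F(x) = 1 - (1 - x^a)^b  =>  x = (1 - (1-u)^(1/b))^(1/a)
u = rng.uniform(size=5000)
x = (1 - (1 - u) ** (1 / b_true)) ** (1 / a_true)

def neg_loglik(params):
    a, b = params
    if a <= 0 or b <= 0:
        return np.inf        # enforce a, b > 0
    return -np.sum(np.log(a) + np.log(b) + (a - 1) * np.log(x)
                   + (b - 1) * np.log1p(-x ** a))

res = minimize(neg_loglik, x0=[1.0, 1.0], method="Nelder-Mead")
a_hat, b_hat = res.x

# Reliability (survival) function at the fitted parameters.
R = lambda t: (1 - t ** a_hat) ** b_hat
print(f"a_hat = {a_hat:.2f}, b_hat = {b_hat:.2f}, R(0.5) = {R(0.5):.3f}")
```

A Bayes estimate would replace the likelihood maximisation with a posterior mean under a chosen prior, which is the comparison the abstract describes.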
In recent years, data centre (DC) networks have improved their rapid data-exchange abilities. Software-defined networking (SDN) is presented as an alternative to the conception of conventional networks, segregating the control plane from the data plane. SDN overcomes the limitations of traditional DC networks caused by the rapidly increasing numbers of apps, websites, data storage needs, etc. Software-defined networking data centres (SDN-DC), based on the OpenFlow (OF) protocol, are used to achieve superior behaviour when executing traffic load-balancing (LB) jobs. The LB function divides the traffic-flow demands between the end devices to avoid link congestion. In short, SDN is proposed to manage more operative configur
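The LB function can be illustrated with a greedy least-loaded assignment: each flow is placed on the currently lightest link so that demands spread out and no single link congests. This is a schematic sketch, not an OpenFlow controller; the link names and flow demands are invented.

```python
import heapq

def balance(flows, links):
    """Greedy least-loaded assignment: send each flow to the lightest link."""
    heap = [(0.0, link) for link in links]   # (current load, link id)
    heapq.heapify(heap)
    placement = {}
    # Placing the largest demands first gives a tighter spread.
    for flow, demand in sorted(flows.items(), key=lambda kv: -kv[1]):
        load, link = heapq.heappop(heap)     # lightest link right now
        placement[flow] = link
        heapq.heappush(heap, (load + demand, link))
    return placement, {link: load for load, link in heap}

flows = {"f1": 40, "f2": 30, "f3": 30, "f4": 20, "f5": 10}  # Mb/s, invented
placement, loads = balance(flows, ["link-A", "link-B"])
print(placement)
print(loads)
```

In an SDN-DC, a controller would compute such placements centrally and push the matching flow rules to the switches via OpenFlow.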
Permeability data is of major importance and should be handled carefully in all reservoir simulation studies. The importance of permeability data increases in mature oil and gas fields due to its sensitivity to the requirements of specific improved recovery methods. However, the industry has a huge store of air permeability measurements against only a small number of liquid permeability values. This is due to the relatively high cost of special core analysis.
The current study suggests a correlation to convert the air permeability data that are conventionally measured during laboratory core analysis into liquid permeability. This correlation introduces a feasible estimate in cases of loose and poorly consolidated formations, or in cas
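A hedged sketch of fitting such an air-to-liquid permeability correlation, assuming a power-law form k_liq = c·k_air^d fitted in log-log space. The core data, noise level and coefficients below are synthetic and are not the study's correlation.

```python
import numpy as np

rng = np.random.default_rng(5)
# Synthetic core data: air permeability overstates liquid permeability
# (gas slippage), with the gap relatively wider at low permeability.
k_air = 10 ** rng.uniform(-1, 3, 80)                 # mD, 4 decades
k_liq_true = 0.35 * k_air ** 1.1                     # hypothetical trend
k_liq = k_liq_true * 10 ** (0.05 * rng.standard_normal(80))  # scatter

# Power-law correlation k_liq = c * k_air^d, fitted as a line in log-log space.
d, log_c = np.polyfit(np.log10(k_air), np.log10(k_liq), 1)
c = 10 ** log_c
print(f"fitted correlation: k_liq ~ {c:.2f} * k_air^{d:.2f}")
```

Fitting in log space is the usual choice here because permeability spans several orders of magnitude, so relative (not absolute) errors are what matter.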