Artificial Neural Networks (ANNs) are among the important statistical methods widely used in a range of applications across various fields. An ANN simulates the work of the human brain in receiving a signal, processing data in a cell, and sending it to the next cell; it is a system consisting of a number of linked modules (layers): input, hidden, and output. A comparison was made between three types of neural networks: the Feed-Forward Neural Network (FFNN), the Back-Propagation network (BPL), and the Recurrent Neural Network (RNN). The study found that the lowest false prediction rate belonged to the recurrent network architecture, using data on graduate students at the College of Administration and Economics, University of Baghdad, from the academic year 2014-2015 to 2017-2018. The variables used in the research are the student's success, age, gender, job, type of study (higher diploma, master's, doctorate), specialization (statistics, economics, accounting, industry management, administrative management, and public administration), and acceptance channel. It became clear that the variables that best explain the success of graduate students are the type of study, age, and job.
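As an illustration of the feed-forward architecture compared above (a minimal sketch, not the paper's actual model), a single-hidden-layer forward pass might look like the following; the three input features and all weights are hypothetical:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def ffnn_forward(x, w_hidden, w_out):
    """One forward pass through a single-hidden-layer feed-forward network."""
    hidden = sigmoid(x @ w_hidden)      # input layer -> hidden layer
    return sigmoid(hidden @ w_out)      # hidden layer -> output layer

# Hypothetical encoding of three student features (type of study, age, job)
rng = np.random.default_rng(0)
x = rng.random((1, 3))                  # one student, three input variables
w_hidden = rng.normal(size=(3, 4))      # 3 inputs -> 4 hidden units
w_out = rng.normal(size=(4, 1))         # 4 hidden units -> 1 output (success)
p_success = ffnn_forward(x, w_hidden, w_out)
```

A recurrent network differs in that the hidden layer also receives its own previous output, which is what lets it model sequential dependence.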
The approach of the research is to simulate residual chlorine decay through the potable water distribution networks of Gukook city. EPANET software was used for estimating and predicting chlorine concentration at different points of the water network. Data required as program inputs (pipe properties) were taken from the Baghdad Municipality, and the factors that affect residual chlorine concentration (pH, temperature, pressure, flow rate) were measured. Twenty-five samples were tested from November 2016 to July 2017. The residual chlorine values varied between 0.2 and 2 mg/L, pH values varied between 7.6 and 8.2, and the pressure was very weak in this region. Statistical analyses were used to evaluate the errors. The calculated concentrations by the calib
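EPANET's standard bulk chlorine decay follows first-order kinetics, C(t) = C0·exp(-k·t); a minimal sketch of that relationship, with a hypothetical decay coefficient and travel time, is:

```python
import math

def residual_chlorine(c0, k, t):
    """First-order bulk decay: C(t) = C0 * exp(-k * t).

    c0 : initial chlorine concentration (mg/L)
    k  : bulk decay coefficient (1/h) -- hypothetical value below
    t  : travel time through the pipe (h)
    """
    return c0 * math.exp(-k * t)

# Hypothetical example: 2 mg/L at the source, k = 0.5 1/h, 3 h travel time
c = residual_chlorine(2.0, 0.5, 3.0)
```

In practice the decay coefficient is calibrated against measured samples such as those collected in this study.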
Technically, a mobile P2P network system architecture can be considered a distributed system (like a community), in which the nodes (users) can share all or some of their own software and hardware resources, such as application stores, processing time, storage, and network bandwidth, with the other nodes through the Internet, and these resources can be accessed directly by the nodes in that system without the need for a central coordination node. The main feature of our proposed network architecture is that all the nodes are symmetric in their functions. In this work, the security issues of mobile P2P network system architecture, such as web threats, attacks, and encryption, will be discussed in depth, and then we prop
This study compares the results of some passing and dribbling tests in basketball over two different years between teams of chosen young players in Baghdad. Statistical methods were used, namely the arithmetic mean, the standard deviation, and the t-test for independent samples. After careful statistical treatment, it was found whether there were significant or non-significant differences in the final results of the chest pass, high dribble, and cross-over dribble. The clubs were Al-Khark, Air Defence, Police, and Al-Adamiyah, each one treated separately, for the year 2000-2001. Among the many findings reached was the lack of objective evaluation (periodical tests) between one sport season and the other. In the light
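The independent-samples t-test used above can be sketched as follows; the scores are hypothetical, not the study's data:

```python
from math import sqrt
from statistics import mean, variance

def t_statistic(a, b):
    """Two-sample t statistic (equal-variance form) for independent samples."""
    na, nb = len(a), len(b)
    # Pooled sample variance across the two groups
    sp2 = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
    return (mean(a) - mean(b)) / sqrt(sp2 * (1 / na + 1 / nb))

# Hypothetical chest-pass scores for the same club in two seasons
season_1 = [12, 14, 11, 13, 15]
season_2 = [16, 15, 17, 14, 18]
t = t_statistic(season_1, season_2)
```

The statistic is then compared against the t distribution with na + nb - 2 degrees of freedom to decide whether the seasonal difference is significant.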
The aim of the study was to compare some kinematic variables in the 100 m butterfly swim between the first- and second-ranked swimmers in the 2003 championship in Spain. It was noticed that no such comparative study of international champions exists in our country, and therefore there has been no specific, scientific account of these advanced levels. The researchers relied on a group of kinematic variables in making the comparison, which included the first 50 m, the second 50 m, the difference between the first and second 50 m, and, moreover, basic variables over the 100 m butterfly. After obtaining the results and treating them statistically, the researchers reached two conclusions, which were: • The success of the first-ranked swimmer in startin
... Show MoreThe logistic regression model regarded as the important regression Models ,where of the most interesting subjects in recent studies due to taking character more advanced in the process of statistical analysis .
The ordinary estimating methods is failed in dealing with data that consist of the presence of outlier values and hence on the absence of such that have undesirable effect on the result. &nbs
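As a sketch of the ordinary (non-robust) estimation the passage refers to, a simple one-predictor logistic regression fitted by gradient descent is shown below on hypothetical, cleanly separated data; on data containing outliers, this maximum-likelihood fit is exactly the estimator that breaks down:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(xs, ys, lr=0.1, epochs=500):
    """Ordinary (maximum-likelihood) logistic regression for one predictor,
    fitted by stochastic gradient ascent; sensitive to outliers."""
    b0, b1 = 0.0, 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            p = sigmoid(b0 + b1 * x)
            b0 += lr * (y - p)          # gradient of the log-likelihood
            b1 += lr * (y - p) * x
    return b0, b1

# Hypothetical data: the outcome switches from 0 to 1 around x = 0
xs = [-2.0, -1.5, -1.0, -0.5, 0.5, 1.0, 1.5, 2.0]
ys = [0, 0, 0, 0, 1, 1, 1, 1]
b0, b1 = fit_logistic(xs, ys)
```

A single mislabeled extreme point pulls these coefficients strongly, which motivates the robust alternatives discussed in the study.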
Error control schemes became a necessity in network-on-chip (NoC) designs to improve reliability, as on-chip interconnect errors increase with the continuous shrinking of geometry. Accordingly, many researchers are trying to present multi-bit error correction coding schemes that achieve high error correction capability with the simplest possible design, to minimize area and power consumption. A recent scheme, Multi-bit Error Correcting Coding with Reduced Link Bandwidth (MECCRLB), showed a huge reduction in area and power consumption compared to a well-known scheme, namely the Hamming product code (HPC) with Type-II HARQ. Moreover, the authors showed that the proposed scheme can correct 11 random errors, which is considered a high
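As background to schemes such as HPC, the classic Hamming(7,4) code corrects any single-bit error per codeword; a minimal sketch (illustrative only, not MECCRLB itself) is:

```python
def hamming74_encode(d):
    """Encode 4 data bits into a Hamming(7,4) codeword (lists of 0/1)."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]    # bit positions 1..7

def hamming74_correct(c):
    """Locate and flip a single-bit error via the parity-check syndrome."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]         # checks positions 1,3,5,7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]         # checks positions 2,3,6,7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]         # checks positions 4,5,6,7
    syndrome = s1 + 2 * s2 + 4 * s3        # 1-based position of the error
    if syndrome:
        c[syndrome - 1] ^= 1
    return c

codeword = hamming74_encode([1, 0, 1, 1])
corrupted = list(codeword)
corrupted[4] ^= 1                          # flip one bit "in transit"
recovered = hamming74_correct(corrupted)
```

A Hamming product code arranges data in a grid and applies such a code along both rows and columns, which is what gives HPC its multi-bit correction capability.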
Many of the dynamic processes in different sciences are described by models of differential equations. These models explain the change in the behavior of the studied process over time by linking the behavior of the process with its derivatives. They often contain constant and time-varying parameters that vary according to the nature of the process under study. In this work, we estimate the constant and time-varying parameters sequentially, in several stages. In the first stage, the state variables and their derivatives are estimated by the method of penalized splines (P-splines). In the second stage, we use pseudo least squares to estimate the constant parameters. For the third stage, the rem
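The first-stage smoothing can be illustrated with a generic penalized least-squares fit in the spirit of P-splines; here a polynomial basis with a second-order difference penalty stands in for the B-spline basis, and all data are hypothetical:

```python
import numpy as np

def penalized_smooth(t, y, degree=7, lam=1.0):
    """Penalized least-squares smoother: basis matrix B with a
    second-order difference penalty D on the coefficients, solving
    (B'B + lam * D'D) beta = B'y."""
    B = np.vander(t, degree + 1, increasing=True)   # basis matrix
    D = np.diff(np.eye(degree + 1), n=2, axis=0)    # 2nd-difference penalty
    beta = np.linalg.solve(B.T @ B + lam * D.T @ D, B.T @ y)
    return B @ beta

# Hypothetical noisy observations of a state variable x(t) = sin(t)
rng = np.random.default_rng(1)
t = np.linspace(0, 2 * np.pi, 50)
y = np.sin(t) + rng.normal(scale=0.1, size=t.size)
x_hat = penalized_smooth(t, y)
```

Differentiating the fitted basis expansion then gives the derivative estimates that the second-stage pseudo least squares step consumes.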
In this paper, deterministic and stochastic models are proposed to study the interaction of the Coronavirus (COVID-19) with host cells inside the human body. In the deterministic model, the value of the basic reproduction number R0 determines the persistence or extinction of COVID-19. If R0 < 1, one infected cell will transmit the virus to less than one cell and, as a result, the person carrying the Coronavirus will get rid of the disease. If R0 > 1, the infected cell will be able to infect all cells that contain ACE receptors. The stochastic model proves that if the noise intensities are sufficiently large, then the disease may ultimately go extinct even though R0 > 1, and this fact is also confirmed by computer simulation.
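A common within-host target-cell model (a generic sketch, not necessarily the paper's exact system) illustrates how R0 governs extinction; all parameter values below are hypothetical:

```python
def simulate_within_host(beta, delta, p, c, lam, d, t_end=200.0, dt=0.01):
    """Euler integration of a generic within-host target-cell model:
        T' = lam - d*T - beta*T*V   (healthy cells with ACE receptors)
        I' = beta*T*V - delta*I     (infected cells)
        V' = p*I - c*V              (free virus)
    Returns R0 = beta*lam*p / (d*delta*c) and the final virus load."""
    T, I, V = lam / d, 0.0, 1.0        # start at disease-free equilibrium
    t = 0.0
    while t < t_end:
        dT = lam - d * T - beta * T * V
        dI = beta * T * V - delta * I
        dV = p * I - c * V
        T, I, V = T + dt * dT, I + dt * dI, V + dt * dV
        t += dt
    r0 = beta * lam * p / (d * delta * c)
    return r0, V

# Hypothetical parameters giving R0 < 1: the introduced virus dies out
r0, v_final = simulate_within_host(beta=1e-6, delta=1.0, p=10.0,
                                   c=5.0, lam=1e4, d=0.1)
```

With these values R0 = 0.2, so each infected cell produces on average less than one newly infected cell and the virus load decays to zero, matching the deterministic extinction condition described above.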