This paper proposes a three-stage learning algorithm for deep multilayer perceptrons (DMLPs) with effective weight initialisation based on a sparse auto-encoder, aiming to overcome the difficulties of training deep neural networks with limited training data in a high-dimensional feature space. At the first stage, unsupervised learning with a sparse auto-encoder yields the initial weights of the feature extraction layers of the DMLP. At the second stage, error back-propagation trains the DMLP while the weights obtained at the first stage are kept fixed for its feature extraction layers. At the third stage, all the weights of the DMLP obtained at the second stage are refined by error back-propagation. Network structures and values of learning parameters are determined through cross-validation, and test datasets unseen in the cross-validation are used to evaluate the performance of the DMLP trained with the three-stage algorithm. Experimental results show that the proposed method is effective in combating overfitting when training deep neural networks.
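A minimal sketch of the three stages in PyTorch, assuming a hypothetical 784-dimensional input with 10 classes and a single 256-unit feature layer; the layer sizes, optimisers, epochs, and L1 sparsity weight are illustrative choices, not the paper's exact configuration.

```python
import torch
import torch.nn as nn

encoder = nn.Sequential(nn.Linear(784, 256), nn.Sigmoid())   # feature extraction layers
decoder = nn.Sequential(nn.Linear(256, 784), nn.Sigmoid())
classifier = nn.Linear(256, 10)                              # task-specific output layer

def stage1_pretrain(data_loader, sparsity_weight=1e-3, epochs=10):
    """Stage 1: unsupervised sparse auto-encoder training (L1 penalty on the codes)."""
    opt = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()))
    mse = nn.MSELoss()
    for _ in range(epochs):
        for x, _ in data_loader:
            code = encoder(x)
            loss = mse(decoder(code), x) + sparsity_weight * code.abs().mean()
            opt.zero_grad(); loss.backward(); opt.step()

def stage2_train_head(data_loader, epochs=10):
    """Stage 2: back-propagation with the pretrained feature layers held fixed."""
    for p in encoder.parameters():
        p.requires_grad = False
    opt = torch.optim.Adam(classifier.parameters())
    ce = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in data_loader:
            loss = ce(classifier(encoder(x)), y)
            opt.zero_grad(); loss.backward(); opt.step()

def stage3_finetune(data_loader, epochs=10):
    """Stage 3: unfreeze everything and refine all weights jointly."""
    for p in encoder.parameters():
        p.requires_grad = True
    opt = torch.optim.Adam(list(encoder.parameters()) + list(classifier.parameters()),
                           lr=1e-4)  # smaller rate keeps refinement near the stage-2 weights
    ce = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in data_loader:
            loss = ce(classifier(encoder(x)), y)
            opt.zero_grad(); loss.backward(); opt.step()
```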
Background: Neonatal septicemia (NNS) is a generalized symptomatic microbial infection during the first 28 days of life. It is the most serious complication in Neonatal Intensive Care Units (NICUs) and demands urgent diagnosis and accurate treatment. Objective: To reveal the relationship between neonatal septicemia and birth weight (one of the neonatal risk factors). Patients and Methods: Blood samples were obtained from 76 neonates aged 1 hour to 28 days who were diagnosed clinically (poor feeding, respiratory distress, fever, hypothermia, gastrointestinal and/or central nervous system symptoms) and bacteriologically with neonatal septicemia. Results: One of the most important neonatal factors predisposing to infection is low birth weight, signi…
This thesis studies pen weight and the diversity of Arabic calligraphy. The Arabic script has passed through multiple forms, emerging through the natural evolution of societies; the development of writing materials and instruments, especially the manufacture of the pen, helped renovate and develop calligraphy after it gained a clear identity, and led to the diversity of Arabic calligraphy. Through exploratory research and a modelling study, the researcher poses the problem discussed in the first chapter of the study as a question: does the weight of the pen play a role in the diversity of Arabic calligraphy?
Abstract: Data mining has become very important at the present time, especially as the amount of information has grown huge, so data mining is needed to manage and use it. One of the data mining techniques is association rules; here the Pattern Growth method, an enhancement of Apriori, is used. The pattern growth method depends on the FP-tree structure. This paper presents a modification of the FP-tree algorithm, called HFMFFP-Growth, which divides the dataset and, for each part, takes only the most frequent items into the FP-tree, so the final conditional tree has fewer nodes than the original FP-tree, using less memory space and time.
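The abstract does not give HFMFFP-Growth in full; below is a minimal sketch of the underlying idea — partition the transactions, keep only each partition's most frequent items, then mine the reduced data — using mlxtend's standard fpgrowth as the stand-in miner. The partition count, items kept per partition, and support threshold are illustrative.

```python
from collections import Counter
import pandas as pd
from mlxtend.preprocessing import TransactionEncoder
from mlxtend.frequent_patterns import fpgrowth

def top_items_per_partition(transactions, n_parts=2, keep=3):
    """Return the union of the `keep` most frequent items from each partition."""
    size = max(1, len(transactions) // n_parts)
    kept = set()
    for i in range(0, len(transactions), size):
        counts = Counter(item for t in transactions[i:i + size] for item in t)
        kept.update(item for item, _ in counts.most_common(keep))
    return kept

transactions = [["milk", "bread", "eggs"], ["milk", "bread"],
                ["bread", "butter"], ["milk", "eggs"], ["bread", "eggs"]]
kept = top_items_per_partition(transactions)
reduced = [[item for item in t if item in kept] for t in transactions]

# Mine the reduced transactions; pruning rare items up front shrinks the FP-tree.
te = TransactionEncoder()
onehot = pd.DataFrame(te.fit(reduced).transform(reduced), columns=te.columns_)
print(fpgrowth(onehot, min_support=0.4, use_colnames=True))
```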
The integration of decision-making leads to more robust decisions, and then to determining the optimum inventory level of the materials required for production and reducing the total cost, through cooperation of the purchasing department with the inventory department and the company's other departments. Two models are suggested to determine the Optimum Inventory Level (OIL): the first model (OIL-model 1) assumes that the inventory level of material quantities equals the required materials, while the second model (OIL-model 2) assumes that the inventory level of material quantities exceeds the required materials for the next period.
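A minimal sketch contrasting the two OIL models under a hypothetical cost structure (unit, holding, and shortage costs are invented for illustration; the paper's actual models and data are not reproduced here). Model 1 stocks exactly the required quantity; Model 2 stocks a surplus carried into the next period.

```python
def total_cost(stocked, required, unit_cost=5.0, holding_cost=0.8, shortage_cost=4.0):
    """Purchase cost plus holding cost on surplus plus shortage cost on unmet demand."""
    surplus = max(0.0, stocked - required)
    shortage = max(0.0, required - stocked)
    return stocked * unit_cost + surplus * holding_cost + shortage * shortage_cost

required = 120.0                # materials required for the next period (hypothetical)
oil_model_1 = required          # OIL-model 1: inventory level equals the requirement
oil_model_2 = required * 1.15   # OIL-model 2: 15% above the requirement (illustrative)

for name, level in [("OIL-model 1", oil_model_1), ("OIL-model 2", oil_model_2)]:
    print(f"{name}: inventory level = {level:.1f}, cost = {total_cost(level, required):.2f}")
```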
Nowadays, information systems constitute a crucial part of organizations; by losing security, these organizations lose plenty of competitive advantages as well. The core of information security (InfoSecu) is risk management. There is a great deal of research, and there are standards, in information security risk management (ISRM), including NIST 800-30 and ISO/IEC 27005. However, only a few works of research focus on InfoSecu risk reduction, while the standards explain general principles and guidelines; they do not provide implementation details for ISRM, so reducing InfoSecu risks in uncertain environments is painstaking. Thus, this paper applies a genetic algorithm (GA) to InfoSecu risk reduction under uncertainty. Finally, the ef…
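A minimal GA sketch for security-control selection, assuming hypothetical risk scores, control costs, and mitigation fractions; it illustrates the general GA approach (binary encoding, one-point crossover, bit-flip mutation), not the paper's exact chromosome or fitness design.

```python
import random

RISKS      = [9.0, 6.5, 8.0, 4.0, 7.5]     # risk score per threat (hypothetical)
COSTS      = [3.0, 1.5, 2.5, 1.0, 2.0]     # cost of the control for each threat
MITIGATION = [0.8, 0.6, 0.7, 0.5, 0.9]     # fraction of risk removed if control applied
BUDGET = 6.0

def fitness(chromosome):
    """Lower residual risk is better; over-budget selections are penalised."""
    cost = sum(c for c, bit in zip(COSTS, chromosome) if bit)
    residual = sum(r * (1 - m * bit) for r, m, bit in zip(RISKS, MITIGATION, chromosome))
    return residual + (100.0 if cost > BUDGET else 0.0)

def evolve(pop_size=30, generations=50, mutation_rate=0.1):
    pop = [[random.randint(0, 1) for _ in RISKS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[:pop_size // 2]                 # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(RISKS))     # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < mutation_rate:
                child[random.randrange(len(RISKS))] ^= 1   # bit-flip mutation
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

best = evolve()
print("controls selected:", best, "fitness:", fitness(best))
```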
With the wide use of exchanging private information in various communication applications, securing it has become a top-urgent issue. In this research, a new approach to encrypting text messages based on genetic algorithm operators is proposed. The proposed approach follows a new algorithm of generating an 8-bit chromosome to encrypt plain text after randomly selecting a crossover point. The resulting child code is flipped by one bit using the mutation operation. Two simulations were conducted to evaluate the performance of the proposed approach, covering execution time of encryption/decryption and throughput computations. Simulation results prove the robustness of the proposed approach, producing better performance for all evaluation metrics with res…
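A minimal, reversible sketch in the spirit of these GA operators (the paper's exact algorithm is not specified in the abstract): each character is treated as an 8-bit chromosome, a seeded PRNG supplies the random crossover point and the mutated bit position, and a receiver holding the same seed (a shared key) can invert both operations. The segment swap and the seed-as-key scheme are my interpretation for illustration only; this is a toy, not a vetted cipher.

```python
import random

def encrypt(text, seed):
    rng = random.Random(seed)
    out = []
    for ch in text.encode("utf-8"):
        cut = rng.randrange(1, 8)              # random crossover point
        pos = rng.randrange(8)                 # bit position for mutation
        child = ((ch << cut) | (ch >> (8 - cut))) & 0xFF   # swap segments around the cut
        out.append(child ^ (1 << pos))         # one-bit mutation (bit flip)
    return bytes(out)

def decrypt(data, seed):
    rng = random.Random(seed)                  # same seed regenerates the same stream
    out = []
    for ch in data:
        cut = rng.randrange(1, 8)
        pos = rng.randrange(8)
        child = ch ^ (1 << pos)                # undo the mutation
        out.append(((child >> cut) | (child << (8 - cut))) & 0xFF)  # undo the crossover
    return bytes(out).decode("utf-8")

cipher = encrypt("hello world", seed=42)
print(cipher.hex(), "->", decrypt(cipher, seed=42))
```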
In this paper, we derive and prove stability bounds for the momentum coefficient µ and the learning rate η of the back-propagation updating rule in artificial neural networks. The theoretical upper bound of the learning rate η is derived, and its practical approximation is obtained.
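For reference, the classical stability condition for the momentum (heavy-ball) update on a quadratic error surface has the following form; this is the standard textbook result, shown here only to indicate the kind of bound involved, not the paper's exact expression.

```latex
% Heavy-ball update and its classical stability condition on a quadratic error
% surface; \lambda_{\max} is the largest eigenvalue of the Hessian of the error E.
\Delta w(t) = -\eta \,\nabla E\big(w(t)\big) + \mu \,\Delta w(t-1),
\qquad
0 \le \mu < 1,
\quad
0 < \eta < \frac{2(1+\mu)}{\lambda_{\max}} .
```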
The most popular medium used by people on the internet nowadays is video streaming. Nevertheless, video streaming consumes much of internet traffic: the massive quantity of internet usage that goes to video streaming accounts for nearly 70% of the internet. Some constraints of interactive media, such as increased bandwidth usage and latency, might be removed. The need for real-time transmission of live video streaming leads to employing fog computing technologies, an intermediary layer between the cloud and the end user. The latter technology has been introduced to alleviate those problems by providing high real-time response and computational resources near to the end user.
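As an illustration of the fog idea, a minimal sketch of steering a streaming request to the lowest-latency fog node and falling back to the cloud when none is in range; the node names, round-trip times, and latency budget are hypothetical.

```python
CLOUD = ("cloud-datacenter", 120.0)          # (name, round-trip time in ms)
FOG_NODES = [("fog-edge-1", 12.0), ("fog-edge-2", 25.0), ("fog-edge-3", 48.0)]

def pick_stream_server(fog_nodes, cloud, latency_budget_ms=40.0):
    """Prefer the nearest fog node; use the cloud only if every fog RTT is too high."""
    in_range = [n for n in fog_nodes if n[1] <= latency_budget_ms]
    return min(in_range, key=lambda n: n[1]) if in_range else cloud

server, rtt = pick_stream_server(FOG_NODES, CLOUD)
print(f"routing live stream via {server} ({rtt} ms)")
```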