A three-stage learning algorithm for deep multilayer perceptrons (DMLPs) with effective weight initialisation based on a sparse auto-encoder is proposed in this paper, which aims to overcome the difficulties of training deep neural networks with limited training data in a high-dimensional feature space. At the first stage, unsupervised learning with a sparse auto-encoder is adopted to obtain the initial weights of the feature extraction layers of the DMLP. At the second stage, error back-propagation is used to train the DMLP while keeping the feature-extraction-layer weights obtained at the first stage fixed. At the third stage, all the weights of the DMLP obtained at the second stage are refined by error back-propagation. Network structures and values of learning parameters are determined through cross-validation, and test datasets unseen in the cross-validation are used to evaluate the performance of the DMLP trained with the three-stage learning algorithm. Experimental results show that the proposed method is effective in combating overfitting when training deep neural networks.
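The three stages above can be sketched in code. The following is a minimal, hedged illustration, not the paper's implementation: a one-hidden-layer toy network on synthetic data, with an L1 activity penalty standing in for the usual KL-divergence sparsity term, and all sizes, learning rates, and epoch counts chosen arbitrarily for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy stand-in for a small, high-dimensional dataset (hypothetical).
X = rng.standard_normal((200, 20))
y = (X[:, :5].sum(axis=1) > 0).astype(float).reshape(-1, 1)

h, lr = 10, 0.1
W1 = rng.standard_normal((20, h)) * 0.1   # feature-extraction layer
W2 = rng.standard_normal((h, 1)) * 0.1    # output layer

# Stage 1: unsupervised pretraining of W1 as a sparse auto-encoder.
# An L1 activity penalty (weight 1e-3) approximates the sparsity constraint.
Wd = rng.standard_normal((h, 20)) * 0.1   # decoder weights, discarded after
for _ in range(200):
    A = sigmoid(X @ W1)                   # hidden activations
    E = A @ Wd - X                        # reconstruction error
    dA = (E @ Wd.T + 1e-3 * np.sign(A)) * A * (1 - A)
    Wd -= lr * (A.T @ E / len(X))
    W1 -= lr * (X.T @ dA / len(X))

def train(epochs, update_w1):
    """Supervised back-propagation; W1 is frozen unless update_w1 is True."""
    global W1, W2
    for _ in range(epochs):
        A = sigmoid(X @ W1)
        P = sigmoid(A @ W2)
        E = P - y                          # cross-entropy gradient at logits
        if update_w1:
            dA = E @ W2.T * A * (1 - A)
            W1 -= lr * (X.T @ dA / len(X))
        W2 -= lr * (A.T @ E / len(X))

train(300, update_w1=False)   # Stage 2: output layer only, W1 fixed
train(300, update_w1=True)    # Stage 3: refine all weights jointly

acc = ((sigmoid(sigmoid(X @ W1) @ W2) > 0.5) == y).mean()
print(f"training accuracy: {acc:.2f}")
```

The staged schedule is the point: stage 2 adapts the classifier to pretrained features without disturbing them, and stage 3 only then fine-tunes everything, which is what limits overfitting when labelled data is scarce.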
A geographic information system (GIS) is a very effective tool for managing and analysing location-based data. The use of artificial neural networks (ANNs) for the interpretation of natural-resource data has been shown to be beneficial, and back-propagation neural networks are among the most widespread and prevalent designs. Combining geographic information systems with artificial neural networks provides a way to decrease the cost of landscape-change studies by shortening the time required to evaluate data. Numerous designs and kinds of ANNs have been created, the majority of which are PC-based service domains. Using the ArcGIS Network Analyst extension, service regions can be located around any network …
The Gumbel distribution has been treated with great care by researchers and statisticians. There are traditional methods to estimate the two parameters of the Gumbel distribution, known as maximum likelihood, the method of moments, and, more recently, the re-sampling method called the jackknife. However, these methods suffer from mathematical difficulties when solved analytically. Accordingly, there are other, non-traditional methods, such as the nearest-neighbours principle used in computer science, and in particular artificial-intelligence algorithms, including the genetic algorithm, the artificial neural network algorithm, and others that may be classified as meta-heuristic methods. Moreover, this nearest-neighbours principle has useful statistical features …
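Among the traditional methods mentioned above, the method of moments has a simple closed form for the Gumbel (maximum) distribution: the scale is estimated as beta = s·sqrt(6)/pi from the sample standard deviation s, and the location as mu = x̄ − γ·beta, where γ ≈ 0.5772 is the Euler–Mascheroni constant. A small sketch (illustrative only; sample size and true parameters are arbitrary):

```python
import math
import random

EULER_GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

def gumbel_moments(sample):
    """Method-of-moments estimates (mu, beta) for the Gumbel distribution."""
    n = len(sample)
    xbar = sum(sample) / n
    s = math.sqrt(sum((x - xbar) ** 2 for x in sample) / (n - 1))
    beta = s * math.sqrt(6) / math.pi
    mu = xbar - EULER_GAMMA * beta
    return mu, beta

# Check on a synthetic sample with known mu=2, beta=3, drawn by
# inverse-CDF sampling: x = mu - beta * ln(-ln(u)) for u ~ Uniform(0, 1).
random.seed(42)
sample = [2 - 3 * math.log(-math.log(random.random())) for _ in range(20000)]
mu_hat, beta_hat = gumbel_moments(sample)
print(mu_hat, beta_hat)
```

With a large sample the estimates land close to the true (2, 3); it is the analytical intractability of the likelihood equations, not the moments formulas, that motivates the meta-heuristic alternatives discussed in the abstract.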
The research aimed at identifying the effect of using a constructive learning model on academic achievement and learning the soccer dribbling skill in 2nd grade secondary school students. The researcher used the experimental method on (30) secondary school students; 10 were selected for the pilot study, and 20 were divided into two groups. The experimental group followed the constructive learning model while the control group followed the traditional method. The experimental programme lasted for eight weeks with two teaching sessions per week for each group. The data was collected and treated using SPSS, leading to the conclusion that using the constructive learning model has a positive effect on developing academic achievement and learning the soccer dribbling skill in 2nd grade secondary school students.
In this research, thin films of the semiconductor SnO2 were prepared by the chemical spray pyrolysis method from a 0.125 M SnCl2·2H2O solution on glass at a substrate temperature of 723 K. The prepared thin films were then annealed at 823 K. The structural properties of the SnO2 thin films and their sensing response to CO2 gas were studied before and after annealing, and the effect of the annealing temperature on the grain size of the prepared thin films was also examined.
Objective: This study aims to examine how implementing Extensible Business Reporting Language (XBRL) enhances the efficiency and quality of environmental audits and sustainability reporting in eco-friendly universities. Aligned with Sustainable Development Goal 12 (Responsible Consumption and Production), the study emphasizes promoting transparency and precision in sustainability reporting to encourage responsible management of resources within academic institutions. Theoretical Framework: The significance of our study lies in the role of accurate and transparent reports in developing environmental performance, drawing on theories of sustainability reporting and environmental auditing. One of the most important digital …
In the pandemic era of COVID-19, software engineering and artificial intelligence tools played a major role in monitoring, managing, and predicting the spread of the virus. According to reports released by the World Health Organization, all attempts to prevent any form of infection are highly recommended among people. One way of avoiding infection is requiring people to wear face masks. The problem is that some people are not inclined to wear a face mask, and guiding them manually by police is not easy, especially in a large or public area. The purpose of this paper is to construct a software tool called Face Mask Detection (FMD) to detect any face that is not wearing a mask in a specific …
The history of social systems presents solid evidence that the collapse of those systems is caused either by stagnation in the aftermath of maturity or by an unsound intellectual foundation that leads to sudden collapse, whereas capitalism has been able to avoid such intellectual damage thanks to its dynamic system with appropriate mechanisms of self-adaptation, which it has used excellently at the right time.
Globalization, as one of capitalism's adaptation mechanisms, produced its own targets and methods within the framework of multinational corporations, which are aligned with the capitalist states that employed the international organizations to restructure the global economy to serve such targets. So the glob…