Initialization of architectural parameters has a noticeable impact on the whole learning process, so knowing the mathematical properties of a dataset makes it possible to give a neural network architecture better expressivity and capacity. In this paper, five random samples of the Volve field dataset were taken. A training set was then specified, and the persistent homology of the dataset was calculated to show the impact of data complexity on the selection of a multilayer perceptron regressor (MLPR) architecture. The proposed method provides a well-rounded strategy for computing data complexity: it is a compound algorithm composed of the t-SNE method, the alpha-complex algorithm, and a persistence-barcode reading method that extracts the Betti numbers of a dataset. An MLPR was then trained on the dataset, first with a single hidden layer and an increasing number of hidden neurons, and then with both the number of hidden layers and the number of hidden neurons increased. Our empirical analysis shows that the training efficiency of the MLPR depends strongly on the ability of its architecture to express the homology of the dataset.
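A minimal sketch of the described pipeline is given below, assuming the gudhi and scikit-learn packages; the synthetic data, layer sizes, and the 2-D t-SNE embedding are illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np
from sklearn.manifold import TSNE
from sklearn.neural_network import MLPRegressor
import gudhi

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))          # stand-in for one Volve sample
y = np.sin(X[:, 0]) + X[:, 1] ** 2     # stand-in regression target

# Step 1: embed the data in low dimension with t-SNE.
emb = TSNE(n_components=2, random_state=0).fit_transform(X)

# Step 2: build an alpha complex on the embedding and compute its persistence.
simplex_tree = gudhi.AlphaComplex(points=emb).create_simplex_tree()
simplex_tree.persistence()             # persistence barcode

# Step 3: read the Betti numbers of the complex from the barcode.
print("Betti numbers:", simplex_tree.betti_numbers())

# Step 4: train MLPRs of growing width and depth and compare fit quality.
for hidden in [(8,), (32,), (8, 8), (32, 32)]:
    mlp = MLPRegressor(hidden_layer_sizes=hidden, max_iter=2000,
                       random_state=0).fit(X, y)
    print(hidden, "R^2 =", round(mlp.score(X, y), 3))
```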
Abstract:
Clustered data are common in the social, health, and behavioral sciences. In this type of data the observations are linked, and the clusters can be expressed through the relationships between measurements on units within the same group.
In this research, I estimate the reliability function for clustered data by using the seemingly unrelated regression…
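For context, the two objects named above can be stated in their textbook form; this is the standard formulation, not necessarily the exact estimator developed in the research:

```latex
% Reliability function of a lifetime T with distribution function F:
\[
  R(t) \;=\; \Pr(T > t) \;=\; 1 - F(t).
\]
% Seemingly unrelated regression: one equation per cluster j, with errors
% correlated across equations, which is what links units in the same group:
\[
  y_j \;=\; X_j \beta_j + \varepsilon_j,
  \qquad
  \operatorname{Cov}(\varepsilon_i, \varepsilon_j) \;=\; \sigma_{ij} I.
\]
```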
Abstract:
The Central Bank is the backbone of the banking system as a whole. To preserve that system, one of the most important functions the Central Bank performs is the supervision and control of banks, through several tools and methods. One of the most important of these tools is its creation of the compliance-observer function, which obliges commercial banks to appoint, under certain conditions, a person in the bank who performs this function, and grants that person certain powers that help build a sound and compliant banking system. The role of the compliance observer is to follow up on the bank's compliance with the instructions and decisions issued by…
Iris research is focused on developing techniques for identifying and locating relevant biometric features, with accurate segmentation and efficient computation that also lend themselves to compression methods. Most iris segmentation methods are based on complex modelling of traits and characteristics, which in turn reduces the effectiveness of the system as a real-time system. This paper introduces a novel parameterized technique for iris segmentation. The method is based on a number of steps, starting with converting the grayscale eye image to a bit-plane representation and selecting the most significant bit planes, followed by a parameterization of the iris location, resulting in an accurate segmentation of the iris from the origin…
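The bit-plane step can be sketched as follows, assuming an 8-bit grayscale image read with OpenCV; the file name and the choice of which planes count as most significant are illustrative assumptions:

```python
import cv2
import numpy as np

img = cv2.imread("eye.png", cv2.IMREAD_GRAYSCALE)  # hypothetical file name

# Decompose the 8-bit image into its 8 bit planes (plane 7 = most significant).
planes = [((img >> k) & 1).astype(np.uint8) * 255 for k in range(8)]

# Keep only the most significant planes, where the coarse iris/pupil
# structure survives; the lower planes carry mostly noise.
significant = planes[6:]           # planes 6 and 7
mask = cv2.bitwise_and(significant[0], significant[1])
```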
The Stochastic Network Calculus Methodology, by Deah J. Kadhim, Saba Q. Jobbar, Wei Liu & Wenqing Cheng (Studies in Computational Intelligence, vol. 208, 2009). The stochastic network calculus is an evolving new methodology for backlog and delay analysis of networks that can account for statistical multiplexing gain. This paper advances the stochastic network calculus by deriving a network service curve, which expresses the service given to a flow by the network as a whole in terms of a probabilistic bound. The presented network service curve permits the calculation of statistical end-to-end delay and backlog bounds for broad…
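The central object can be written as a worked equation; this is the standard (min,+) formulation of an end-to-end service curve, stated here for orientation rather than as the chapter's exact probabilistic derivation:

```latex
% If node h offers service curve S_h, the network as a whole offers the
% (min,+) convolution of the per-node curves:
\[
  S_{\mathrm{net}}(t) \;=\; \bigl(S_1 \otimes S_2 \otimes \cdots \otimes S_H\bigr)(t),
  \qquad
  (f \otimes g)(t) \;=\; \inf_{0 \le s \le t}\bigl\{\, f(s) + g(t-s) \,\bigr\}.
\]
% In the stochastic setting the guarantee holds probabilistically, e.g.
% \Pr\{D(t) > d\} \le \varepsilon for a delay bound d derived from S_net.
```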
3D models derived from digital photogrammetric techniques have increased massively and developed to meet the requirements of many applications. The reliability of these models depends essentially on the data processing cycle and the adopted tool solution, in addition to data quality. Agisoft PhotoScan is professional image-based 3D modelling software that seeks to create orderly, precise 3D content from still images. It works with arbitrary images acquired under both controlled and uncontrolled conditions. Following the recommendations of many users all around the globe, Agisoft PhotoScan has become an important source for generating precise 3D data for different applications. How reliable is this data for accurate 3D mo…
Deep learning techniques are applied in many different industries for a variety of purposes. Deep learning-based object detection from aerial or terrestrial photographs has become a significant research area in recent years. The goal of object detection in computer vision is to predict the presence of one or more objects, along with their classes and bounding boxes. The YOLO (You Only Look Once) family of modern object detectors can detect objects in real time with both accuracy and speed. A YOLO neural network makes one-shot predictions of bounding-box locations and classification probabilities for an image. In layman's terms, it is a technique for instantly identifying and recognizing…
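Single-pass YOLO inference can be sketched as follows, assuming the ultralytics package and its pretrained weights; the model variant and image name are illustrative assumptions, since the abstract does not name a specific YOLO version:

```python
from ultralytics import YOLO

model = YOLO("yolov8n.pt")            # small pretrained detector (assumed variant)
results = model("aerial_scene.jpg")   # hypothetical input image

# One forward pass yields boxes, class ids, and confidences together.
for box in results[0].boxes:
    print(box.xyxy, box.cls, box.conf)
```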
Data security is a fundamental parameter in the development of communication systems. The capability to protect and secure information is essential for the growth of data security and of electronic commerce. Cryptography has a significant influence on information security systems against a variety of attacks, in which higher complexity of the secret keys increases both the security and the complexity of the cryptographic algorithms. Sufficient and newer versions of cryptographic methods may help in reducing security attacks. The main aim of this research is to serve information security by adding a new security level to the Advanced Encryption Standard (AES) algorithm…
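As a baseline for comparison, standard AES can be sketched with the pycryptodome package; this shows ordinary AES-GCM, not the additional security level proposed in the research:

```python
from Crypto.Cipher import AES
from Crypto.Random import get_random_bytes

key = get_random_bytes(16)                  # AES-128 key
cipher = AES.new(key, AES.MODE_GCM)
ciphertext, tag = cipher.encrypt_and_digest(b"sensitive payload")

# Decryption verifies the authentication tag before releasing the plaintext.
dec = AES.new(key, AES.MODE_GCM, nonce=cipher.nonce)
plaintext = dec.decrypt_and_verify(ciphertext, tag)
assert plaintext == b"sensitive payload"
```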
This research aims at finding out the impact of using the V-shape strategy on the achievement of students of the Department of Artistic Education in the subject "Principles of Scientific Research". This strategy is one of the cognitive strategies used in this topic. To verify the aim of the research, the two researchers put forward the following null hypothesis: there are no significant differences at the 0.05 level between the mean scores of the students of the experimental group on the pre and post cognitive achievement tests in the subject "Principles of Scientific Research". The two researchers adopted the experimental approach with a single group. The population of the r…
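The single-group pre/post design at the 0.05 level implies a paired comparison; a minimal sketch with scipy follows, assuming a paired t-test, since the abstract does not name the specific statistic, and using placeholder scores:

```python
from scipy import stats

pre = [62, 58, 71, 65, 60, 68, 55, 63]    # hypothetical pre-test scores
post = [70, 66, 75, 72, 64, 74, 61, 69]   # hypothetical post-test scores

# Paired t-test of the null hypothesis of no pre/post difference.
t_stat, p_value = stats.ttest_rel(post, pre)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
print("reject H0" if p_value < 0.05 else "fail to reject H0")
```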
Twitter data analysis is an emerging field of research that utilizes data collected from Twitter to address many issues such as disaster response, sentiment analysis, and demographic studies. The success of data analysis relies on collecting data that are accurate and representative of the studied group or phenomenon. Various Twitter analysis applications rely on the locations of the users sending the tweets, but this information is not always available. There are several attempts at estimating location-based aspects of a tweet; however, there is a lack of attempts at investigating data collection methods that focus on location. In this paper, we investigate the two methods for obtaining location-based dat…
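Two location-focused collection routes can be sketched with the tweepy package; the credentials and query are placeholders, and since the truncated abstract does not specify which two methods it compares, this pairing is an assumption:

```python
import tweepy

# Placeholder credentials; real keys come from the Twitter developer portal.
auth = tweepy.OAuth1UserHandler("KEY", "SECRET", "TOKEN", "TOKEN_SECRET")
api = tweepy.API(auth)

# Route 1: ask the API for tweets near a point (geocode-restricted search).
near_point = api.search_tweets(q="flood", geocode="32.5,44.4,50km")

# Route 2: collect broadly, then keep tweets whose payload carries coordinates.
broad = api.search_tweets(q="flood", count=100)
geotagged = [t for t in broad if t.coordinates is not None]
```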