Detecting the structure of a network, known as community detection, has received great attention in diverse fields, including the social sciences, biology, and politics. A large number of studies and practical approaches have been designed to solve the problem of finding the structure of a network. Defining a complex network model based on clustering is a non-deterministic polynomial-time hard (NP-hard) problem, and there is no ideal technique for defining the clustering. Here, we present a statistical approach based on the likelihood function of a Stochastic Block Model (SBM). The objective is to define the general model and select the best model with high quality. To this end, the Tabu Search method is integrated with Fuzzy C-Means (FCM) and implemented in different settings. The experiments are designed to find the best structure for different types of networks by maximizing the objective functions. SBM selection is performed by applying two criteria, namely the Akaike Information Criterion (AIC) and the Bayesian Information Criterion (BIC). The results show the ability of the proposed method to find the best community structure for the given networks.
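As a minimal sketch of the model-selection step, both criteria can be computed from a fitted SBM's maximized log-likelihood. The log-likelihood values, node count, and the parameter-count formula below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def aic(log_likelihood, k):
    """Akaike Information Criterion: 2k - 2 ln(L)."""
    return 2 * k - 2 * log_likelihood

def bic(log_likelihood, k, n):
    """Bayesian Information Criterion: k ln(n) - 2 ln(L)."""
    return k * np.log(n) - 2 * log_likelihood

# Hypothetical example: compare SBM fits with different block counts.
# candidates[q] is the maximized SBM log-likelihood for q blocks (assumed given).
n_nodes = 100
candidates = {2: -450.0, 3: -430.0, 4: -425.0}     # illustrative values only
for q, ll in candidates.items():
    # Assumed parameter count: q*(q+1)/2 block probabilities + (q-1) group proportions.
    k = q * (q + 1) // 2 + (q - 1)
    print(q, aic(ll, k), bic(ll, k, n_nodes))
```

The model with the lowest criterion value would be selected; whether n counts nodes or dyads in the BIC penalty is a modeling choice not fixed by this sketch.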
Image compression has become one of the most important applications of image processing because of the rapid growth in computer power, the corresponding growth in the multimedia market, and the advent of the World Wide Web, which makes the internet easily accessible to everyone. Since the early 1980s, digital image sequence processing has been an attractive research area, because an image sequence, as a collection of images, may provide much greater compression than a single image frame. The increased computational complexity and memory space required for image sequence processing have, in fact, become more attainable. This research uses the Absolute Moment Block Truncation compression technique, which depends on adopting the good points of oth
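For context, a minimal sketch of the classic Absolute Moment Block Truncation Coding step on a single block is given below; the 4x4 block values are invented for illustration and this is not the paper's combined scheme.

```python
import numpy as np

def ambtc_encode(block):
    """Encode one image block with Absolute Moment Block Truncation Coding (AMBTC)."""
    mean = block.mean()
    bitmap = block >= mean                                    # one bit per pixel
    high = block[bitmap].mean() if bitmap.any() else mean     # mean of pixels >= block mean
    low = block[~bitmap].mean() if (~bitmap).any() else mean  # mean of pixels < block mean
    return bitmap, high, low

def ambtc_decode(bitmap, high, low):
    """Reconstruct the block from its bitmap and the two quantization levels."""
    return np.where(bitmap, high, low)

# Hypothetical 4x4 block of grey levels (illustrative values only).
block = np.array([[120, 130, 125, 200],
                  [118, 122, 210, 205],
                  [119, 121, 198, 202],
                  [117, 123, 199, 201]], dtype=float)
bitmap, high, low = ambtc_encode(block)
print(ambtc_decode(bitmap, high, low))
```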
This paper investigates the capacitated vehicle routing problem (CVRP), one of the numerous problems that have no perfect solution yet. Over the last few decades, numerous researchers have carried out studies and applied many strategies and techniques to deal with it. However, across all of this research, finding the least cost remains exceptionally complicated; nevertheless, approximate solutions have been developed whose efficiency varies depending on the search space. Furthermore, tabu search (TS) is used to tackle this problem, as it is suited to solving many complicated problems. The algorithm has been adapted to address the exploration issue, where its methodology is not quite the same as the normal a
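A generic tabu-search skeleton of the kind usually adapted to CVRP is sketched below; the `cost` and `neighbours` callables (for CVRP, a neighbourhood might swap customers between routes), the tenure, and the aspiration rule are assumptions for illustration, not the paper's adjusted algorithm.

```python
def tabu_search(initial, cost, neighbours, tabu_tenure=7, max_iters=200):
    """Generic tabu-search skeleton (a sketch, not the paper's exact procedure)."""
    best = current = initial
    tabu = {}                                  # move -> iteration until which it stays tabu
    for it in range(max_iters):
        candidates = []
        for move, sol in neighbours(current):
            # Allow the move if it is not tabu, or if it improves on the best
            # solution found so far (aspiration criterion).
            if tabu.get(move, -1) < it or cost(sol) < cost(best):
                candidates.append((cost(sol), move, sol))
        if not candidates:
            break
        c, move, sol = min(candidates, key=lambda t: t[0])
        current = sol
        tabu[move] = it + tabu_tenure          # forbid reversing this move for a while
        if c < cost(best):
            best = sol
    return best
```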
The main problem when dealing with fuzzy data variables is that the data cannot be represented by a model estimated with the Fuzzy Least Squares Estimator (FLSE) method, which gives false estimates because the method is invalid when multicollinearity is present. To overcome this problem, the Fuzzy Bridge Regression Estimator (FBRE) method is relied upon to estimate a fuzzy linear regression model with triangular fuzzy numbers. Moreover, multicollinearity in the fuzzy data can be detected by using the Variance Inflation Factor when the input variables of the model are crisp and the output variable and parameters are fuzzy. The results were compared usin
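A minimal sketch of the Variance Inflation Factor computation on a crisp design matrix is shown below; the column-wise regression approach and the usual rule of thumb (VIF above roughly 10 signals multicollinearity) are standard, but the paper's fuzzy-regression specifics are not reproduced.

```python
import numpy as np

def vif(X):
    """Variance Inflation Factor for each column of a crisp design matrix X.

    VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from regressing column j
    on the remaining columns.
    """
    X = np.asarray(X, dtype=float)
    vifs = []
    for j in range(X.shape[1]):
        y = X[:, j]
        others = np.delete(X, j, axis=1)
        others = np.column_stack([np.ones(len(y)), others])   # add intercept
        beta, *_ = np.linalg.lstsq(others, y, rcond=None)
        resid = y - others @ beta
        r2 = 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
        vifs.append(1.0 / (1.0 - r2))
    return vifs
```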
Fuzzy C-Means (FCM) is a clustering method used for grouping similar data elements according to specific measurements, and Tabu Search is a heuristic algorithm. In this paper, a Probabilistic Tabu Search for FCM is implemented to find a global clustering based on the minimum value of the fuzzy objective function. The experiments are designed for different networks and numbers of clusters, and the results show the best performance based on a comparison of the objective function values obtained with standard FCM and with Tabu-FCM, averaged over ten runs.
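A minimal sketch of the fuzzy objective function being minimized, assuming the standard FCM form with fuzzifier m, Euclidean distances, and a membership matrix U (the probabilistic tabu moves themselves are not shown):

```python
import numpy as np

def fcm_objective(X, centers, U, m=2.0):
    """Standard Fuzzy C-Means objective J_m = sum_i sum_k u_ik^m * ||x_i - c_k||^2.

    X: (n, d) data points, centers: (c, d) cluster centers, U: (n, c) memberships.
    """
    dist2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)   # (n, c)
    return float(((U ** m) * dist2).sum())
```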
Finding communities of connected individuals in complex networks is challenging, yet crucial for understanding different real-world societies and their interactions. Recently, attention has turned to discovering the dynamics of such communities; however, detecting accurate community structures that evolve over time adds additional challenges. Almost all state-of-the-art algorithms are designed on seemingly the same principle, treating the problem as a coupled optimization model to simultaneously identify community structures and their evolution over time. Unlike these studies, the current work aims to consider individually these three measures, i.e. the intra-community score, the inter-community score, and the evolution of community over
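As a rough illustration of the first two measures, intra- and inter-community edge counts for a given partition can be computed as below; the paper's actual score definitions are not reproduced and this is only meant to show the two quantities being traded off.

```python
def community_scores(edges, membership):
    """Count edges inside communities and edges crossing communities.

    `edges` is an iterable of (u, v) pairs and `membership[node]` is the
    community label of a node (a simple sketch, not the paper's measures).
    """
    intra = inter = 0
    for u, v in edges:
        if membership[u] == membership[v]:
            intra += 1
        else:
            inter += 1
    return intra, inter
```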
Community detection is an important and interesting topic for better understanding and analyzing complex network structures. Detecting hidden partitions in complex networks is proven to be an NP-hard problem that may not be accurately resolved using traditional methods, so it is modeled in the literature as an optimization problem and solved using evolutionary computation methods. In recent years, many researchers have directed their research efforts toward the problem of community structure detection by developing different algorithms and making use of single-objective optimization methods. In this study, we have continued that research line by improving the Particle Swarm Optimization (PSO) algorithm using a local
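For reference, a sketch of the canonical PSO velocity and position update that such improvements typically build on; the inertia and acceleration coefficients are conventional defaults, and the discrete encoding usually needed for community detection is not shown.

```python
import numpy as np

def pso_step(positions, velocities, pbest, gbest, w=0.7, c1=1.5, c2=1.5):
    """One canonical PSO update (continuous form, for illustration only).

    positions, velocities, pbest: arrays of shape (n_particles, dim);
    gbest: array of shape (dim,) holding the swarm's best known position.
    """
    r1 = np.random.rand(*positions.shape)
    r2 = np.random.rand(*positions.shape)
    velocities = (w * velocities
                  + c1 * r1 * (pbest - positions)       # cognitive pull toward personal best
                  + c2 * r2 * (gbest - positions))      # social pull toward global best
    return positions + velocities, velocities
```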
The stochastic network calculus is an evolving new methodology for backlog and delay analysis of networks that can account for statistical multiplexing gain. This paper advances the stochastic network calculus by deriving a network service curve, which expresses the service given to a flow by the network as a whole in terms of a probabilistic bound. The presented network service curve permits the calculation of statistical end-to-end delay and backlog bounds for broad
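A minimal numerical sketch of the min-plus convolution that yields an end-to-end service curve for nodes in tandem, assuming deterministic rate-latency curves on a discrete time grid; the probabilistic error terms of the stochastic calculus are omitted here.

```python
import numpy as np

def min_plus_convolution(f, g):
    """(f ⊗ g)(t) = min over 0 <= s <= t of f(s) + g(t - s), on a unit time grid.

    In network calculus, the service offered by two nodes in tandem is the
    min-plus convolution of their per-node service curves.
    """
    T = len(f)
    out = np.empty(T)
    for t in range(T):
        out[t] = min(f[s] + g[t - s] for s in range(t + 1))
    return out

# Two rate-latency service curves beta(t) = R * max(t - T0, 0), illustrative values.
t = np.arange(50)
beta1 = 10 * np.maximum(t - 3, 0)
beta2 = 8 * np.maximum(t - 5, 0)
end_to_end = min_plus_convolution(beta1, beta2)   # rate min(10, 8), latency 3 + 5
```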
Multilevel models are among the most important models widely used in the analysis of data characterized by observations that take a hierarchical form. In our research we examined the multilevel logistic regression model (random intercept and random slope model). The importance of the research is that usual regression models calculate only the total variance of the model and are unable to capture variation between levels; in multilevel regression models, calculating only the total variance is inaccurate, and these models therefore estimate the variation at each level of the model. The research aims to estimate the parameters of this m
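A small simulation sketch can make the two-level structure of the random intercept and random slope logistic model concrete; all parameter values below are invented for illustration and are not the paper's estimates.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a two-level logistic model: each group gets its own intercept and
# slope drawn around the fixed effects (illustrative parameter values only).
n_groups, n_per_group = 30, 50
gamma0, gamma1 = -0.5, 1.2              # fixed effects
u0 = rng.normal(0, 0.8, n_groups)       # random intercepts (level-2 variation)
u1 = rng.normal(0, 0.5, n_groups)       # random slopes (level-2 variation)

group = np.repeat(np.arange(n_groups), n_per_group)
x = rng.normal(size=n_groups * n_per_group)
eta = (gamma0 + u0[group]) + (gamma1 + u1[group]) * x
p = 1 / (1 + np.exp(-eta))
y = rng.binomial(1, p)
# A mixed-effects GLM routine (e.g. statsmodels' BinomialBayesMixedGLM or
# R's glmer) would then recover the fixed effects and the level-2 variances.
```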
Glaucoma is a visual disorder and one of the significant driving causes of visual impairment. Glaucoma hinders the transmission of visual information to the brain. Unlike other eye illnesses such as myopia and cataracts, the impact of glaucoma cannot be cured; the Disc Damage Likelihood Scale (DDLS) can be used to assess glaucoma. The proposed methodology suggests a simple method to extract the neuroretinal rim (NRM) region, then divides the region into four sectors, after which the width of each sector is calculated and the minimum value is selected for use in the DDLS factor. The feature was fed to the SVM classification algorithm, and the DDLS successfully classified glaucoma d
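A minimal sketch of the final steps of the described pipeline, taking the minimum of four hypothetical sector widths and feeding a one-dimensional feature to an SVM; the training values and labels are invented for illustration and the rim-extraction stage is not shown.

```python
import numpy as np
from sklearn import svm

def min_sector_width(rim_widths):
    """Return the smallest of the four neuroretinal-rim sector widths,
    used here as a DDLS-style feature (a sketch, not the exact implementation)."""
    return float(np.min(rim_widths))

# Hypothetical training data: one rim-width feature per eye, label 1 = glaucoma.
X_train = np.array([[0.45], [0.40], [0.12], [0.08], [0.35], [0.05]])
y_train = np.array([0, 0, 1, 1, 0, 1])

clf = svm.SVC(kernel="linear")
clf.fit(X_train, y_train)

# Classify a new eye from its four measured sector widths (illustrative values).
feature = min_sector_width([0.22, 0.15, 0.09, 0.30])
print(clf.predict([[feature]]))
```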