Agent-based modeling is currently used extensively to analyze complex systems. This growth has been supported by its ability to convey distinct levels of interaction in a complex, detailed environment. At the same time, agent-based models tend to become progressively more complex, so powerful modeling and simulation techniques are needed to address this rise in complexity. In recent years, a number of platforms for developing agent-based models have been created. In practice, most agent-based models present only a discrete representation of the environment and a single level of interaction; two or three levels are rarely considered. The key issue is that modellers' work in these areas is not assisted by existing simulation platforms.
Performance issues can appear anywhere in a computer system, and finding their root cause is troublesome due to the complexity of modern systems and applications. Microsoft builds multiple mechanisms to help its engineers understand what is happening inside all Windows versions, including Windows 10 Home, and the behavior of any application running on them, whether Microsoft services or third-party applications. One of those mechanisms is Event Tracing for Windows (ETW), the core of logging and tracing in the Windows operating system, which traces the internal events of the system and its applications. This study investigates internal process activities in depth.
The structure of networks, whose discovery is known as community detection, has received great attention in diverse fields, including the social sciences, biology, politics, etc. A large number of studies and practical approaches have been designed to solve the problem of finding the structure of a network. Defining a complex network model based on clustering is an NP-hard problem, and there are no ideal techniques for defining the clustering. Here, we present a statistical approach based on the likelihood function of a Stochastic Block Model (SBM). The objective is to define the general model and select the best model with high quality.
Fuzzy C-means (FCM) is a clustering method that groups similar data elements according to specific measurements, and Tabu search is a heuristic algorithm. In this paper, a Probabilistic Tabu Search for FCM is implemented to find a global clustering based on the minimum value of the fuzzy objective function. The experiments were designed for different networks and numbers of clusters; the results show the best performance based on a comparison between the objective-function values of standard FCM and Tabu-FCM, averaged over ten runs.
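As an illustration of the objective being minimised, a minimal pure-Python FCM sketch (the toy data and parameter choices are hypothetical, not the paper's networks) alternates centroid and membership updates to reduce J_m = Σ_i Σ_j u_ij^m ‖x_i − c_j‖²:

```python
import random

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def fcm(points, k, m=2.0, iters=100, seed=0):
    """Minimal Fuzzy C-means: alternate centroid and membership updates
    to reduce J_m = sum_i sum_j u_ij**m * ||x_i - c_j||**2."""
    rng = random.Random(seed)
    n, dim = len(points), len(points[0])
    # random memberships, each row normalised to sum to 1
    u = []
    for _ in range(n):
        row = [rng.random() + 1e-9 for _ in range(k)]
        s = sum(row)
        u.append([v / s for v in row])
    for _ in range(iters):
        # centroids: means weighted by u_ij**m
        centers = []
        for j in range(k):
            w = [u[i][j] ** m for i in range(n)]
            tot = sum(w)
            centers.append(tuple(sum(w[i] * points[i][d] for i in range(n)) / tot
                                 for d in range(dim)))
        # memberships: u_ij inversely related to the distance to each centroid
        for i in range(n):
            d2 = [max(dist2(points[i], c), 1e-12) for c in centers]
            for j in range(k):
                u[i][j] = 1.0 / sum((d2[j] / d2[l]) ** (1.0 / (m - 1))
                                    for l in range(k))
    J = sum(u[i][j] ** m * dist2(points[i], centers[j])
            for i in range(n) for j in range(k))
    return centers, u, J

pts = [(0, 0), (0, 1), (1, 0), (8, 8), (8, 9), (9, 8)]
centers, u, J = fcm(pts, 2)
```

A Tabu-FCM variant would wrap this inner loop in a search that perturbs cluster assignments while forbidding recently visited solutions.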
Information is an essential and valuable asset in all systems. The more information you have about your problem, the better you can adapt to the world around you. Moreover, information distinguishes companies and provides the influence that helps one company be more effective than another. Protecting this information with better security controls while granting a high level of access to authorized parties has therefore become an urgent need. As a result, many algorithms and encryption techniques have been developed to provide a high level of protection for system information. This paper presents an enhancement to the Blowfish algorithm, one of the cryptographic techniques, and proposes an enhancement for increasing its efficiency.
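Blowfish is a 16-round Feistel cipher with key-dependent S-boxes; the paper's specific enhancement is not reproduced here. As a minimal sketch of the Feistel structure alone, with a hypothetical round function and round keys standing in for Blowfish's real F function and P-array:

```python
def feistel_encrypt(block64, round_keys, f):
    """Generic Feistel network over a 64-bit block (Blowfish follows this
    structure with 16 rounds; this sketch is NOT Blowfish itself)."""
    L, R = block64 >> 32, block64 & 0xFFFFFFFF
    for k in round_keys:
        L, R = R, L ^ (f(R ^ k) & 0xFFFFFFFF)
    return (L << 32) | R

def feistel_decrypt(block64, round_keys, f):
    """Inverse: the same structure with the round keys applied in reverse."""
    L, R = block64 >> 32, block64 & 0xFFFFFFFF
    for k in reversed(round_keys):
        R, L = L, R ^ (f(L ^ k) & 0xFFFFFFFF)
    return (L << 32) | R

# hypothetical round function and keys, placeholders for Blowfish's F/P-array
f = lambda x: (x * 2654435761) & 0xFFFFFFFF
keys = [0x9E3779B9, 0x243F6A88, 0xB7E15162, 0x13198A2E]
ct = feistel_encrypt(0x0123456789ABCDEF, keys, f)
```

The Feistel construction is invertible regardless of the round function, which is why enhancements can target F (or the key schedule) without breaking decryption.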
The energy backpropagation (EBP) algorithm has used a network containing two nodes in each of its input, hidden, and output layers. In this paper, we use networks containing three nodes and four nodes in the input, hidden, and output layers. This study compares learning times, identification times, and convergence times across the three networks (2, 3, and 4 nodes) in the energy backpropagation algorithm. The experimental results show that the networks containing three and four nodes perform better in learning time than the two-node network, while the two-node network performs better in identification time.
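The abstract does not specify the EBP update rules; as a general, minimal illustration of backpropagation on a small sigmoid network (trained here on the separable OR function, with hypothetical layer sizes and learning rate):

```python
import math
import random

def train(hidden=3, epochs=3000, lr=0.5, seed=1):
    """Plain backpropagation (squared error, sigmoid units) on a tiny
    2-input, one-hidden-layer network; illustrative only, not the paper's
    exact EBP variant. Trained on the separable OR function."""
    rng = random.Random(seed)
    sig = lambda x: 1.0 / (1.0 + math.exp(-x))
    w1 = [[rng.uniform(-1, 1) for _ in range(hidden)] for _ in range(2)]
    b1 = [0.0] * hidden
    w2 = [rng.uniform(-1, 1) for _ in range(hidden)]
    b2 = 0.0
    data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]  # OR
    losses = []
    for _ in range(epochs):
        total = 0.0
        for (x1, x2), t in data:
            # forward pass
            h = [sig(x1 * w1[0][j] + x2 * w1[1][j] + b1[j]) for j in range(hidden)]
            y = sig(sum(h[j] * w2[j] for j in range(hidden)) + b2)
            total += (y - t) ** 2
            # backward pass: chain rule through the sigmoids and weights
            dy = 2 * (y - t) * y * (1 - y)
            for j in range(hidden):
                dh = dy * w2[j] * h[j] * (1 - h[j])
                w2[j] -= lr * dy * h[j]
                w1[0][j] -= lr * dh * x1
                w1[1][j] -= lr * dh * x2
                b1[j] -= lr * dh
            b2 -= lr * dy
        losses.append(total)
    return losses

losses = train()
```

Comparing nets of different sizes, as the paper does, amounts to timing how many such epochs each net needs to reach a target loss.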
In this study, the mobile phone traces concern an ephemeral event that gathers large densities of people. This research aims to study the city pulse and the evolution of human mobility during a specific event (the Armada festival) by modelling and simulating the human mobility of the observed region based on Call Detail Records (CDRs). The pivotal questions of this research are: Why study human mobility? What are the human life patterns in the observed region inside Rouen city during the Armada festival? How can life patterns and individuals' mobility be extracted for this region from the mobile database (CDRs)? The radius of gyration parameter has been applied to elaborate human life patterns with regard to working and off days.
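The radius of gyration mentioned above is conventionally the root-mean-square distance of a user's visited locations from their centre of mass; a minimal sketch (planar coordinates for simplicity, whereas real CDR work would use geodesic distances):

```python
import math

def radius_of_gyration(points):
    """RMS distance of a user's visited locations from their centre of mass."""
    n = len(points)
    cx = sum(x for x, _ in points) / n
    cy = sum(y for _, y in points) / n
    return math.sqrt(sum((x - cx) ** 2 + (y - cy) ** 2 for x, y in points) / n)

# e.g. four visits at the corners of a 2 x 2 square -> sqrt(2)
rg = radius_of_gyration([(0, 0), (2, 0), (0, 2), (2, 2)])
```

Computed separately for working and off days, this single number summarises how far an individual typically ranges on each kind of day.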
The No Mobile Phone Phobia, or Nomophobia, notion refers to the psychological condition in which humans fear being disconnected from mobile phone connectivity. It is therefore considered a recent-age phobia that has emerged as a consequence of the high engagement between people, mobile data, and communication inventions, especially smartphones. This review is based on earlier observations and the current debate, such as the techniques commonly used to model and analyze this phenomenon, like statistical studies, in order to reach a preferable comprehension of human reactions to the speedy ubiquity of technology. Accordingly, humans ought to restrict their use of mobile phones rather than prohibit it.
Density-based spatial clustering of applications with noise (DBSCAN) is one of the most popular clustering methods in data mining; it is used to identify useful patterns and interesting distributions in the underlying data, and to aggregate nonlinearly grouped data. In particular, DNA methylation and gene expression data show sites that are differentially skewed by distance and grouped nonlinearly by cancer disease and by changes in gene expression. Under these conditions, DBSCAN is expected to have desirable clustering features that can be used to show the results of those changes. This research reviews DBSCAN and compares its performance with other, more traditional algorithms.
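A minimal sketch of the DBSCAN procedure described above (naive O(n²) neighbour search for clarity; production implementations use spatial indexes, and the `eps`/`min_pts` values below are illustrative):

```python
import math

def dbscan(points, eps, min_pts):
    """Minimal DBSCAN: returns cluster ids 0, 1, ... and -1 for noise."""
    n = len(points)
    def neighbours(i):
        return [j for j in range(n) if math.dist(points[i], points[j]) <= eps]
    labels = [None] * n
    cid = 0
    for i in range(n):
        if labels[i] is not None:
            continue
        nbrs = neighbours(i)
        if len(nbrs) < min_pts:
            labels[i] = -1              # noise (may become a border point later)
            continue
        labels[i] = cid                 # i is a core point: start a new cluster
        queue = [j for j in nbrs if j != i]
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cid         # noise reached from a core point: border
            if labels[j] is not None:
                continue
            labels[j] = cid
            nbrs_j = neighbours(j)
            if len(nbrs_j) >= min_pts:  # j is itself a core point: keep expanding
                queue.extend(nbrs_j)
        cid += 1
    return labels

pts = [(0, 0), (0, 0.5), (0.5, 0), (0.4, 0.4),
       (5, 5), (5, 5.5), (5.5, 5), (5.4, 5.4), (10, 10)]
labels = dbscan(pts, eps=1.0, min_pts=3)
```

Because clusters grow by density reachability rather than by distance to a centroid, DBSCAN can recover the nonlinearly shaped groups the abstract refers to, with isolated points marked as noise.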
The continuous advancement in the use of the IoT has greatly transformed industries, though at the same time it has made IoT networks vulnerable to highly advanced cybercrime. Traditional security measures for the IoT have several limitations, and protecting distributed and adaptive IoT systems requires new approaches. This research presents novel threat intelligence for IoT networks based on deep learning that maintains compliance with IEEE standards. The goal of the study is to interweave artificial intelligence with standardization frameworks and thus improve the identification, protection, and mitigation of cyber threats affecting IoT environments. The study is systematic and begins by examining IoT-specific threats.
In this note, we present a component-wise algorithm combining several recent ideas from signal processing for the simultaneous decomposition of dynamical time series into a piecewise-constant trend, seasonality, outliers, and noise. Our approach is based entirely on convex optimisation, and our decomposition is guaranteed to be a global optimiser. We demonstrate the efficiency of the approach via simulation results and real data analysis.
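The abstract does not state the exact objective; a typical convex formulation of such a decomposition (the notation here is assumed, not the authors') is:

```latex
\min_{t,\,s,\,o}\;\tfrac{1}{2}\,\lVert y - t - s - o \rVert_2^2
  \;+\; \lambda_1 \lVert D t \rVert_1
  \;+\; \lambda_2 \lVert o \rVert_1
\quad\text{s.t.}\quad s_i = s_{i+P},\qquad \sum_{i=1}^{P} s_i = 0,
```

where $y$ is the observed series, $D$ is the first-difference operator (so the $\ell_1$ penalty on $Dt$ promotes a piecewise-constant trend), the $\ell_1$ penalty on $o$ promotes sparse outliers, and the constraints enforce a zero-mean period-$P$ seasonal component. Every term is convex, which is what makes a global optimiser attainable.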
Groupwise non-rigid image alignment is a difficult non-linear optimization problem involving many parameters and often large datasets. Previous methods have explored various metrics and optimization strategies. Good results have previously been achieved with simple metrics requiring complex optimization, often with many unintuitive parameters that need careful tuning for each dataset. In this chapter, the problem is restructured to use a simpler, iterative optimization algorithm with very few free parameters. The warps are refined using an iterative Levenberg-Marquardt minimization to the mean, based on updating the locations of a small number of points and incorporating a stiffness constraint. This optimization approach is effective.
Pilots are trained using computerized flight simulators. A flight simulator is a training system in which pilots can acquire flying skills without needing to practice on a real airplane. Simulators are used by professional pilots to practice flying strategies under emergency or hazardous conditions, or to train on new aircraft types. In this study a framework for flight simulation is presented and the layout of an implemented program is described. The calculations were based on a simple theoretical approach. The implementation utilized some of the utilities supported by ActiveX, DirectX, and OpenGL, written in Visual C++. The main design consideration is to build a simple flight simulation program that can operate without requiring high-end computing resources.
Communication networks (mobile phone networks, social media platforms) produce digital traces of their usage. This type of information helps in understanding and analyzing human mobility very accurately. Applied over cities, such analyses can provide powerful data on daily citizen activities, giving urban planners relevant indications for decision-making on design and development. In addition, Call Detail Records (CDRs) provide valuable spatiotemporal data at the citywide or even nationwide level. CDRs can be analyzed to extract life patterns and individuals' mobility in an observed urban area and during ephemeral events, and their analysis gives conceptual views of human density and mobility patterns.
Data mining plays a most important role in healthcare for discovering hidden relationships in big datasets, especially in breast cancer diagnostics; breast cancer is one of the most common causes of death in the world. In this paper two algorithms, decision tree and K-Nearest Neighbour, are applied to diagnose breast cancer grade in order to reduce its risk to patients. In the decision tree with feature selection, the Gini index gives an accuracy of 87.83%, while feature selection with entropy gives an accuracy of 86.77%. In both cases, Age appeared as the most effective parameter, particularly when Age < 49.5, whereas Ki67 appeared as the second most effective parameter. Furthermore, K-Nearest Neighbour is based on the minimum distance.
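As an illustration of the two split criteria compared above (the data here are hypothetical, not the paper's patient records), the Gini index, entropy, and the impurity reduction a decision tree maximises when choosing a cut such as Age < 49.5 can be sketched as:

```python
import math
from collections import Counter

def gini(labels):
    """Gini impurity 1 - sum_k p_k^2 (0 for a pure node)."""
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def entropy(labels):
    """Shannon entropy in bits (0 for a pure node, 1 for a balanced binary one)."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def split_gain(labels, values, threshold, impurity=gini):
    """Impurity reduction from splitting on `values < threshold` --
    the quantity a decision tree maximises when selecting a cut point."""
    left = [l for l, v in zip(labels, values) if v < threshold]
    right = [l for l, v in zip(labels, values) if v >= threshold]
    n = len(labels)
    return (impurity(labels)
            - len(left) / n * impurity(left)
            - len(right) / n * impurity(right))
```

Swapping `impurity=entropy` into `split_gain` reproduces the information-gain criterion; the two usually select similar splits, which is consistent with the close accuracies reported above.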
Interface evaluation has been the subject of extensive study and research in human-computer interaction (HCI). According to specialists in the field, it is a crucial tool for promoting the idea that user engagement with computers should resemble casual conversations and interactions between individuals. Researchers in the HCI field initially focused on making various computer interfaces more usable, thus improving the user experience. This study's objectives were to evaluate and enhance the user interface of the University of Baghdad's implementation of an online academic management system using the effectiveness, time-based efficiency, and satisfaction rates that comply with the task questionnaire process. We made a variety of interfaces.
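The metrics named above have standard definitions in usability testing; one common form (the data layout below is an assumption, not the study's) computes effectiveness as the task completion rate and time-based efficiency as the mean rate of goals achieved per unit time:

```python
def effectiveness(results):
    """Completion rate: results[u][t] is 1 if user u completed task t, else 0."""
    total = sum(len(r) for r in results)
    return sum(map(sum, results)) / total

def time_based_efficiency(results, times):
    """Mean goals-per-second over all user/task pairs: average of n_ut / t_ut."""
    vals = [results[u][t] / times[u][t]
            for u in range(len(results)) for t in range(len(results[u]))]
    return sum(vals) / len(vals)

# hypothetical session: 2 users x 2 tasks, completion flags and times in seconds
results = [[1, 1], [1, 0]]
times = [[10, 20], [5, 40]]
```

Satisfaction, the third metric, is typically gathered from post-task questionnaire scores rather than computed from logs.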
This paper uses analysis based on Artificial Intelligence (AI) algorithms to classify breast cancer Deoxyribonucleic acid (DNA). The main idea is to focus on the application of machine learning and deep learning techniques. Furthermore, a genetic algorithm is used in diagnosing gene expression to reduce the number of misclassified cancers. After patients' genetic data are entered, processing operations that fill in the missing values using different techniques are applied. The best data for the classification process are chosen by combining each technique using the genetic algorithm and comparing them in terms of accuracy.
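A hedged sketch of genetic-algorithm feature selection of the kind described (the fitness function below is a toy stand-in for classifier accuracy, not the paper's pipeline):

```python
import random

def ga_select(n_feats, fitness, pop=30, gens=60, seed=0):
    """Toy genetic algorithm for feature selection: individuals are 0/1
    feature masks evolved with tournament selection, uniform crossover,
    bit-flip mutation, and elitism."""
    rng = random.Random(seed)
    P = [[rng.randint(0, 1) for _ in range(n_feats)] for _ in range(pop)]
    best = max(P, key=fitness)
    for _ in range(gens):
        nxt = [best[:]]                                # keep the elite mask
        while len(nxt) < pop:
            a = max(rng.sample(P, 3), key=fitness)     # tournament of 3
            b = max(rng.sample(P, 3), key=fitness)
            child = [x if rng.random() < 0.5 else y    # uniform crossover
                     for x, y in zip(a, b)]
            if rng.random() < 0.3:                     # occasional bit-flip
                child[rng.randrange(n_feats)] ^= 1
            nxt.append(child)
        P = nxt
        best = max(P + [best], key=fitness)
    return best

# toy fitness: features 0-4 are "informative", the rest are penalised noise;
# in the paper's setting this would be cross-validated classifier accuracy
fit = lambda mask: sum(mask[:5]) - 0.2 * sum(mask[5:])
best = ga_select(12, fit)
```

In the gene-expression setting, each mask selects a gene subset and the fitness evaluation trains and scores a classifier on that subset.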
Investigating human mobility patterns is a highly interesting field in the 21st century, and it attracts vast attention from multidisciplinary scientists in physics, economics, social science, computer science, engineering, etc., based on the concept that relates human mobility patterns to human communications. Hence, the necessity for a rich repository of data has emerged, and the most powerful solution is the use of GSM network data, which yields millions of Call Detail Records gathered from urban regions. However, the available data still have shortcomings, because they give an indication of spatio-temporal position only at the moments of mobile communication activity.