In this review paper, several research studies were surveyed to help future researchers identify the available techniques in the field of infectious disease modeling across complex networks. Infectious disease modeling is becoming increasingly important because of the microbes and viruses that threaten people's lives and societies in every respect. Properly representing and analyzing spreading processes has long been a focus of research in many domains, including mathematical biology, physics, computer science, engineering, economics, and the social sciences. This survey first presents a brief overview of the previous literature, together with graphs and equations that clarify modeling on complex networks: the detection of communities and their medical information, the identification of nodes, how individuals interact and spread infection, the analysis of transmission through complex networks, and the mathematical methods developed over the past century. Second, the types of epidemiological models and complex networks, and the extent of their impact on humans, are presented.
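As a minimal, self-contained illustration of spreading on a network, the sketch below simulates a discrete-time SIR process on a random graph; the Erdős–Rényi topology, the parameters beta and gamma, and the function name are illustrative assumptions rather than a model taken from any of the surveyed studies.

```python
# Minimal discrete-time SIR simulation on a random graph (illustrative only).
import random
import networkx as nx

def sir_on_network(G, beta=0.05, gamma=0.1, seeds=1, steps=100, rng=random.Random(42)):
    """S -> I with probability beta per infected neighbour; I -> R with probability gamma."""
    state = {v: "S" for v in G}                      # everyone starts susceptible
    for v in rng.sample(list(G.nodes), seeds):       # seed the initial infections
        state[v] = "I"
    history = []
    for _ in range(steps):
        new_state = dict(state)
        for v in G:
            if state[v] == "S":
                k = sum(1 for u in G.neighbors(v) if state[u] == "I")
                if rng.random() < 1 - (1 - beta) ** k:   # infection pressure from neighbours
                    new_state[v] = "I"
            elif state[v] == "I" and rng.random() < gamma:
                new_state[v] = "R"
        state = new_state
        history.append(sum(1 for s in state.values() if s == "I"))
        if history[-1] == 0:                         # epidemic died out
            break
    return history

if __name__ == "__main__":
    G = nx.erdos_renyi_graph(n=1000, p=0.01, seed=1)
    print(sir_on_network(G)[:10])                    # infected counts per time step
```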
Background: Prosthodontic services have changed markedly due to the introduction of new materials, techniques, and treatment options. The aims of this study were to identify the types of materials and methods used by dental practitioners in their clinics to construct conventional complete dentures, to specify the types and designs of removable partial dentures (RPDs), and then to compare them with those taught in dental schools. Materials and methods: A total of 153 dental practitioners in Sulaimani city completed a written questionnaire. The questionnaire included 19 questions regarding the fabrication of complete dentures and RPDs. Results: Most of the practitioners provide complete dentures (81.6%) and RPDs (95.3%) in their clinics. Polyvinyl siloxane…
Artificial intelligence techniques reach us in several forms; some are useful, but they can also be exploited in ways that harm us. One of these forms is the deepfake. Deepfakes are used to completely modify video (or image) content so that it displays something that was not in it originally. The danger of deepfake technology lies in its impact on society through the loss of confidence in everything that is published. Therefore, in this paper we focus on deepfake-detection technology from the perspective of two concepts: deep learning and forensic tools. The purpose of this survey is to give the reader a deeper overview of (i) the environment of deepfake creation and detection, and (ii) how deep learning and forensic tools have contributed to the detection of deepfakes.
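To make the forensic-tools side concrete, the sketch below computes a classic error-level-analysis (ELA) map by recompressing a frame as JPEG and differencing it against the original; edited regions often change more unevenly than untouched ones. This is one well-known forensic cue, not the pipeline of the surveyed work, and the file name and quality setting are hypothetical.

```python
# Error-level analysis (ELA) sketch: recompress once and look at per-pixel differences.
import io
from PIL import Image, ImageChops

def error_level(image_path: str, quality: int = 90) -> Image.Image:
    original = Image.open(image_path).convert("RGB")
    buffer = io.BytesIO()
    original.save(buffer, "JPEG", quality=quality)        # recompress the frame once
    buffer.seek(0)
    recompressed = Image.open(buffer)
    return ImageChops.difference(original, recompressed)  # per-pixel error levels

if __name__ == "__main__":
    ela = error_level("suspect_frame.jpg")    # hypothetical input frame
    print(ela.getextrema())                   # large, uneven extrema can hint at tampering
```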
Text categorization refers to the process of grouping texts or documents into classes or categories according to their content. The text categorization process consists of three phases: preprocessing, feature extraction, and classification. In comparison to the English language, only a few studies have been done to categorize and classify the Arabic language. For a variety of applications, such as text classification and clustering, Arabic text representation is a difficult task because the Arabic language is noted for its richness, diversity, and complicated morphology. This paper presents a comprehensive analysis and comparison of the research of the last five years based on the dataset, year, algorithms, and the accuracy achieved.
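The three phases mentioned above can be illustrated with a minimal scikit-learn pipeline; the toy Arabic examples, the character n-gram TF-IDF features, and the linear SVM are assumptions chosen for demonstration, not the configuration of any surveyed study.

```python
# Preprocessing/feature extraction -> classification, in one small pipeline.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import Pipeline
from sklearn.svm import LinearSVC

docs = ["الاقتصاد ينمو هذا العام",         # economy
        "الفريق فاز في المباراة",           # sports
        "البنك يخفض اسعار الفائدة",         # economy
        "اللاعب سجل هدفين في الدوري"]       # sports
labels = ["economy", "sports", "economy", "sports"]

model = Pipeline([
    # character n-grams are a simple way to cope with rich, varied morphology
    ("features", TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4))),
    ("classifier", LinearSVC()),
])
model.fit(docs, labels)
print(model.predict(["المنتخب خسر المباراة"]))   # expected: ['sports']
```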
This study aims to shed light on the linguistic significance of collocation networks in the academic writing context, following Firth's principle that "You shall know a word by the company it keeps." The study examines the shared collocations of three selected nodes (i.e., research, study, and paper) in an academic context. This is achieved by using the corpus-linguistic tool GraphColl in the #LancsBox software, version 5, which was announced in June 2020, to analyze the selected nodes. The study focuses on the academic writing of two corpora that were designed and collected specifically to serve the purpose of the study. The corpora consist of a collection of abstracts extracted from two different academic journals that publish…
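As a rough illustration of what a collocation tool counts, the sketch below collects the words that appear within a ±5-token window of a node word; the toy corpus and window size are assumptions, and a tool such as GraphColl additionally applies association measures (e.g., MI or log-likelihood) rather than raw frequencies.

```python
# Count raw collocates of a node word within a symmetric window.
from collections import Counter

def collocates(tokens, node, window=5):
    counts = Counter()
    for i, tok in enumerate(tokens):
        if tok == node:
            span = tokens[max(0, i - window): i] + tokens[i + 1: i + 1 + window]
            counts.update(span)                  # words sharing the window with the node
    return counts

corpus = ("this study aims to examine the effect of feedback "
          "the present study was conducted to investigate").split()
print(collocates(corpus, "study").most_common(5))
```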
The advancement of digital technology has increased the deployment of wireless sensor networks (WSNs) in our daily lives. However, locating sensor nodes is a challenging task in WSNs, and sensed data without an accurate location are worthless, especially in critical applications. The pioneering technique among range-free localization schemes is the sequential Monte Carlo (SMC) method, which utilizes network connectivity to estimate sensor locations without additional hardware. This study presents a comprehensive survey of state-of-the-art SMC localization schemes. We present the schemes as a thematic taxonomy of the localization operations in SMC. Moreover, the critical characteristics of each existing scheme are analyzed to identify its advantages…
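The core SMC idea of constraining candidate positions by connectivity can be sketched in a few lines; the single filtering step below (uniform sampling, then discarding particles outside the radio range of heard anchors) is a deliberate simplification of the schemes the survey classifies, and the anchor coordinates and range value are illustrative.

```python
# One-step range-free SMC localization: keep particles consistent with connectivity.
import math
import random

def smc_localize(anchors, radio_range, area=100.0, n_particles=2000, rng=random.Random(0)):
    """anchors: (x, y) positions of one-hop anchor neighbours heard by the unknown node."""
    particles = [(rng.uniform(0, area), rng.uniform(0, area)) for _ in range(n_particles)]
    # filtering: a particle survives only if it lies within range of every heard anchor
    valid = [p for p in particles
             if all(math.dist(p, a) <= radio_range for a in anchors)]
    if not valid:
        return None                               # in practice, resample with more particles
    x = sum(p[0] for p in valid) / len(valid)     # estimate = mean of surviving particles
    y = sum(p[1] for p in valid) / len(valid)
    return x, y

print(smc_localize(anchors=[(20, 20), (30, 25), (25, 35)], radio_range=15))
```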
In the present paper, an eco-epidemiological model consisting of diseased prey consumed by a predator with a fear cost and a hunting-cooperation property is formulated and studied. It is assumed that the predator does not distinguish between healthy prey and sick prey and hence consumes both. The solution's properties, such as existence, uniqueness, positivity, and boundedness, are discussed. The existence and stability conditions of all possible equilibrium points are studied. The persistence requirements of the proposed system are established. The bifurcation analysis near the non-hyperbolic equilibrium points is investigated. Numerically, some simulations are carried out to validate the main findings and obtain the critical values of the bifurcation parameters.
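For readers unfamiliar with this class of models, the sketch below integrates a generic prey–predator system with a disease in the prey, a fear factor, and a cooperative attack rate; the functional forms and all parameter values are assumptions made for illustration and are not the equations of the paper described above.

```python
# Illustrative prey(S, I)-predator(P) system with fear factor 1/(1 + f*P),
# mass-action infection, and a cooperative attack rate a + c*P.
import numpy as np
from scipy.integrate import solve_ivp

def model(t, y, r=1.0, K=10.0, f=0.5, beta=0.4, a=0.3, c=0.05, e=0.4, d=0.2, mu=0.1):
    S, I, P = y                                  # susceptible prey, infected prey, predator
    attack = a + c * P                           # hunting cooperation raises the attack rate
    dS = r * S * (1 - (S + I) / K) / (1 + f * P) - beta * S * I - attack * S * P
    dI = beta * S * I - attack * I * P - mu * I
    dP = e * attack * (S + I) * P - d * P        # predator consumes both prey classes
    return [dS, dI, dP]

sol = solve_ivp(model, (0, 200), [5.0, 1.0, 1.0], t_eval=np.linspace(0, 200, 400))
print(sol.y[:, -1])                              # long-run state (S, I, P)
```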
The objective of the conventional well-testing technique is to evaluate the well-reservoir interaction by determining the flow capacity and well potential on a short-term basis, relying on the transient-pressure-response methodology. Well-test analysis is a major input to the reservoir simulation model, used to validate the near-wellbore characteristics and to update the variables that are normally functions of time, such as skin, permeability, and productivity multipliers.
Well-test analysis models are normally built on analytical approaches that use the fundamental physics of a homogeneous medium with a line-source solution. Many developments were made in the last decade to increase the resolution of the transient-response derivation to meet the…
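The line-source solution mentioned above is the classical exponential-integral (Theis-type) drawdown formula; the sketch below evaluates it in consistent SI units, with reservoir and rate values that are purely illustrative.

```python
# Line-source drawdown: dp = (q*mu / (4*pi*k*h)) * E1(phi*mu*ct*r^2 / (4*k*t)), SI units.
import numpy as np
from scipy.special import exp1                    # E1(x) = -Ei(-x)

def line_source_drawdown(r, t, q, mu, k, h, phi, ct):
    """Pressure drop (Pa) at radius r (m) and time t (s) for a constant rate q (m^3/s)."""
    return q * mu / (4 * np.pi * k * h) * exp1(phi * mu * ct * r**2 / (4 * k * t))

dp = line_source_drawdown(r=0.1, t=3600.0,        # wellbore radius 0.1 m, after 1 hour
                          q=0.002, mu=1e-3,       # 0.002 m^3/s, viscosity 1 cP
                          k=1e-13, h=10.0,        # ~100 mD permeability, 10 m thickness
                          phi=0.2, ct=1e-9)       # porosity 0.2, total compressibility 1e-9 1/Pa
print(f"drawdown ≈ {dp / 1e5:.2f} bar")
```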