Finding the shortest route in a wireless mesh network is an important problem. Many techniques have been used to solve it, including dynamic programming, evolutionary algorithms, and weighted-sum techniques. In this paper, we use dynamic programming to find the shortest path in wireless mesh networks because of its generality, reduced complexity, ease of numerical computation, simplicity in incorporating constraints, and conformity to the stochastic nature of some problems. The routing problem is a multi-objective optimization problem with constraints such as path capacity and end-to-end delay. This work proposes single-constraint routing solutions based on the Dijkstra, Bellman-Ford, and Floyd-Warshall algorithms and discusses the differences between them. These algorithms find the shortest route by finding the optimal rate between two nodes in the wireless network subject to a bounded end-to-end delay. The Dijkstra-based algorithm is especially favorable in terms of processing time. We also compare our proposed single-constraint Dijkstra-based routing algorithm with the mesh routing algorithm (MRA) from the literature to clarify the merits of the former.
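The abstract above does not spell out the proposed algorithm, but one standard way to maximize the rate between two nodes under a bounded end-to-end delay is to try candidate bottleneck rates from highest to lowest and, for each, run a delay-minimizing Dijkstra restricted to links of at least that rate. The sketch below illustrates that idea; all names (`graph` as an adjacency dict of `(neighbor, rate, delay)` triples, `max_rate_within_delay`) are illustrative assumptions, not the paper's actual implementation.

```python
import heapq

def shortest_delay(graph, src, dst, min_rate):
    # Dijkstra on link delay, using only edges whose rate >= min_rate.
    dist = {src: 0.0}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            return d
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, rate, delay in graph.get(u, []):
            if rate < min_rate:
                continue  # link too slow for this candidate rate
            nd = d + delay
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return float("inf")  # dst unreachable at this rate

def max_rate_within_delay(graph, src, dst, delay_bound):
    # Try candidate bottleneck rates from highest to lowest; the first
    # rate whose delay-shortest path meets the bound is the optimum.
    rates = sorted({rate for edges in graph.values()
                    for _, rate, _ in edges}, reverse=True)
    for r in rates:
        if shortest_delay(graph, src, dst, r) <= delay_bound:
            return r
    return None  # no path satisfies the delay bound
```

With E distinct link rates this runs Dijkstra at most E times, which keeps the single-constraint problem polynomial rather than resorting to general multi-constrained (NP-hard) routing.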
This research studies two approaches to the development of industrial products and designs: progressive development (the typical path) and radical development (the leap design). The aim of the research was to determine the effectiveness of the typical pattern and the leap pattern in the development of designs and industrial products. After analyzing a research sample of two models of contemporary household electrical appliances, a set of findings and conclusions was reached, including: 1- Leap designs changed many of the user's entrenched perceptions of how the product works and is used, and of its size and shape, revealing to the user the possibilities of sophisticated relationships with the product, while keeping the typical desi
Plagiarism is becoming more of a problem in academics. It is made worse by the ease with which a wide range of resources can be found on the internet, and by the ease with which they can be copied and pasted. It is academic theft, since the perpetrator has "taken" and presented the work of others as his or her own. Manual detection of plagiarism by a human being is difficult, imprecise, and time-consuming, because it is hard for anyone to compare a piece of work against all existing material. Plagiarism is a big problem in higher education, and it can happen on any topic. Plagiarism detection has been studied in many scientific articles, and methods for recognition have been created utilizing plagiarism analysis, authorship identification, and
Contemporary life races against time in its temptations and variables, and its various aspects and fields are being shaped and changed in remarkable ways. This was facilitated by intellectual and scientific communication between civilizations and by the rapid succession of inventions and discoveries in the sciences and arts of knowledge, which contributed to a great economic and commercial renaissance. These economic developments then drew the world into very strong competition, which forced producers to calculate all production costs so as to reach the highest profits by reducing the price of the produced commodity on the one hand, and achieving quality, especially in appearance, on the other. Since the ma
In this review paper, several research studies were surveyed to assist future researchers in identifying available techniques in the field of infectious disease modeling across complex networks. Infectious disease modelling is becoming increasingly important because of the microbes and viruses that threaten people's lives and societies in all respects. Properly representing and analyzing spreading processes has long been a focus of research in many domains, including mathematical biology, physics, computer science, engineering, economics, and the social sciences. This survey first presents a brief overview of previous literature, together with graphs and equations that clarify modeling in complex networks, the detection of soc
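The survey above concerns models of spreading processes on networks. As a concrete illustration of the kind of model such surveys cover, here is a minimal discrete-time SIR (Susceptible-Infected-Recovered) simulation on an arbitrary contact graph; the function names and parameter choices (`beta` for per-contact infection probability, `gamma` for recovery probability) are a generic sketch, not any specific model from the surveyed literature.

```python
import random

def sir_step(adj, state, beta, gamma, rng):
    # One synchronous update: S -> I via infected neighbors with prob. beta,
    # I -> R with prob. gamma. `adj` maps each node to its neighbor list.
    new_state = dict(state)
    for node, s in state.items():
        if s == "S":
            for nb in adj[node]:
                if state[nb] == "I" and rng.random() < beta:
                    new_state[node] = "I"
                    break
        elif s == "I" and rng.random() < gamma:
            new_state[node] = "R"
    return new_state

def simulate_sir(adj, seed_nodes, beta=0.3, gamma=0.1, steps=50, rng=None):
    # Run the epidemic from a set of initially infected seed nodes and
    # return the full state history (one dict per time step).
    rng = rng or random.Random(0)
    state = {n: ("I" if n in seed_nodes else "S") for n in adj}
    history = [state]
    for _ in range(steps):
        state = sir_step(adj, state, beta, gamma, rng)
        history.append(state)
    return history
```

On a complex network the final epidemic size depends strongly on the graph's degree distribution, which is why the network structure, and not just `beta` and `gamma`, is central to the models this survey reviews.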
The aim of this paper is to compare classical and fuzzy filters for removing different types of noise from grayscale images. The processing consists of three steps. First, different types of noise are added to the original image, at different noise ratios, to produce a noisy image. Second, classical and fuzzy filters are applied to the noisy image. Finally, the resulting images are compared using a quantitative measure, the Peak Signal-to-Noise Ratio (PSNR), to determine the best filter in each case.
The image used in this paper is 512 × 512 pixels, and every filter uses a square window of size 3 × 3. Results indicate that fuzzy filters achieve varying degrees of success in noise reduction compared to
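The three-step pipeline described above (add noise, filter with a 3 × 3 window, compare by PSNR) can be sketched as follows. This uses a classical 3 × 3 median filter as the stand-in filter and salt-and-pepper noise as the noise type; the fuzzy filters of the paper are not reproduced here, and all function names are illustrative.

```python
import math
import random

def add_salt_pepper(img, ratio, rng):
    # Flip roughly `ratio` of the pixels to 0 or 255 (salt-and-pepper noise).
    noisy = [row[:] for row in img]
    h, w = len(img), len(img[0])
    for _ in range(int(ratio * h * w)):
        y, x = rng.randrange(h), rng.randrange(w)
        noisy[y][x] = rng.choice((0, 255))
    return noisy

def median_filter_3x3(img):
    # Classical 3x3 median filter; border pixels are left unchanged for brevity.
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = sorted(img[y + dy][x + dx]
                            for dy in (-1, 0, 1) for dx in (-1, 0, 1))
            out[y][x] = window[4]  # median of the 9 window values
    return out

def psnr(ref, img, peak=255.0):
    # Peak Signal-to-Noise Ratio in dB; higher means closer to the reference.
    mse = sum((a - b) ** 2 for r1, r2 in zip(ref, img) for a, b in zip(r1, r2))
    mse /= len(ref) * len(ref[0])
    return float("inf") if mse == 0 else 10 * math.log10(peak ** 2 / mse)
```

A higher PSNR for the filtered image than for the noisy one indicates successful noise reduction, which is exactly the comparison criterion the paper applies to each filter and noise type.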
Extracting knowledge from raw data has delivered beneficial information in several domains. The widespread use of social media has produced extraordinary quantities of social data. Put simply, social media provides an accessible platform for users to share information, and data mining can uncover relevant patterns that are useful to users, businesses, and customers. Social media data are noisy, massive, unstructured, and dynamic by nature, so new challenges arise. The purpose of this study is to investigate the data mining methods used on social networks, adopting an investigation plan based on defined criteria and selecting a number of papers to serve as the foundation for this arti
Acute appendicitis is one of the most common causes of the acute abdomen. There is wide discussion and controversy over the surgical versus nonsurgical treatment of acute uncomplicated appendicitis. The aim of this study was to evaluate the efficacy and outcomes of conservative management of selected cases of acute appendicitis with an antibiotics-first plan.
This was a single hospital-based prospective study with a durat