Mobile communication networks have become an integral part of everyday life, transferring huge amounts of data between communicating devices, which leads to new challenges. According to the Cisco Networking Index, more than 29.3 billion networked devices will be connected to the network by 2023. It is clear that the existing infrastructure of current networks will not be able to support all of the generated data because of bandwidth limits and processing and transmission overhead. To cope with these issues, future mobile communication networks must meet demanding requirements: reducing the amount of transferred data, latency, and computation costs. One of the essential and challenging tasks in this area is optimal self-organized service placement. In this paper, a heuristic-based algorithm for service placement in future networks is presented. The algorithm approaches an ideal placement of service replicas by monitoring the load on a server and its neighborhood, choosing the neighbor that contributes the highest received load, and replicating or migrating the service to it based on specific criteria, so that services are placed close to clients and request paths become as short as possible. The proposed algorithm serves requests in a shorter time, with less bandwidth, and thus at a lower communication cost. It was compared with the traditional client-server approach and a random placement algorithm. Experimental results show that the heuristic algorithm outperforms the other approaches and achieves near-optimal performance across different network sizes and varying load scenarios.
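As a rough illustration of the replicate-or-migrate decision described above, the following Python sketch shows one placement step on a single node; it is not the paper's exact algorithm, and all thresholds and parameter names are hypothetical.

```python
# Illustrative sketch of one load-driven placement decision (not the paper's exact
# algorithm): a node monitors the load it receives per neighbor for a service and
# either keeps, replicates, or migrates the service toward the heaviest contributor.
# Threshold values and parameter names are assumptions for illustration.

def placement_step(load_from, capacity, replicate_at=0.6, migrate_at=0.9):
    """load_from: dict neighbor -> load received for the service on this node.
    capacity: processing capacity of this node.
    Returns ("keep", None), ("replicate", neighbor), or ("migrate", neighbor)."""
    total = sum(load_from.values())
    if total == 0:
        return ("keep", None)
    # Neighbor contributing the largest share of the received load.
    top = max(load_from, key=load_from.get)
    share = load_from[top] / total
    if total > migrate_at * capacity and share > 0.5:
        # Overloaded and most traffic arrives from one direction:
        # move the service closer to the clients behind that neighbor.
        return ("migrate", top)
    if total > replicate_at * capacity:
        # Overloaded but load is spread out: place an extra replica at the
        # heaviest contributor to shorten request paths.
        return ("replicate", top)
    return ("keep", None)

# Example: a node near capacity with most load arriving via neighbor "B".
print(placement_step({"A": 5, "B": 60, "C": 10}, capacity=80))  # ('migrate', 'B')
```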
Krawtchouk polynomials (KPs) and their moments are promising tools for applications in information theory, coding theory, and signal processing, owing to the special capabilities of KPs in feature extraction and classification. The main challenge in existing KP recurrence algorithms is the numerical error that occurs when computing the coefficients for large polynomial sizes, particularly when the KP parameter (p) deviates from 0.5 toward 0 or 1. To this end, this paper proposes a new recurrence relation for computing the coefficients of KPs at high orders. In particular, the paper discusses the development of a new algorithm and presents a new mathematical model for computing the
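For context, here is a minimal sketch of the classical three-term recurrence in the degree n (with the hypergeometric normalization K_0 = 1), which is where the instability for large N and p far from 0.5 typically appears; this is the textbook relation, not the new relation proposed in the paper.

```python
import numpy as np

def krawtchouk_classical(n_max, p, N, x):
    """Evaluate K_0..K_{n_max} at the point x using the classical three-term
    recurrence (K_0 = 1). This is the standard relation, not the paper's
    proposed one; it loses accuracy for large N and p far from 0.5, which is
    exactly the problem the paper addresses."""
    K = np.zeros(n_max + 1)
    K[0] = 1.0
    if n_max >= 1:
        K[1] = 1.0 - x / (p * N)
    for k in range(1, n_max):
        # p(N-k) K_{k+1}(x) = [p(N-k) + k(1-p) - x] K_k(x) - k(1-p) K_{k-1}(x)
        K[k + 1] = ((p * (N - k) + k * (1 - p) - x) * K[k]
                    - k * (1 - p) * K[k - 1]) / (p * (N - k))
    return K
```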
In many video and image processing applications, the frames are partitioned into blocks, which are extracted and processed sequentially. In this paper, we propose a fast algorithm for calculating features of overlapping image blocks. We assume the features are projections of the block on separable 2D basis functions (usually orthogonal polynomials), where we benefit from the symmetry with respect to the spatial variables. The main idea is based on the construction of auxiliary matrices that virtually extend the original image and make it possible to avoid time-consuming computation in loops. These matrices can be pre-calculated, stored, and used repeatedly, since they are independent of the image itself. We validated experimentally th
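A minimal sketch of the underlying separable projection is shown below, assuming a generic orthonormal polynomial basis built via QR for illustration; the explicit per-block loop here is exactly the part that the paper's pre-calculated auxiliary matrices are designed to eliminate.

```python
import numpy as np

def basis_matrix(order, block_size):
    """Hypothetical 1D orthonormal polynomial basis sampled on a block
    (monomials orthonormalized via QR, for illustration only)."""
    t = np.linspace(-1, 1, block_size)
    V = np.vander(t, order, increasing=True)   # monomials 1, t, t^2, ...
    Q, _ = np.linalg.qr(V)                     # orthonormalize the columns
    return Q.T                                 # shape (order, block_size)

def block_features(image, block_size, order):
    """Projections of every overlapping block on a separable 2D basis.

    P is computed once and reused for all blocks; the nested loop below is the
    time-consuming computation that the paper avoids with auxiliary matrices."""
    P = basis_matrix(order, block_size)
    H, W = image.shape
    feats = np.empty((H - block_size + 1, W - block_size + 1, order, order))
    for i in range(H - block_size + 1):
        for j in range(W - block_size + 1):
            block = image[i:i + block_size, j:j + block_size]
            feats[i, j] = P @ block @ P.T      # separable projection
    return feats
```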
In this research, a new system identification algorithm is presented for obtaining an optimal set of mathematical models for a system with perturbed coefficients. The algorithm is then applied practically through an "On Line System Identification Circuit", based on real-time speed response data of a permanent magnet DC motor. Such a set of mathematical models represents the physical plant against all variations that may exist in its parameters, and forms a strong mathematical foundation for stability and performance analysis in control theory problems.
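For readers unfamiliar with online identification, the sketch below shows a generic recursive least-squares identifier for a first-order discrete motor model; it is a standard textbook scheme given only as an illustration of the setting, not the paper's algorithm, and the model structure and forgetting factor are assumptions.

```python
import numpy as np

def rls_identify(speed, voltage, lam=0.99):
    """Generic recursive least-squares identification of a first-order
    discrete model  w[k] = a*w[k-1] + b*u[k-1]  from measured motor speed
    `speed` and input voltage `voltage`. A standard online identifier shown
    for illustration only, not the paper's algorithm."""
    theta = np.zeros(2)              # parameter estimate [a, b]
    P = np.eye(2) * 1e3              # covariance, large initial uncertainty
    for k in range(1, len(speed)):
        phi = np.array([speed[k - 1], voltage[k - 1]])   # regressor
        err = speed[k] - phi @ theta                     # prediction error
        gain = P @ phi / (lam + phi @ P @ phi)
        theta += gain * err
        P = (P - np.outer(gain, phi) @ P) / lam          # forgetting factor lam
    return theta
```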
This paper proposes a new method for object detection in skin cancer images: the minimum spanning tree (MST) detection descriptor. This object detection descriptor builds on the structure of the minimum spanning tree constructed on the target training set of skin cancer images only. Detection of test objects relies on their distances to the closest edge of that tree. Our experiments show that the MST descriptor performs especially well for foggy images and in high-noise spaces for skin cancer images.
The proposed object detection method was implemented and tested on different skin cancer images, with very good results. The experiment showed that
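As a rough illustration of the detection rule described above, the sketch below builds an MST over generic training feature vectors and scores a test point by its distance to the closest tree edge; the feature extraction from the skin cancer images themselves is assumed to happen elsewhere, and the scoring convention (small score = close to the target class) is an assumption.

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree
from scipy.spatial.distance import cdist

def fit_mst(train):
    """Build the MST over the training feature vectors (rows of `train`)
    and return it as a list of edges (pairs of endpoint vectors)."""
    mst = minimum_spanning_tree(cdist(train, train)).tocoo()
    return [(train[i], train[j]) for i, j in zip(mst.row, mst.col)]

def point_to_segment(q, a, b):
    """Euclidean distance from point q to the segment a-b."""
    ab, aq = b - a, q - a
    t = np.clip(aq @ ab / (ab @ ab + 1e-12), 0.0, 1.0)
    return np.linalg.norm(q - (a + t * ab))

def mst_score(test_point, edges):
    """Detection score: distance to the closest edge of the training MST
    (a small score means the test object lies close to the target class)."""
    return min(point_to_segment(test_point, a, b) for a, b in edges)
```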
The current research aims to examine the complementary use of the paper book and the digital book by defining each, describing the ways of obtaining them, indicating the extent of their use and for what purpose, identifying the difficulties that researchers face in using each of them, and determining which is better for preparing research and studies, the paper book or the digital book, and why. The research relied on the descriptive approach and on a questionnaire as the data collection tool, which was distributed to professors and students of Iraqi universities. The research sample comprised (219) individuals, and the research reached a set of results, the most important of which are: preference for the
Manual probing and periodontal charting are the gold standard for periodontal diagnosis and have been used in practice for over a century. These methods are affordable and reliable, but they come with some unavoidable drawbacks, including their reliance on the operator's skills, a time-consuming and tedious procedure, a lack of sensitivity (especially in cases of early bone loss), and discomfort for the patient. The availability of a wide range of biomarkers in oral biofluids, dental biofilm, and tissues that potentially reflect periodontal health and disease accurately has encouraged their use as predictive, diagnostic, and monitoring tools. Analysing biomarkers during care-giving to the patient using chairside kits i
The rapid and enormous growth of the Internet of Things, as well as its widespread adoption, has resulted in the production of massive quantities of data that must be processed and sent to the cloud. The delay in processing this data and the time it takes to send it to the cloud have led to the emergence of fog computing, a new generation of the cloud in which fog serves as an extension of cloud services at the edge of the network, reducing latency and traffic. Distributing computational resources so as to minimize makespan and running costs is one of the main challenges of fog computing. This paper provides a new approach for improving the task scheduling problem in a Cloud-Fog environme
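The description of the proposed approach is cut off above; as a generic baseline capturing the makespan/cost trade-off it mentions, the sketch below greedily assigns tasks to fog or cloud nodes by a weighted score. All field names and the weighting scheme are illustrative assumptions, not the paper's method.

```python
# Generic greedy baseline for assigning tasks to fog/cloud nodes by a weighted
# combination of finish time and running cost (not the paper's proposed method).

def greedy_schedule(tasks, nodes, alpha=0.7):
    """tasks: list of task lengths (e.g., million instructions).
    nodes: list of dicts with 'speed' (MIPS) and 'cost' (price per time unit);
    a 'ready' field is added to track when each node becomes free."""
    for n in nodes:
        n["ready"] = 0.0
    schedule = []
    for length in sorted(tasks, reverse=True):           # longest task first
        def score(n):
            run = length / n["speed"]
            finish = n["ready"] + run
            return alpha * finish + (1 - alpha) * run * n["cost"]  # time vs. cost
        best = min(nodes, key=score)
        best["ready"] += length / best["speed"]
        schedule.append((length, best))
    makespan = max(n["ready"] for n in nodes)
    return schedule, makespan
```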
Abstract
This research aims to measure the effect of the self-competency of managers on their behavior from the viewpoint of the working individuals in the organization, since the behavior of managers is considered one of the essential variables in the organization that can affect the performance and commitment of the working individual. A questionnaire was used to gather the data, and the Iraqi Rail Road Co. was the field of the study. A random sample of (36) individuals was drawn from the subordinates of the managers forming the study population, and the (SPSS) statistical program was used in the analysis of the research data. The findings refer to the existence of a
Construction contractors usually undertake multiple construction projects simultaneously. Such a situation involves sharing different types of resources, including money, equipment, and manpower, which may become a major challenge in many cases. In this study, the financial aspects of working on multiple projects at a time are addressed and investigated. The study considers dealing with financial shortages by proposing a multi-project scheduling optimization model that maximizes profit while minimizing the total project duration. A genetic algorithm and finance-based scheduling are used to produce feasible schedules that balance the finance of activities at any time w
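To make the finance-based idea concrete, the sketch below shows one way a fitness function for such a genetic algorithm might penalize schedules that exceed a credit limit while rewarding profit and short durations; the cash-flow model, field names, and weights are all hypothetical and are not taken from the paper.

```python
# Illustrative fitness for a finance-constrained multi-project schedule (a sketch
# of the general idea, not the paper's model). A candidate schedule assigns a
# start period to each activity; the running cash balance must stay within a
# credit limit. A GA would evolve the `schedule` dictionary against this score.

def fitness(schedule, activities, credit_limit, horizon):
    """schedule: dict activity -> start period.
    activities: dict activity -> {'dur', 'cost_per_period', 'payment', 'pay_time'}.
    Returns profit minus penalties for long duration and credit-limit violations."""
    balance, worst_debt = 0.0, 0.0
    for t in range(horizon):
        for a, info in activities.items():
            s = schedule[a]
            if s <= t < s + info["dur"]:
                balance -= info["cost_per_period"]   # outgoing cost while running
            if t == s + info["pay_time"]:
                balance += info["payment"]           # client payment milestone
        worst_debt = min(worst_debt, balance)        # most negative cash position
    duration = max(schedule[a] + activities[a]["dur"] for a in activities)
    overdraft = max(0.0, -worst_debt - credit_limit)
    return balance - 0.1 * duration - 10.0 * overdraft
```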