The rise of edge-cloud continuum computing is a result of the growing significance of edge computing, which has become a complementary or substitute option for traditional cloud services. The convergence of networking and computing presents a notable challenge due to their distinct historical development. Task scheduling is a major challenge in the edge-cloud continuum. The selection of the execution location of tasks is crucial for meeting the quality-of-service (QoS) requirements of applications. An efficient scheduling strategy for distributing workloads among virtual machines in the edge-cloud continuum data center is mandatory to ensure the fulfilment of QoS requirements for both the customer and the service provider. Existing research has applied metaheuristic algorithms to the task scheduling problem; however, most of the existing metaheuristics suffer from falling into local minima because they fail to avoid infeasible regions of the solution search space. There is therefore a dire need for an efficient metaheuristic task scheduling algorithm. This study proposes an FPA-ISFLA task scheduling model that hybridizes the flower pollination algorithm with an improved shuffled frog leaping algorithm. The simulation results indicate that the FPA-ISFLA algorithm is superior to the PSO algorithm in terms of makespan, resource utilization, and execution cost reduction, especially as the number of tasks increases.
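To make the scheduling objective concrete, the following is a minimal sketch of how a candidate task-to-VM assignment could be scored on makespan, cost, and utilization, the criteria the abstract reports. All names, rates, and numbers are illustrative assumptions; the paper's actual FPA-ISFLA search operators are not reproduced here.

```python
# Minimal sketch: scoring a candidate task-to-VM assignment.
# Task lengths (million instructions), VM speeds (MIPS), and prices
# are hypothetical; FPA-ISFLA would search over such assignments.

def evaluate(assignment, task_lengths, vm_mips, cost_per_sec):
    """assignment[i] = index of the VM that runs task i."""
    finish = [0.0] * len(vm_mips)            # accumulated busy time per VM
    cost = 0.0
    for task, vm in enumerate(assignment):
        runtime = task_lengths[task] / vm_mips[vm]   # seconds on that VM
        finish[vm] += runtime
        cost += runtime * cost_per_sec[vm]
    makespan = max(finish)                    # schedule length
    utilization = sum(finish) / (makespan * len(vm_mips))
    return makespan, cost, utilization

# Example: 5 tasks on 2 VMs.
print(evaluate([0, 1, 0, 1, 1],
               [400, 800, 600, 200, 500],    # task lengths
               [250.0, 500.0],               # VM speeds
               [0.02, 0.05]))                # $ per second
```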
One of the most important classes of compounds having active hydrogen comprises those possessing a thiol group; biphenyl-4,4′-dithiol is a good example, widely utilized in the preparation of Mannich bases. A variety of new acetylenic Mannich bases have been synthesized, and all proposed structures were supported by FTIR, 1H-NMR, 13C-NMR, elemental analysis, and a microbial study.
This paper proposes a new method for network self-fault management (NSFM) based on two technologies: intelligent agents to automate fault management tasks, and Windows Management Instrumentation (WMI) to identify faults faster when resources are independent (different types of devices). The proposed network self-fault management reduces the network traffic load by reducing the requests and responses exchanged between server and client, which achieves less downtime for each node when a fault occurs on the client. The performance of the proposed system is measured by three metrics: efficiency, availability, and reliability. A high average efficiency is obtained, depending on the faults that occurred in the system, which reaches …
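As an illustration of the agent-side idea, the sketch below runs a fault check locally through WMI so that only a summary, rather than per-check request/response traffic, goes to the management server. It assumes the third-party Python `wmi` package on a Windows node; the thresholds and the choice of checks are hypothetical, not the paper's.

```python
# Minimal sketch of a client-side fault check over WMI, assuming the
# third-party `wmi` package (pip install wmi) on a Windows node.
import wmi

def local_fault_report(free_threshold=0.10):
    conn = wmi.WMI()                                   # local WMI service
    faults = []
    for disk in conn.Win32_LogicalDisk(DriveType=3):   # fixed disks only
        if disk.Size and int(disk.FreeSpace) / int(disk.Size) < free_threshold:
            faults.append(f"low disk space on {disk.DeviceID}")
    for svc in conn.Win32_Service(StartMode="Auto", State="Stopped"):
        faults.append(f"auto-start service down: {svc.Name}")
    return faults    # only this summary is sent to the server
```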
The primary objective of this paper is to improve a biometric authentication and classification model using the ear as a distinct part of the face, since it is unchanged over time and unaffected by facial expressions. The proposed model is a new scenario for enhancing ear recognition accuracy by modifying the AdaBoost algorithm to optimize adaptive learning. To overcome the limitations of image illumination, occlusion, and image registration problems, the scale-invariant feature transform (SIFT) technique was used to extract features. Various consecutive phases were used to improve classification accuracy: image acquisition, preprocessing, filtering, smoothing, and feature extraction. To assess the proposed model …
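A minimal sketch of the SIFT-plus-AdaBoost pairing follows, using off-the-shelf OpenCV and scikit-learn components. Averaging the variable number of SIFT descriptors into one fixed-length vector is a simplifying assumption for illustration, and the paper's AdaBoost modification is not shown.

```python
# Minimal sketch: SIFT features + AdaBoost for ear classification.
# Assumes opencv-python >= 4.4 (SIFT in the main module) and scikit-learn.
import cv2
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

sift = cv2.SIFT_create()

def ear_feature(gray_img):
    """Collapse the variable set of 128-D SIFT descriptors into one
    fixed-length vector by averaging (a simplification)."""
    _, desc = sift.detectAndCompute(gray_img, None)
    if desc is None:                       # no keypoints found
        return np.zeros(128)
    return desc.mean(axis=0)

def train(images, labels):
    """images: list of grayscale ear images; labels: subject IDs."""
    X = np.stack([ear_feature(img) for img in images])
    return AdaBoostClassifier(n_estimators=100).fit(X, labels)
```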
Construction contractors usually undertake multiple construction projects simultaneously. Such a situation involves sharing different types of resources, including money, equipment, and manpower, which may become a major challenge in many cases. In this study, the financial aspects of working on multiple projects at a time are addressed and investigated. The study considers dealing with financial shortages by proposing a multi-project scheduling optimization model that maximizes profit while minimizing the total project duration. A genetic algorithm and finance-based scheduling are used to produce feasible schedules that balance the financing of activities at any time …
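To suggest how finance-based scheduling can enter a genetic algorithm, here is a sketch of a fitness function over a chromosome of activity start days: it rewards profit, penalizes duration, and rejects schedules whose cash balance exceeds a credit limit. The cash-flow model (costs spread uniformly, payment on completion) and the weight are assumptions for illustration, not the paper's model.

```python
# Minimal sketch of a GA fitness for finance-based multi-project
# scheduling. starts[i] is the start day of activity i; the cash-flow
# model and weight w are illustrative assumptions.

def fitness(starts, durations, costs, payments, credit_limit, w=0.001):
    horizon = max(s + d for s, d in zip(starts, durations)) + 1
    cash = [0.0] * horizon
    for s, d, c, p in zip(starts, durations, costs, payments):
        for day in range(s, s + d):
            cash[day] -= c / d            # spread cost over the activity
        cash[s + d - 1] += p              # payment on completion (simplified)
    balance, worst = 0.0, 0.0
    for day_flow in cash:
        balance += day_flow
        worst = min(worst, balance)       # track the deepest cash deficit
    if worst < -credit_limit:             # financing limit exceeded
        return float("-inf")              # infeasible schedule
    return balance - w * horizon          # maximize profit, minimize duration
```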
This work examines positive and negative association rules in the Apriori algorithm by using cosine correlation analysis. The default and the modified association rule mining algorithms are run against the mushroom database to compare their results. The experimental results showed that the modified algorithm can generate negative association rules, and that the addition of cosine correlation analysis returns fewer association rules than the default algorithm. From the top ten association rules, it can be seen that the default and the modified Apriori algorithms produce different rules. The difference in the obtained …
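The cosine measure the abstract refers to is standard: cosine(A,B) = supp(A∪B) / sqrt(supp(A)·supp(B)), with 0.5 as the usual midpoint below which the itemsets are considered negatively correlated. A minimal sketch of applying it to split rules into positive and negative:

```python
# Minimal sketch of the cosine interestingness measure used to separate
# positive from negative association rules. The 0.5 threshold is the
# conventional midpoint; supports below are illustrative.
from math import sqrt

def cosine(supp_ab, supp_a, supp_b):
    """cosine(A,B) = supp(A ∪ B) / sqrt(supp(A) * supp(B))"""
    return supp_ab / sqrt(supp_a * supp_b)

def classify_rule(supp_ab, supp_a, supp_b, threshold=0.5):
    c = cosine(supp_ab, supp_a, supp_b)
    return ("positive" if c >= threshold else "negative"), c

# Example: A and B rarely co-occur relative to their own supports.
print(classify_rule(supp_ab=0.02, supp_a=0.40, supp_b=0.35))
# -> ('negative', 0.0534...)
```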
Computer systems and networks are used in almost every aspect of our daily life; as a result, security threats to computers and networks have also increased significantly. Traditionally, password-based user authentication is widely used to authenticate legitimate users, but this method has many loopholes, such as password sharing, shoulder surfing, brute-force attacks, dictionary attacks, guessing, phishing, and many more. The aim of this paper is to enhance the password authentication method by presenting keystroke dynamics with a back-propagation neural network as a transparent layer of user authentication. Keystroke dynamics is one of the well-known and inexpensive behavioral biometric technologies, which identifies …
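A minimal sketch of the idea follows: dwell and flight times extracted from key timestamps feed a back-propagation network, here scikit-learn's MLPClassifier. The timing data is synthetic and the network size is an assumption; a real system would record timestamps while users type a fixed password.

```python
# Minimal sketch: keystroke-dynamics features + a back-propagation
# network (scikit-learn's MLPClassifier). All timing data is synthetic.
import numpy as np
from sklearn.neural_network import MLPClassifier

def timing_features(press, release):
    """Dwell times (how long each key is held) plus flight times
    (gap between releasing one key and pressing the next)."""
    dwell = [r - p for p, r in zip(press, release)]
    flight = [press[i + 1] - release[i] for i in range(len(press) - 1)]
    return dwell + flight

print(timing_features([0.00, 0.21, 0.43], [0.09, 0.31, 0.51]))
# -> [0.09, 0.10, 0.08, 0.12, 0.12]

# Two synthetic users with distinct typing rhythms, 20 samples each.
rng = np.random.default_rng(0)
rhythms = {0: np.array([0.09, 0.12, 0.08, 0.15, 0.11, 0.07, 0.13]),
           1: np.array([0.15, 0.07, 0.13, 0.09, 0.17, 0.12, 0.06])}
X = np.array([rhythms[u] + rng.normal(0, 0.01, 7) for u in [0, 1] * 20])
y = np.array([0, 1] * 20)

clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=3000,
                    random_state=0).fit(X, y)
print(clf.predict([rhythms[1]]))           # -> [1]
```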
NS-2 is a discrete, per-packet, event-driven network simulator that is widely used in research. NS-2 comes with NAM (Network Animator), which produces a visual representation of the simulation, and it supports several simulation protocols. The network can be tested end-to-end; the tests cover data transmission, delay, jitter, packet-loss ratio, and throughput. The performance analysis simulates a virtual network, tests transport-layer protocols concurrently with variable data, and analyzes the simulation results produced by NS-2.
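Such metrics are normally computed by post-processing the NS-2 trace file. The sketch below assumes the classic wired trace format (event, time, from-node, to-node, type, size, ...); field positions differ for wireless traces, and the node names are placeholders, so treat this as illustrative only.

```python
# Minimal sketch: throughput and packet-loss ratio from an NS-2 trace,
# assuming the classic wired format. Node IDs and paths are hypothetical.

def analyze_trace(path, src_node, sink_node):
    sent = dropped = 0
    received_bytes = 0.0
    first_t = last_t = None
    for line in open(path):
        f = line.split()
        if len(f) < 6:
            continue
        event, t, src, dst, size = f[0], float(f[1]), f[2], f[3], int(f[5])
        if event == "+" and src == src_node:   # enqueued at the source
            sent += 1
        elif event == "d":                     # dropped anywhere
            dropped += 1
        elif event == "r" and dst == sink_node:
            received_bytes += size
            first_t = t if first_t is None else first_t
            last_t = t
    loss_ratio = dropped / sent if sent else 0.0
    if first_t is None or last_t == first_t:
        return 0.0, loss_ratio
    return 8 * received_bytes / (last_t - first_t), loss_ratio  # bits/sec
```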
In this work we present a technique to extract heart contours from noisy echocardiographic images. Our technique improves the image before applying contour detection, in order to reduce heavy noise and obtain better image quality. To do so, we combine several pre-processing techniques (filtering, morphological operations, and contrast adjustment) to avoid unclear edges and enhance the low contrast of echocardiographic images. After applying these techniques, we obtain legible detection of the heart boundaries and valve movement with traditional edge-detection methods.
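The pipeline maps naturally onto standard OpenCV operations; a minimal sketch follows, with a median filter for noise, a morphological opening, CLAHE for contrast, and Canny as the traditional edge detector. The parameter values are illustrative assumptions, not the paper's tuned settings.

```python
# Minimal sketch of the described pre-processing chain using OpenCV:
# filtering -> morphology -> contrast adjustment -> edge detection.
import cv2

def heart_contours(path):
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    denoised = cv2.medianBlur(gray, 5)                 # suppress speckle noise
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    opened = cv2.morphologyEx(denoised, cv2.MORPH_OPEN, kernel)
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    enhanced = clahe.apply(opened)                     # boost low contrast
    edges = cv2.Canny(enhanced, 50, 150)               # traditional detector
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return contours
```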