The rapid growth and widespread adoption of the Internet of Things has resulted in the production of massive quantities of data that must be processed and sent to the cloud. The delay in processing these data and the time required to send them to the cloud have led to the emergence of fog computing, a new generation of the cloud in which the fog serves as an extension of cloud services at the edge of the network, reducing latency and traffic. Distributing computational resources so as to minimize makespan and running costs is one of the key challenges of fog computing. This paper provides a new approach for improving the task scheduling problem in a cloud-fog environment in terms of execution time (makespan) and operating costs for Bag-of-Tasks applications. An evolutionary task scheduling algorithm is proposed. A custom representation of the problem and a uniform crossover operator are built for the proposed algorithm. Furthermore, the individual initialization and perturbation operators (crossover and mutation) were designed to resolve the infeasibility of any solution found or reached by the proposed evolutionary algorithm. The proposed ETS (Evolutionary Task Scheduling) algorithm was evaluated on 11 datasets of varying sizes in terms of the number of tasks. According to the experimental results, ETS outperformed the Bee Life Algorithm (BLA), Modified Particle Swarm Optimization (MPSO), and Round Robin (RR) algorithms in terms of makespan and operating costs.
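As a rough illustration of the kind of encoding such an evolutionary scheduler uses (not the authors' implementation), the sketch below assumes each individual is a vector mapping tasks to fog/cloud nodes, with uniform crossover and random mutation; the task lengths, node speeds, costs, and the weighted makespan/cost fitness are all invented for the example.

    import random

    # Hypothetical problem data: task lengths (MI) and nodes as (speed MIPS, cost per second).
    TASKS = [400, 250, 900, 120, 600, 700]
    NODES = [(500, 0.02), (1000, 0.10), (750, 0.05)]   # fog and cloud nodes, made-up values

    def evaluate(assignment):
        """Makespan = latest node finish time; cost = sum of execution costs."""
        finish = [0.0] * len(NODES)
        cost = 0.0
        for task, node in zip(TASKS, assignment):
            speed, rate = NODES[node]
            t = task / speed
            finish[node] += t
            cost += t * rate
        return max(finish), cost

    def uniform_crossover(a, b):
        return [x if random.random() < 0.5 else y for x, y in zip(a, b)]

    def mutate(ind, p=0.1):
        return [random.randrange(len(NODES)) if random.random() < p else g for g in ind]

    def ets_sketch(pop_size=30, generations=100, w=0.5):
        pop = [[random.randrange(len(NODES)) for _ in TASKS] for _ in range(pop_size)]
        fitness = lambda ind: w * evaluate(ind)[0] + (1 - w) * evaluate(ind)[1]
        for _ in range(generations):
            pop.sort(key=fitness)
            parents = pop[:pop_size // 2]
            children = [mutate(uniform_crossover(*random.sample(parents, 2)))
                        for _ in range(pop_size - len(parents))]
            pop = parents + children
        return min(pop, key=fitness)

    best = ets_sketch()
    print(best, evaluate(best))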
Urban gentrification is an inclusive global phenomenon that restructures cities at all levels. This research proposes a specific study of the concept of urban gentrification in cities, showcasing its characteristics and results and how to deal with the changes that occur in cities through improvements carried out as part of urban renewal projects. The general axis of the research is then narrowed by choosing urban centers as the most important areas affected by the gentrification process, due to their direct connection with individuals and social change. To address the specific axis of the research, theses and studies that discuss the topic from different research directions will be showcased, and emerged ...
The goal of this paper is to design a robust controller for controlling a pendulum system. The control of nonlinear systems is a common problem facing researchers in control system design. The Sliding Mode Controller (SMC) is the best solution for controlling a nonlinear system. The classical SMC consists of two phases: the reaching phase and the sliding phase. The SMC suffers from the chattering phenomenon, which is considered a severe problem and an undesirable property; it is a zigzag motion along the switching surface. In this paper, the chattering is reduced by using a saturation function instead of the sign function. Although the SMC is a good method for controlling a nonlinear system ...
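A minimal sketch of the chattering-reduction idea described above: the discontinuous sign(s) switching term of the classical SMC law is replaced by a saturation term with a boundary layer of width phi, so the control is linear near the surface instead of switching at high frequency. The gains and the pendulum sliding surface below are illustrative, not taken from the paper.

    import numpy as np

    def sign_term(s, k):
        """Classical switching term: causes chattering near s = 0."""
        return -k * np.sign(s)

    def sat_term(s, k, phi=0.05):
        """Boundary-layer (saturation) term: linear inside |s| < phi, so the
        control no longer switches discontinuously on the sliding surface."""
        return -k * np.clip(s / phi, -1.0, 1.0)

    def sliding_surface(e, e_dot, lam=2.0):
        """Illustrative sliding surface for a pendulum: s = e_dot + lam * e."""
        return e_dot + lam * e

    e, e_dot = 0.01, -0.02          # small tracking errors near the surface
    s = sliding_surface(e, e_dot)
    print(sign_term(s, k=5.0), sat_term(s, k=5.0))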
In this paper, an Integral Backstepping Controller (IBC) is designed and optimized for full control of the rotational and translational dynamics of an unmanned Quadcopter (QC). Before designing the controller, a mathematical model of the QC is developed in a form appropriate for the IBC design. Due to the underactuated nature of the QC, it is possible to control the QC's Cartesian positions (X, Y, and Z) and the yaw angle by commanding their desired values; the pitch and roll angles, in turn, are generated by the position controllers. The Backstepping Controller (BC) is a practical nonlinear control scheme based on the Lyapunov design approach, which can therefore guarantee the convergence of the position tracking ...
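To illustrate the underactuation point (the roll and pitch references being produced by the X/Y position loops), here is a small-angle, near-hover relation commonly used in cascaded quadcopter control; it is only a sketch under those assumptions, not the paper's IBC derivation, and the numeric inputs are invented.

    import math

    G = 9.81  # m/s^2

    def desired_roll_pitch(ux, uy, yaw):
        """Map desired horizontal accelerations (ux, uy), produced by the X/Y
        position controllers, into roll/pitch references under small-angle,
        near-hover assumptions (illustrative only)."""
        theta_d = (ux * math.cos(yaw) + uy * math.sin(yaw)) / G   # pitch reference
        phi_d   = (ux * math.sin(yaw) - uy * math.cos(yaw)) / G   # roll reference
        return phi_d, theta_d

    print(desired_roll_pitch(ux=0.5, uy=-0.2, yaw=math.radians(30)))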
Malware represents one of the dangerous threats to computer security, and dynamic analysis has difficulties in detecting unknown malware. This paper developed an integrated multi-layer detection approach to provide more accuracy in detecting malware. A user interface integrated with VirusTotal was designed as the first layer, acting as a warning system for malware infection; a malware database with malware samples serves as the second layer, Cuckoo as the third layer, BullGuard as the fourth layer, and IDA Pro as the fifth layer. The results showed that using the five layers was better than using a single detector without merging. For example, the efficiency of the proposed approach is 100% compared with 18% and 63% for VirusTotal and Bel ...
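A schematic (and entirely hypothetical) way to express the layered idea: each layer is a detector that returns a verdict, and a sample is flagged as soon as any layer reports it malicious. The layer functions below are placeholder stubs; they do not call the real VirusTotal, Cuckoo, BullGuard, or IDA Pro interfaces, and the escalation policy is an assumption for illustration only.

    from typing import Callable, List, Tuple

    # Placeholder detectors standing in for the five layers described above.
    # Each takes a file path and returns True if it judges the sample malicious.
    def virustotal_layer(path: str) -> bool:   return False   # hypothetical stub
    def signature_db_layer(path: str) -> bool: return False   # hypothetical stub
    def cuckoo_layer(path: str) -> bool:       return True    # hypothetical stub
    def bullguard_layer(path: str) -> bool:    return False   # hypothetical stub
    def ida_pro_layer(path: str) -> bool:      return False   # hypothetical stub

    LAYERS: List[Tuple[str, Callable[[str], bool]]] = [
        ("VirusTotal UI",  virustotal_layer),
        ("Malware DB",     signature_db_layer),
        ("Cuckoo sandbox", cuckoo_layer),
        ("BullGuard",      bullguard_layer),
        ("IDA Pro",        ida_pro_layer),
    ]

    def scan(path: str) -> str:
        """Escalate the sample through the layers; stop at the first detection."""
        for name, detector in LAYERS:
            if detector(path):
                return f"malicious (detected by {name})"
        return "no layer flagged the sample"

    print(scan("sample.exe"))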
Project management still depends on the manual exchange of information based on paper documents. Design drawings are drafted with computer-aided design (CAD) software, but the data needed by project management software cannot be extracted directly from CAD and must be entered manually by the user. Calculating and collecting information from the drawings and entering it into the project management software takes effort and time, with the possibility of errors in transferring and entering the information. This research presents an integrated computer system for building projects in which quantities are extracted and imported through the interpretation of AutoCAD drawings together with an MS Access database of unit costs and productivities for the pricing and ...
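A very rough sketch of the extract-and-price idea, using the open ezdxf library as a stand-in for reading an AutoCAD drawing and an in-memory dictionary in place of the MS Access unit-cost table; the file name, layer names, rates, and productivities are all invented, and this is not the paper's system.

    import ezdxf  # third-party DXF reader, used here as a stand-in for direct AutoCAD access

    # Hypothetical unit-cost and productivity tables (would come from the MS Access database).
    UNIT_COSTS = {"WALLS": 35.0, "BEAMS": 50.0}       # cost per metre of line work
    PRODUCTIVITY = {"WALLS": 12.0, "BEAMS": 8.0}      # metres per crew-day

    doc = ezdxf.readfile("building_plan.dxf")         # invented file name
    msp = doc.modelspace()

    # Sum the length of LINE entities per layer as a stand-in for quantity take-off.
    totals = {}
    for line in msp.query("LINE"):
        layer = line.dxf.layer
        length = (line.dxf.end - line.dxf.start).magnitude
        totals[layer] = totals.get(layer, 0.0) + length

    for layer, qty in totals.items():
        if layer in UNIT_COSTS:
            cost = qty * UNIT_COSTS[layer]
            duration = qty / PRODUCTIVITY[layer]
            print(f"{layer}: {qty:.1f} m, cost {cost:.2f}, {duration:.1f} crew-days")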
This paper generates a geological model of a giant Middle East oil reservoir; the model was constructed based on field data from 161 wells. The main aim of the paper was to assess the value of the reservoir and investigate the feasibility of working on the reservoir modeling prior to the final decision on investment for further development of this oilfield. Well logs, deviation surveys, 2D/3D interpreted seismic structural maps, facies, and core test data were utilized to construct the geological model based on comprehensive interpretation and correlation processes using the PETREL platform. The geological model mainly aims to estimate the stock-tank oil initially in place of the reservoir. In addition, three scenarios were applied ...
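For reference, the standard volumetric expression for stock-tank oil initially in place (STOIIP) that such a static model is typically used to evaluate is shown below; the input values are purely illustrative and are not results from this study.

    # Volumetric STOIIP: N = GRV * NTG * porosity * (1 - Sw) / Bo
    # Illustrative inputs only; not values from the study.
    grv      = 2.5e9    # gross rock volume, m^3
    ntg      = 0.6      # net-to-gross ratio
    porosity = 0.18
    sw       = 0.25     # water saturation
    bo       = 1.3      # oil formation volume factor, rm^3/sm^3

    stoiip_m3  = grv * ntg * porosity * (1.0 - sw) / bo
    stoiip_bbl = stoiip_m3 * 6.2898          # convert stock-tank m^3 to barrels
    print(f"STOIIP ~ {stoiip_bbl / 1e9:.2f} billion STB")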
The study dealt with measuring the impact of the availability of each of the content elements (interaction, electronic services, and information) on users' evaluation of a government website's effectiveness, in terms of the site's functions and its ability to present the organization's tasks to customer groups. Authority is concerned with measuring customers' confidence in the content of the site and in the organization as a whole. Validity relates to measuring how effectively the site's content is employed to achieve the goal of its creation and to communicate with customers. Availability measures the ease of use of the site. Relevance, which means ...
Convolutional neural networks (CNNs) are among the most utilized neural networks in various applications, including deep learning. In recent years, the continuing extension of CNNs into increasingly complicated domains has made their training process more difficult. Thus, researchers have adopted optimized hybrid algorithms to address this problem. In this work, a novel chaotic black hole algorithm-based approach was created for training CNNs to optimize their performance by avoiding entrapment in local minima. The logistic chaotic map was used to initialize the population instead of the uniform distribution. The proposed training algorithm was developed based on a specific benchmark problem for optical character recognition ...
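A small sketch of the population-initialization step the abstract mentions: instead of drawing candidate weight vectors from a uniform distribution, successive iterates of the logistic map x(k+1) = r * x(k) * (1 - x(k)) with r = 4 (the fully chaotic regime) are scaled into the search bounds. The population size, dimensionality, bounds, and seed below are illustrative, not taken from the paper.

    import numpy as np

    def logistic_chaotic_population(pop_size, dim, low=-1.0, high=1.0, r=4.0, seed=0.7):
        """Initialize a population with logistic-map iterates instead of uniform noise."""
        x = seed
        samples = np.empty(pop_size * dim)
        for i in range(samples.size):
            x = r * x * (1.0 - x)            # logistic map, chaotic for r = 4
            samples[i] = x                   # iterates lie in (0, 1)
        pop = samples.reshape(pop_size, dim)
        return low + (high - low) * pop      # scale into the search bounds

    population = logistic_chaotic_population(pop_size=20, dim=50)
    print(population.shape, population.min(), population.max())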