The rapid growth and widespread adoption of the Internet of Things has produced massive quantities of data that must be processed and sent to the cloud. The delay in processing these data and the time required to transmit them to the cloud led to the emergence of fog computing, a new generation of cloud computing in which the fog acts as an extension of cloud services at the edge of the network, reducing latency and traffic. Distributing computational resources so as to minimize makespan and running costs remains one of the challenges of fog computing. This paper presents a new approach to the task scheduling problem in a Cloud-Fog environment that improves execution time (makespan) and operating costs for Bag-of-Tasks applications. An evolutionary task scheduling algorithm is proposed, built on a single custom representation of the problem and a uniform crossover. In addition, the individual initialization and perturbation operators (crossover and mutation) were designed so that every solution generated or reached by the algorithm remains feasible. The proposed ETS (Evolutionary Task Scheduling) algorithm was evaluated on 11 datasets of varying size in terms of the number of tasks. Experimental results show that ETS outperforms the Bee Life Algorithm (BLA), Modified Particle Swarm Optimization (MPSO), and Round Robin (RR) algorithms in terms of makespan and operating costs.
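The abstract does not give implementation details, so the following is a minimal, hypothetical sketch of the general approach rather than the authors' ETS: a task-to-node assignment evolved with uniform crossover and a mutation that keeps every individual feasible by construction, scored on makespan only (the task lengths and node speeds are illustrative).

```python
import random

# Illustrative Bag-of-Tasks workload: task lengths (MI) and node speeds (MIPS).
TASKS = [random.randint(100, 1000) for _ in range(50)]
NODES = [500, 750, 1000, 1500]          # fog/cloud processing capacities

def makespan(assignment):
    """Finish time of the busiest node under a task-to-node assignment."""
    load = [0.0] * len(NODES)
    for task, node in zip(TASKS, assignment):
        load[node] += task / NODES[node]
    return max(load)

def uniform_crossover(p1, p2):
    """Each gene (a task's node index) is taken from either parent with equal probability."""
    return [random.choice(pair) for pair in zip(p1, p2)]

def mutate(ind, rate=0.05):
    """Reassign a few tasks to random nodes; every result is still a valid assignment."""
    return [random.randrange(len(NODES)) if random.random() < rate else g for g in ind]

def evolve(pop_size=40, generations=200):
    pop = [[random.randrange(len(NODES)) for _ in TASKS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=makespan)                     # minimize makespan
        elite = pop[: pop_size // 2]
        children = [mutate(uniform_crossover(*random.sample(elite, 2)))
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    return min(pop, key=makespan)

best = evolve()
print("best makespan:", round(makespan(best), 2))
```

A cost term (e.g. a weighted sum of makespan and node usage price) could be added to the fitness in the same way, which is presumably how operating cost enters the authors' formulation.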
The performance of the Indian wheat varieties (PBW34, PBW343 and WH542) was compared with the local varieties (Tamose-2 and Abu Ghareeb). The experiment was conducted over two seasons, 2000-2001 and 2001-2002, at the AL-Twaitha experimental station in central Iraq. The results showed that Tamose-2 exceeded the other varieties in plant height and heading date in both seasons, while WH542 gave the lowest plant height. The PBW34 variety showed a significant increase in 1000-grain weight, followed by PBW343, in the second season. Moreover, PBW34 and PBW343 gave the highest average 50-spike weight in the second season.
This study examined cholera and diarrhea infections in two different areas of Baghdad governorate, Obiady city and Palestine Street, from cultural, social, economic, and environmental perspectives, during the period 3/10-3/12/2007. The study included groups of patients who attended the Kindy Hospital laboratory. A sample of 300 persons of different ages was used, 150 from each of the two study areas. The study showed a large difference between the two areas in the percentage of infection by parasites, helminths, viruses, bacteria, and Vibrio cholerae according to age group, with the highest infection rate in the 1-10 year age group in Obiady city (57.5%).
The lossy-FDNR based active filter has an important property among many design realizations: a significant reduction in component count, particularly in the number of op-amps, which consume power. However, the problem with this type is the large component spread, which affects the filter performance.
In this paper a Genetic Algorithm is applied to minimize the component spread (capacitance and resistance spread). The minimization of these spreads allows the filter…
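The abstract is truncated before the GA details, so the sketch below only illustrates one plausible way to encode the objective: a real-coded chromosome of resistor values with capacitors derived from fixed per-section time constants, scored by the combined resistance and capacitance spread. The time constants and value bounds are illustrative assumptions, not the paper's circuit; this fitness would drive an otherwise standard GA loop.

```python
import random

# Hypothetical constraint: each filter section must realize a fixed time constant,
# so choosing R fixes C via tau = R * C. Values below are illustrative only.
TARGET_TAU = [1.0e-3, 2.2e-3, 4.7e-3]        # seconds

def decode(chromosome):
    resistors = chromosome
    capacitors = [tau / r for tau, r in zip(TARGET_TAU, resistors)]
    return resistors, capacitors

def spread(values):
    """Component spread = ratio of the largest to the smallest value."""
    return max(values) / min(values)

def fitness(chromosome):
    """Objective to minimize: combined resistance and capacitance spread."""
    r, c = decode(chromosome)
    return spread(r) + spread(c)

# A random individual; in the GA this fitness would guide selection,
# crossover and mutation exactly as in any real-coded genetic algorithm.
individual = [random.uniform(1e3, 1e6) for _ in TARGET_TAU]
print(fitness(individual))
```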
The performance of most heuristic search methods depends on parameter choices. These parameter settings govern how new candidate solutions are generated and applied by the algorithm, and they play a key role in determining both the quality of the solution obtained and the efficiency of the search. Techniques for fine-tuning them are still an ongoing research area. The Differential Evolution (DE) algorithm is a very powerful optimization method that has become popular in many fields. Based on the prolonged research work on DE, it is now arguably one of the most outstanding stochastic optimization algorithms for real-parameter optimization. One reason for its popularity is its widely appreciated property of having only a small number of parameters.
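To make that small parameter set concrete, here is a minimal sketch of the classic DE/rand/1/bin variant over a generic real-valued objective; beyond the population size, only the scale factor F and crossover rate CR need to be chosen (the sphere function and bounds are just an example).

```python
import random

def de(objective, bounds, pop_size=20, F=0.8, CR=0.9, generations=200):
    """Classic DE/rand/1/bin: only three control parameters (pop_size, F, CR)."""
    dim = len(bounds)
    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    cost = [objective(x) for x in pop]
    for _ in range(generations):
        for i in range(pop_size):
            a, b, c = random.sample([x for j, x in enumerate(pop) if j != i], 3)
            j_rand = random.randrange(dim)
            trial = [a[j] + F * (b[j] - c[j])                 # differential mutation
                     if random.random() < CR or j == j_rand   # binomial crossover
                     else pop[i][j]
                     for j in range(dim)]
            trial = [min(max(t, lo), hi) for t, (lo, hi) in zip(trial, bounds)]
            f = objective(trial)
            if f < cost[i]:                                   # greedy selection
                pop[i], cost[i] = trial, f
    best = min(range(pop_size), key=cost.__getitem__)
    return pop[best], cost[best]

# Example: minimize the 5-dimensional sphere function.
x, fx = de(lambda v: sum(t * t for t in v), [(-5.0, 5.0)] * 5)
print(fx)
```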
Association rule mining (ARM) is a fundamental and widely used data mining technique for extracting useful information from data. Traditional ARM algorithms degrade computational efficiency by mining too many association rules that are not appropriate for a given user. Recent research in ARM investigates metaheuristic algorithms that search for only a subset of high-quality rules. In this paper, a modified discrete cuckoo search algorithm for association rule mining (DCS-ARM) is proposed for this purpose. The effectiveness of the algorithm is tested against a set of well-known transactional databases. Results indicate that the proposed algorithm outperforms existing metaheuristic methods.
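What "high-quality rules" means in practice is usually a support/confidence score computed per candidate rule; the toy transactions, the antecedent/consequent split, and the weighted fitness below are illustrative assumptions, not the paper's exact DCS-ARM fitness, but they show the kind of objective a metaheuristic search would optimize.

```python
# Toy transactional database: each transaction is a set of items.
TRANSACTIONS = [
    {"bread", "milk"},
    {"bread", "butter", "milk"},
    {"butter", "milk"},
    {"bread", "butter"},
]

def support(itemset):
    """Fraction of transactions containing every item in the itemset."""
    return sum(itemset <= t for t in TRANSACTIONS) / len(TRANSACTIONS)

def rule_fitness(antecedent, consequent, w_sup=0.5, w_conf=0.5):
    """Weighted support/confidence score of the rule antecedent -> consequent."""
    sup_a = support(antecedent)
    sup_rule = support(antecedent | consequent)
    confidence = sup_rule / sup_a if sup_a else 0.0
    return w_sup * sup_rule + w_conf * confidence

# Score the rule {bread} -> {milk}: support 0.5, confidence 2/3.
print(rule_fitness({"bread"}, {"milk"}))
```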
Determining whether a face is wearing a mask from visual data such as video and still images has become a fascinating research topic in recent years due to the spread of the Corona pandemic, which has changed the features of the entire world and forced people to wear masks as a way of containing the pandemic that has swept the entire world. Intelligent systems based on artificial intelligence and computers play a very important role in safety during the pandemic, and face recognition and identifying whether or not people are wearing a mask have been among the most prominent applications of deep learning in this area. Using deep learning techniques and the YOLO ("You Only Look Once") …
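The abstract is cut off before the specific YOLO version and training setup, so the inference sketch below is purely hypothetical: the ultralytics package, the fine-tuned weights file name, and the mask/no-mask class labels are assumptions rather than the paper's configuration.

```python
# Hypothetical mask-detection inference with a YOLO model fine-tuned on two
# classes (e.g. "mask" / "no_mask"); weights file and image path are assumed.
from ultralytics import YOLO

model = YOLO("mask_detector.pt")      # assumed fine-tuned weights
results = model("people.jpg")         # run detection on a still image

for box in results[0].boxes:
    label = results[0].names[int(box.cls)]
    conf = float(box.conf)
    print(f"{label}: {conf:.2f} at {box.xyxy.tolist()}")
```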
For businesses that provide delivery services, the punctuality of the delivery process is very important. In addition to increasing customer trust, efficient route management and selection are required to reduce vehicle fuel costs and expedite delivery. Some small and medium businesses still use conventional methods to manage delivery routes: decisions about delivery schedules and routes are made without any specific method to expedite the delivery settlement process. This process is inefficient, takes a long time, increases costs, and is prone to errors. Therefore, Dijkstra's algorithm has been used to improve the delivery management process. A delivery management system was developed to help managers and drivers…
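Dijkstra's algorithm itself is standard; a minimal sketch over a hypothetical road network (the location names and travel times are illustrative, not data from the study) shows how the shortest route cost from a depot to each delivery point can be computed.

```python
import heapq

def dijkstra(graph, source):
    """Shortest travel time from the source to every reachable node.
    graph: {node: [(neighbor, weight), ...]} with non-negative weights."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                          # stale queue entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# Hypothetical road network: nodes are locations, weights are travel minutes.
roads = {
    "depot":    [("market", 7), ("station", 4)],
    "station":  [("market", 2), ("customer", 9)],
    "market":   [("customer", 3)],
    "customer": [],
}
print(dijkstra(roads, "depot"))   # customer reachable in 4 + 2 + 3 = 9 minutes
```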
The Internet provides vital communications between millions of individuals and is increasingly used as a commerce tool; thus, security is of high importance for securing communications and protecting vital information. Cryptographic algorithms are essential in the field of security. Brute-force attacks are the major attack on the Data Encryption Standard, which is the main reason an improved structure of the Data Encryption Standard algorithm is needed. This paper proposes a new, improved structure for the Data Encryption Standard to make it secure and immune to attacks. The improved structure was accomplished using the standard Data Encryption Standard with a new way of two-key generation.
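The improved two-key generation scheme is the paper's contribution and is not reproduced here; as background only, the sketch below shows the Feistel round structure that DES (and hence any structural variant of it) is built on, with a toy round function and key list standing in for DES's expansion, S-boxes, permutations, and real key schedule.

```python
# Illustrative Feistel structure (NOT real DES and NOT the paper's improved scheme).
import hashlib

def round_function(half: bytes, subkey: bytes) -> bytes:
    """Toy F-function; in real DES this is expansion, S-boxes and permutation."""
    return hashlib.sha256(half + subkey).digest()[: len(half)]

def feistel_encrypt(block: bytes, subkeys: list) -> bytes:
    left, right = block[:4], block[4:]
    for k in subkeys:
        left, right = right, bytes(a ^ b for a, b in zip(left, round_function(right, k)))
    return right + left                      # final swap, as in DES

def feistel_decrypt(block: bytes, subkeys: list) -> bytes:
    return feistel_encrypt(block, list(reversed(subkeys)))   # same structure, keys reversed

subkeys = [bytes([i] * 4) for i in range(16)]                # stand-in for the DES key schedule
ct = feistel_encrypt(b"8bytemsg", subkeys)
assert feistel_decrypt(ct, subkeys) == b"8bytemsg"
print(ct.hex())
```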