Optimizing system performance in dynamic, heterogeneous environments depends on the efficient management of computational tasks, so this paper examines task scheduling and resource allocation algorithms in depth. The work evaluates five algorithms, Genetic Algorithms (GA), Particle Swarm Optimization (PSO), Ant Colony Optimization (ACO), the Firefly Algorithm (FA) and Simulated Annealing (SA), across workloads generated by varying the task-to-node ratio. Finish Time and Deadline are identified as the two key performance metrics for gauging the efficacy of an algorithm, and the behaviour of each algorithm is investigated comprehensively across the different workloads. The experimental results reveal distinct patterns of algorithmic behaviour by workload: in both the 15-task, 5-node and the 10-task, 5-node scenarios, GA and PSO outperform all other algorithms, completing 100 percent of tasks before their deadlines, whereas ACO misses the deadlines of certain tasks (Task 5 in particular in the 15-task scenario). Based on these observations, the study proposes a more extensive system built around an adaptive choice of algorithm driven by workload characteristics. The proposed system offers an integrated approach to the ill-structured problem of task scheduling and resource allocation: an intelligent, aggressive scheduling scheme that runs asynchronously when a larger number of tasks is submitted for completion, and that dynamically aborts tasks whenever system load and utilization rise excessively. The design therefore amounts to a comprehensive treatment of scheduling and resource allocation issues, detailing how algorithms are selected according to workload characteristics with flexibility as the goal, while the quantified statistical results from the experiments demonstrate empirically how each algorithm performed under the various settings.
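To make the two metrics concrete, the following is a minimal sketch (hypothetical task lengths, node speeds and assignment, not the paper's data or implementation) of how the finish time and the deadline hit rate of a candidate task-to-node assignment could be computed; a metaheuristic such as GA or PSO would then search over such assignments.

```python
# Minimal sketch: evaluating a candidate task-to-node assignment against the
# Finish Time and Deadline metrics. All values below are illustrative only.
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    length: float    # work units
    deadline: float  # latest acceptable finish time

def evaluate(assignment, tasks, node_speeds):
    """assignment[i] = index of the node executing tasks[i]; each node runs its tasks sequentially."""
    node_clock = [0.0] * len(node_speeds)
    met, finish_times = 0, []
    for task, node in zip(tasks, assignment):
        node_clock[node] += task.length / node_speeds[node]   # finish time of this task on that node
        finish_times.append(node_clock[node])
        met += node_clock[node] <= task.deadline
    return max(finish_times), met / len(tasks)                # makespan, fraction of deadlines met

tasks = [Task("T1", 8, 5), Task("T2", 4, 6), Task("T3", 6, 9)]
speeds = [2.0, 1.0]                        # two heterogeneous nodes
print(evaluate([0, 1, 0], tasks, speeds))  # (7.0, 1.0): all deadlines met, makespan 7
```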
Reserve estimation is a continuous process throughout the life of a field, because risk and inaccuracy are endemic to it and must therefore be studied; the true, properly defined hydrocarbon content can be established only at field depletion. As a result, the reserve estimation problem is a function of time and of the data available. Reserve estimation methods can be divided into five types: analogy, volumetric, decline curve analysis, material balance and reservoir simulation, each differing from the others in the kind of data required. The choice of the appropriate method depends on reservoir maturity, reservoir heterogeneity and the data acquisition required. In this research, three types of reserve …
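As an illustration of one of the five types listed above, the volumetric method estimates the original oil in place from the standard relation (quoted here in its textbook oilfield-unit form, not reproduced from this paper):

\[
N \;=\; \frac{7758\, A\, h\, \phi\, (1 - S_w)}{B_o},
\]

where \(N\) is the original oil in place (STB), \(A\) the drainage area (acres), \(h\) the net pay thickness (ft), \(\phi\) the porosity (fraction), \(S_w\) the water saturation (fraction), \(B_o\) the oil formation volume factor (RB/STB), and 7758 the conversion factor in barrels per acre-ft.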
In the present work, heterojunction diode detectors will be prepared using germanium wafers as the substrate material, with a 200 nm thick tin sulfide (SnS) film deposited on the substrate by thermal evaporation. An Nd:YAG laser (λ = 532 nm) with different energy densities (5.66 J/cm² and 11.32 J/cm²) is used to diffuse the SnS into the surface of the germanium samples with 10 laser shots in different environments (vacuum and distilled water). I-V characteristics in the dark and under illumination, C-V characteristics, transmission measurements, spectral responsivity and quantum efficiency were investigated at 300 K. The C-V measurements showed that the heterojunctions were of the abrupt type and that the maximum value of the built-in potential …
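The abrupt-junction conclusion and the built-in potential typically follow from the standard one-sided abrupt-junction capacitance relation (a textbook result, not taken from this paper):

\[
\frac{1}{C^{2}} \;=\; \frac{2\,(V_{bi} - V)}{q\,\varepsilon_{s}\,N_{d}\,A^{2}},
\]

where \(C\) is the junction capacitance, \(A\) the junction area, \(N_{d}\) the doping concentration of the lightly doped side, \(\varepsilon_{s}\) the semiconductor permittivity and \(V_{bi}\) the built-in potential. Linearity of \(1/C^{2}\) versus \(V\) indicates an abrupt junction, and the extrapolated intercept on the voltage axis gives \(V_{bi}\).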
As cities across the world grow and populations become more mobile, the number of vehicles on the roads has increased correspondingly, confronting authorities with a proliferation of road-traffic-management challenges and, in consequence, traffic congestion, more accidents and pollution. Accidents are still a major cause of death, despite the development of sophisticated traffic management systems and other vehicle-related technologies; hence, a common system for accident management needs to be developed. Traffic congestion in most urban areas, for instance, can be alleviated by real-time route planning. However, designing an efficient …
Cyber security is a term used to describe the collection of technologies, procedures and practices that aim to protect the online environment of a user or an organization. Medical images are among the most important and sensitive kinds of data in computer systems, and for medical reasons healthcare organizations must encrypt all patient data, including images, before transferring it over computer networks. This paper presents a new direction in encryption research: the image is encrypted using a key generated from features extracted from it. The encryption process starts by applying edge detection; after dividing the bits of the edge image into 3×3 windows, the diffusion …
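The sketch below illustrates the general idea of deriving a key from the edge map of an image examined in 3×3 windows. It is not the paper's algorithm: the Sobel operator and SHA-256 are stand-ins for whatever edge detector and key-derivation step the authors use, and the diffusion stage is omitted.

```python
# Illustrative sketch: derive an encryption key from the edge map of an image,
# processed in 3x3 windows (stand-in choices, not the paper's exact method).
import numpy as np
import hashlib

def sobel_edges(img, thresh=100.0):
    """Binary edge map from a simple Sobel gradient (img: 2-D uint8 array)."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
    ky = kx.T
    f = img.astype(float)
    gx, gy = np.zeros_like(f), np.zeros_like(f)
    for i in range(1, f.shape[0] - 1):
        for j in range(1, f.shape[1] - 1):
            patch = f[i - 1:i + 2, j - 1:j + 2]
            gx[i, j] = np.sum(kx * patch)
            gy[i, j] = np.sum(ky * patch)
    return (np.hypot(gx, gy) > thresh).astype(np.uint8)

def key_from_edges(edges, key_len=32):
    """Hash the bits of every 3x3 window of the edge map into a key
    (assumption: SHA-256 stands in for the paper's key-generation step)."""
    h = hashlib.sha256()
    for i in range(0, edges.shape[0] - 2, 3):
        for j in range(0, edges.shape[1] - 2, 3):
            h.update(np.packbits(edges[i:i + 3, j:j + 3]).tobytes())
    return h.digest()[:key_len]

img = np.random.default_rng(0).integers(0, 256, (64, 64)).astype(np.uint8)
print(key_from_edges(sobel_edges(img)).hex())
```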
Purpose – Cloud computing (CC) and its services have enabled the information centers of organizations to adapt their information and technological infrastructure, making it more suitable for developing flexible information systems that respond to the informational and knowledge needs of their users. In this context, cloud data governance has become more complex and dynamic, requiring an in-depth understanding of the data management strategy at these centers in terms of organizational structure and regulations, people, technology, processes, and roles and responsibilities. Our paper therefore discusses these dimensions as challenges facing information centers with regard to their data governance and the impact …
This work aims to develop a secure lightweight cipher algorithm for constrained devices. Secure communication among constrained devices is a critical issue during data transmission from client to server devices, and lightweight cipher algorithms are regarded as a suitable security solution for constrained devices because they require little computation and memory. Most lightweight algorithms, however, suffer from the trade-off between complexity and speed when trying to produce a robust cipher. The PRESENT cipher has been studied extensively as a lightweight cryptographic algorithm and surpasses other ciphers in that its computational processing requires only low-complexity operations. The mathematical model of …
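For reference, PRESENT is a 31-round substitution-permutation network operating on 64-bit blocks with an 80- or 128-bit key. A minimal sketch of its round structure is shown below; the S-box and bit permutation are PRESENT's published ones, but the key schedule is omitted and the round keys are assumed to be supplied.

```python
# Sketch of PRESENT's round structure: addRoundKey -> sBoxLayer -> pLayer.
# Key schedule omitted; round_keys are assumed to be 32 sixty-four-bit values.
SBOX = [0xC, 0x5, 0x6, 0xB, 0x9, 0x0, 0xA, 0xD,
        0x3, 0xE, 0xF, 0x8, 0x4, 0x7, 0x1, 0x2]

def sbox_layer(state):
    out = 0
    for nib in range(16):                        # 16 parallel 4-bit S-boxes
        out |= SBOX[(state >> (4 * nib)) & 0xF] << (4 * nib)
    return out

def p_layer(state):
    out = 0
    for i in range(64):                          # bit i moves to 16*i mod 63 (bit 63 is fixed)
        out |= ((state >> i) & 1) << (63 if i == 63 else (16 * i) % 63)
    return out

def encrypt(block, round_keys):
    state = block
    for k in round_keys[:31]:                    # 31 full rounds
        state = p_layer(sbox_layer(state ^ k))
    return state ^ round_keys[31]                # final key whitening
```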
This study aims to determine the prevalence of Entamoeba histolytica, Entamoeba dispar and Entamoeba moshkovskii using three diagnostic methods (microscopic examination, cultivation and PCR), which were compared in order to obtain an accurate diagnosis of Entamoeba spp. during amoebiasis. A total of 150 stool samples were examined: 100 from patients and 50 from healthy controls. The clinically diagnosed stool samples (n = 100) were collected from patients attending the consultant clinics of different hospitals in Basrah during the period from January 2018 to January 2019. The results showed that 60% of the collected samples were positive on direct microscopic examination. All samples were cultivated on different media; the Bra…
Canonical correlation analysis is one of the common methods for analyzing data and for determining the relationship between two sets of variables under study, as it relies on analyzing the variance-covariance matrix or the correlation matrix. Researchers use many methods to estimate the canonical correlation (CC); some are biased by outliers while others are resistant to them, and there are, in addition, criteria that assess the efficiency of the estimation methods.
In our research, we dealt with robust estimation methods that depend on the correlation matrix in the analysis process in order to obtain a robust canonical correlation coefficient, namely the Biweight …
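For context, the classical first canonical correlation is defined from the partitioned covariance (or correlation) matrix of the two variable sets; the robust variants discussed here keep the same eigen-problem but substitute a robust estimate of that matrix:

\[
\rho_1 \;=\; \max_{a,b}\; \operatorname{corr}\!\left(a^{\mathsf T}X,\; b^{\mathsf T}Y\right),
\qquad
\rho_1^{2} \;=\; \lambda_{\max}\!\left(\Sigma_{XX}^{-1}\,\Sigma_{XY}\,\Sigma_{YY}^{-1}\,\Sigma_{YX}\right),
\]

so a robust canonical correlation coefficient follows by replacing \(\Sigma\) with a robust covariance or correlation estimator before solving the eigen-problem.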