Optimizing system performance in dynamic and heterogeneous environments depends on the efficient management of computational tasks, so this paper examines task scheduling and resource allocation algorithms in depth. The work evaluates five algorithms: Genetic Algorithms (GA), Particle Swarm Optimization (PSO), Ant Colony Optimization (ACO), the Firefly Algorithm (FA) and Simulated Annealing (SA), across workloads obtained by varying the task-to-node ratio. Finish Time and Deadline are identified as the two key performance metrics for gauging the efficacy of an algorithm, and the behavior of each algorithm is investigated comprehensively across the different workloads. The experimental results reveal distinct patterns of algorithmic behavior per workload. In the 15-task, 5-node scenario, GA and PSO outperform all others, completing 100 percent of tasks before their deadlines, whereas ACO failed to meet the deadline for Task 5. Likewise, with 10 tasks and 5 nodes, GA and PSO completed 100 percent of tasks before their deadlines, while ACO missed the deadlines of certain tasks. Based on these observations, the study proposes a more extensive system that adapts the choice of algorithm to the workload characteristics. The proposed system offers an integrated approach to the ill-structured problem of task scheduling and resource allocation: an intelligent scheduling scheme that runs asynchronously when a larger number of tasks is submitted for completion, and that dynamically aborts tasks whenever system load and utilization rise excessively. The design is presented as a complete solution to task scheduling and resource allocation problems, detailing how algorithms are selected based on workload features with flexibility as the goal. The experiments produce quantifiable statistical results that empirically demonstrate how each algorithm performs under the various settings.
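As a rough illustration of the Finish Time and Deadline metrics described above, the toy Python sketch below scores a single task-to-node assignment by the share of tasks that finish before their deadlines. The Task fields, node speeds and FIFO queueing model are illustrative assumptions, not the paper's simulator.

```python
# Hypothetical toy model: tasks have a length and a deadline, nodes have a speed.
# The finish time of a task is the clock of its node after the task runs (FIFO order);
# the metric is the share of tasks that finish before their deadline.
from dataclasses import dataclass
from typing import List

@dataclass
class Task:
    length: float    # work units
    deadline: float  # seconds

def deadline_hit_rate(tasks: List[Task], node_speeds: List[float],
                      assignment: List[int]) -> float:
    """assignment[i] = index of the node that runs tasks[i]."""
    node_clock = [0.0] * len(node_speeds)
    hits = 0
    for task, node in zip(tasks, assignment):
        node_clock[node] += task.length / node_speeds[node]  # finish time of this task
        if node_clock[node] <= task.deadline:
            hits += 1
    return hits / len(tasks)

tasks = [Task(4, 5), Task(2, 3), Task(6, 10), Task(1, 2), Task(5, 9)]
speeds = [1.0, 2.0]                      # two heterogeneous nodes
print(deadline_hit_rate(tasks, speeds, [1, 0, 1, 0, 1]))
```

A metaheuristic such as GA or PSO would search over the `assignment` vector to maximize this hit rate (or minimize finish times); only the evaluation step is sketched here.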
One of the serious problems in any wireless communication system that uses a multi-carrier modulation technique such as Orthogonal Frequency Division Multiplexing (OFDM) is its Peak-to-Average Power Ratio (PAPR). It limits the transmission power because of the limited dynamic range of the analog-to-digital and digital-to-analog converters (ADC/DAC) and of the power amplifiers at the transmitter, which in turn limits the maximum achievable rate.
This issue is especially important for mobile terminals, which need to sustain a long battery lifetime. Reducing PAPR is therefore an important step towards efficient and affordable mobile communication services.
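For reference, PAPR is the ratio of the peak instantaneous power of the transmitted signal to its average power, usually expressed in dB. The sketch below computes it for a single OFDM symbol; the QPSK mapping, 64 subcarriers and 4x oversampling are illustrative assumptions, not parameters from the paper.

```python
# Minimal sketch: PAPR of one OFDM symbol, built from random QPSK subcarriers
# and an IFFT-based modulator with oversampling for a better peak estimate.
import numpy as np

def papr_db(symbols: np.ndarray, oversample: int = 4) -> float:
    n = len(symbols)
    padded = np.zeros(n * oversample, dtype=complex)
    padded[:n // 2] = symbols[:n // 2]            # zero-pad in the middle of the spectrum
    padded[-(n - n // 2):] = symbols[n // 2:]
    x = np.fft.ifft(padded) * oversample          # time-domain OFDM signal
    power = np.abs(x) ** 2
    return 10 * np.log10(power.max() / power.mean())

rng = np.random.default_rng(0)
qpsk = (rng.choice([-1, 1], 64) + 1j * rng.choice([-1, 1], 64)) / np.sqrt(2)
print(f"PAPR = {papr_db(qpsk):.2f} dB")
```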
Digital mammography is currently the most widely used technology for early detection of breast cancer. The main diagnostic elements in digital mammograms, such as lesions or masses, have low contrast. The purpose of this paper is to enhance mammogram images by increasing their contrast. Different enhancement methods are used for this purpose, namely histogram equalization (HE), Contrast Limited Adaptive Histogram Equalization (CLAHE), morphological processing, and Retinex. The Retinex method is also implemented in combination with HE and, separately, with CLAHE to improve its performance. The experimental results show that combining Retinex with CLAHE produces better contrast enhancement than combining it with HE and better than the other methods.
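As a rough sketch of how Retinex and CLAHE can be combined (one plausible ordering; the paper's exact pipeline and parameter choices are not reproduced here), the snippet below applies single-scale Retinex and then CLAHE with OpenCV. The sigma value, CLAHE settings and file names are illustrative.

```python
# Hedged sketch: single-scale Retinex (log image minus log of a Gaussian-blurred
# illumination estimate) followed by CLAHE, using OpenCV.
import cv2
import numpy as np

def retinex_then_clahe(gray: np.ndarray, sigma: float = 30.0) -> np.ndarray:
    img = gray.astype(np.float32) + 1.0                     # avoid log(0)
    illumination = cv2.GaussianBlur(img, (0, 0), sigma)     # estimated illumination
    retinex = np.log(img) - np.log(illumination)            # single-scale Retinex
    retinex = cv2.normalize(retinex, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    return clahe.apply(retinex)

mammogram = cv2.imread("mammogram.png", cv2.IMREAD_GRAYSCALE)  # hypothetical input file
enhanced = retinex_then_clahe(mammogram)
cv2.imwrite("enhanced.png", enhanced)
```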
Compression of speech signals is an essential field in signal processing. Speech compression is very important today because of limited transmission bandwidth and storage capacity. This paper explores a Contourlet-transform-based methodology for compressing the speech signal. In this methodology, the speech signal is decomposed into Contourlet transform coefficients, which are thresholded using statistical measures such as the interquartile range (IQR), average absolute deviation (AAD), median absolute deviation (MAD) and standard deviation (STD), followed by run-length encoding. The method is applied to speech recorded over different durations (5, 30, and 120 seconds), and a comparative study of performance is carried out.
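The core compression step, thresholding transform coefficients with a statistical measure and then run-length encoding the result, can be sketched as follows. In the paper the coefficients come from a Contourlet decomposition, which is not reproduced here; the MAD threshold and the toy coefficient vector are illustrative.

```python
# Illustrative sketch only: hard-threshold a vector of transform coefficients using a
# MAD-based threshold, then run-length encode the result (long zero runs compress well).
import numpy as np

def mad_threshold(coeffs: np.ndarray) -> np.ndarray:
    mad = np.median(np.abs(coeffs - np.median(coeffs)))    # median absolute deviation
    out = coeffs.copy()
    out[np.abs(out) < mad] = 0.0                           # discard small coefficients
    return out

def run_length_encode(values: np.ndarray):
    """Encode the sequence as (value, run_length) pairs."""
    encoded, i = [], 0
    while i < len(values):
        j = i
        while j < len(values) and values[j] == values[i]:
            j += 1
        encoded.append((values[i], j - i))
        i = j
    return encoded

coeffs = np.array([0.01, -0.02, 1.3, 0.0, 0.0, -2.1, 0.03, 0.0])
print(run_length_encode(mad_threshold(coeffs)))
```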
In this paper, we compare parametric, nonparametric and semiparametric estimators for the partial linear regression model: the parametric approach is represented by ordinary least squares (OLS), and the nonparametric methods by the cubic smoothing spline estimator and the Nadaraya-Watson estimator. We study three nonparametric regression models with sample sizes n = 40, 60, 100 and variances σ² = 0.5, 1, 1.5. The results for the first model show that the Nadaraya-Watson estimator for the partial linear model (PLM) is the best, followed by the cubic smoothing spline estimator for the PLM, while the results for the second and third models show that the best estimator is the cubic smoothing spline, followed by the Nadaraya-Watson estimator for the PLM.
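For context, the Nadaraya-Watson estimator smooths the response by kernel-weighted averaging of the observations. A minimal sketch with a Gaussian kernel is shown below; the bandwidth, sample size and regression function are illustrative, not the paper's simulation design.

```python
# Minimal sketch of the Nadaraya-Watson kernel regression estimator.
import numpy as np

def nadaraya_watson(x_train, y_train, x_eval, h=0.3):
    # weight_ij = K((x_eval_i - x_train_j) / h); estimate_i = weighted mean of y_train
    diffs = (x_eval[:, None] - x_train[None, :]) / h
    weights = np.exp(-0.5 * diffs ** 2)          # Gaussian kernel
    return (weights @ y_train) / weights.sum(axis=1)

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 1, 60))
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.5, 60)   # illustrative noise level
grid = np.linspace(0, 1, 5)
print(nadaraya_watson(x, y, grid))
```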
Chromium oxide nanoparticles were synthesized using cauliflower extract by two methods: a simple chemical method and the sol-gel method. These techniques are new, environmentally friendly and cheap. Cauliflower contains plant materials and biomolecules (chromium, phenols, alkalis, vitamins, amino acids, quinones, etc.) that convert chromium chloride hexahydrate (CrCl3.6H2O) into chromium nanoparticles. The plant extracts also act as diluents, stabilizers and anti-caking agents. X-ray diffraction (XRD) analysis showed that the crystallite size of Cr2O3 decreased from 36.1–57.8 nm with the simple chemical method to 13.31–20.68 nm with the sol-gel method.
In this paper, we derive an estimator of the reliability function for the two-parameter Laplace distribution using the Bayes method with a squared error loss function, Jeffreys' formula and the conditional probability of the observed random variable. The main objective of this study is to compare the efficiency of the derived Bayesian estimator with the maximum likelihood and moment estimators of this function, using Monte Carlo simulation under different Laplace distribution parameters and sample sizes. The results show that the Bayes estimator is more efficient than the maximum likelihood and moment estimators for all sample sizes.
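A minimal sketch of the Monte Carlo scaffold used for such comparisons is given below: it simulates Laplace samples, estimates the reliability function R(t) = P(X > t) with the maximum-likelihood estimator, and reports its mean squared error against the true value. The paper's derived Bayes and moment estimators would be evaluated in the same loop; the parameters here are illustrative.

```python
# Hedged sketch of the Monte Carlo comparison scaffold for a Laplace(mu, b) distribution.
import numpy as np

def laplace_reliability(t, mu, b):
    """True reliability R(t) = P(X > t) of a Laplace(mu, b) distribution."""
    if t >= mu:
        return 0.5 * np.exp(-(t - mu) / b)
    return 1.0 - 0.5 * np.exp(-(mu - t) / b)

def mle_reliability(sample, t):
    mu_hat = np.median(sample)                    # ML estimate of the location
    b_hat = np.mean(np.abs(sample - mu_hat))      # ML estimate of the scale
    return laplace_reliability(t, mu_hat, b_hat)

rng = np.random.default_rng(2)
mu, b, t, n, reps = 0.0, 1.0, 1.0, 40, 1000       # illustrative settings
true_r = laplace_reliability(t, mu, b)
estimates = [mle_reliability(rng.laplace(mu, b, n), t) for _ in range(reps)]
print("MSE of the ML estimator:", np.mean((np.array(estimates) - true_r) ** 2))
```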
Microfibers released by synthetic clothes have a significant negative effect on the environment. Several solutions have been proposed and evaluated for their effectiveness, but studies have failed to address the human-centered aspects of these products. In this research, the possibilities and needs from a consumer perspective for a new filtering system for domestic washing machines were examined. First, a quantitative (questionnaire) and a qualitative (interviews and observations) exploration were done to understand the desired requirements from a user perspective. Next, the acceptance of various existing solutions for microfiber catching was investigated. To verify these requirements, a new concept was designed and evaluated with a
Most countries around the world, including Iraq, pay close attention to the agricultural sector because of its important economic position, and it is no wonder that this focus is strongest in developing countries. Descriptive analysis of sample tables for wheat and barley crops in Iraq showed that adherence to the principles and rules of total quality, including modern irrigation methods, has a significant impact on increasing productivity, reducing costs and improving quality compared with traditional irrigation methods. It has therefore become necessary for agricultural economic units to adopt all procedures and means that help apply the rules of total quality in order to improve the state of wheat and barley cultivation in Iraq.
The Twofish cipher is a powerful algorithm with a fairly complex structure built around extensive data mixing and swapping, yet it can be implemented easily. Its keys are of variable length (128, 192, or 256 bits), and the key schedule is generated once and reused to encrypt all message blocks, however many there are, which reduces the confidentiality of the encryption. This article discusses generating cipher keys for each block, a concept that is new and not found in common block cipher algorithms. It is based on the continuous generation of subkeys for all blocks, each block having its own key generation process. The Geffe generator is used to produce the subkeys so that each block is encrypted with its own key material.
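For reference, the Geffe generator combines the outputs of three LFSRs with the Boolean function f(x1, x2, x3) = (x1 AND x2) XOR (NOT x1 AND x3). A minimal sketch is shown below; the register lengths, tap positions and seeds are illustrative placeholders rather than the parameters used in the article.

```python
# Hedged sketch of a Geffe combining generator built from three Fibonacci LFSRs.
def lfsr_stream(seed: int, taps: tuple, nbits: int):
    """Fibonacci LFSR: output the LSB, feed back the XOR of the tapped bits."""
    state = seed
    while True:
        out = state & 1
        feedback = 0
        for t in taps:
            feedback ^= (state >> t) & 1
        state = (state >> 1) | (feedback << (nbits - 1))
        yield out

def geffe_keystream(n: int):
    # Seeds and tap positions are illustrative placeholders, not values from the article.
    r1 = lfsr_stream(0b10011, (0, 2), 5)
    r2 = lfsr_stream(0b1010011, (0, 1), 7)
    r3 = lfsr_stream(0b110110101, (0, 4), 9)
    bits = []
    for _ in range(n):
        x1, x2, x3 = next(r1), next(r2), next(r3)
        bits.append((x1 & x2) ^ ((1 - x1) & x3))   # Geffe combining function
    return bits

print(geffe_keystream(16))   # keystream bits that could seed per-block subkeys
```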