This study investigates the scheduling of jobs on a single machine. Each job is processed without interruption and becomes available for processing at time 0. The aim is to find a processing order of the jobs that minimizes total completion time, total late work, and maximum tardiness, which is an NP-hard problem. In the theoretical part of the work, a mathematical formulation of the problem is presented, a sub-problem of the original multi-objective problem is introduced, and the importance of a dominance rule (DR) that can be applied to the problem to improve solutions is shown. In the practical part, two exact methods, a branch-and-bound algorithm (BAB) and a complete enumeration method (CEM), are applied to the three proposed criteria to find a set of efficient solutions. The experimental results show that CEM can solve problems of up to jobs. Two variants of the BAB method were applied: the first, BAB without the dominance rule (DR), can solve problems of up to jobs; the second, BAB with the dominance rule (DR), uses dominance rules to reduce the number of sequences that must be considered and can solve problems of up to jobs in reasonable time. In addition, to find good approximate solutions, two heuristic methods are proposed: the first can solve up to jobs, while the second can solve up to jobs. Computational experiments demonstrate the good performance of the two suggested approaches on the original problem. For the sub-problem, the experimental results show that CEM can solve problems of up to jobs, BAB without the dominance rule (DR) can solve problems of up to jobs, and BAB with the dominance rule (DR) can solve problems of up to jobs in reasonable time; finally, the heuristic method can solve up to jobs. All numerical results were obtained by implementing the algorithms in MATLAB 2019a.
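To make the enumeration idea concrete, below is a minimal sketch of a complete enumeration step in Python (the study itself used MATLAB 2019a, and its exact formulation is not reproduced here). It assumes the standard single-machine definitions: completion time C_j, late work V_j = min(p_j, max(0, C_j − d_j)), and tardiness T_j = max(0, C_j − d_j); the job data p and d are hypothetical.

```python
# Minimal sketch of a complete enumeration method (CEM) for the three criteria,
# assuming the standard definitions of completion time, late work, and tardiness.
from itertools import permutations

def criteria(seq, p, d):
    """Return (sum C_j, sum V_j, T_max) for a job sequence."""
    t = total_c = total_v = t_max = 0
    for j in seq:
        t += p[j]                   # completion time of job j
        tard = max(0, t - d[j])     # tardiness of job j
        total_c += t
        total_v += min(p[j], tard)  # late work is capped at p_j
        t_max = max(t_max, tard)
    return total_c, total_v, t_max

def dominates(a, b):
    """True if vector a is at least as good as b everywhere and strictly better once."""
    return all(x <= y for x, y in zip(a, b)) and a != b

def efficient_set(p, d):
    """Enumerate all n! sequences and keep the non-dominated (efficient) ones."""
    results = [(criteria(s, p, d), s) for s in permutations(range(len(p)))]
    return [(v, s) for v, s in results
            if not any(dominates(w, v) for w, _ in results)]

# Example with 4 hypothetical jobs (p = processing times, d = due dates):
print(efficient_set(p=[3, 1, 4, 2], d=[4, 2, 9, 6]))
```

The same dominance test is the intuition behind the DR variant of BAB: any sequence whose criteria vector is dominated can be discarded, which is how the rule reduces the number of sequences that must be considered.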
The esterification of ethyl alcohol and acetic acid catalyzed by the ion-exchange resin Amberlyst 15 was investigated. The experimental study was carried out in an isothermal batch reactor. Catalyst loading, initial molar ratio, mixing time, and temperature, being the most influential parameters, were studied and discussed extensively. A maximum final conversion of 75% was obtained at 70°C, an acid-to-ethyl-alcohol mole ratio of 1/2, and a catalyst loading of 10 g. The kinetics of the reaction were correlated with the Langmuir-Hinshelwood model (LHM). The overall rate constant and the water adsorption equilibrium constant were calculated as functions of temperature. The activation energies were found to be 113876.9 and -49474.95 kJ per kmol of ac
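As a worked illustration of the temperature dependence reported above, the sketch below fits the Arrhenius relation ln k = ln A − Ea/(RT) to rate constants at several temperatures; the rate-constant values are hypothetical placeholders, not the study's measurements, and the full Langmuir-Hinshelwood regression is not reproduced.

```python
# Minimal Arrhenius fit: Ea comes from a linear fit of ln k against 1/T.
import numpy as np

R = 8.314  # kJ/(kmol*K), matching the kJ/kmol units used in the abstract

T = np.array([323.0, 333.0, 343.0])     # temperatures, K (50, 60, 70 degrees C)
k = np.array([2.1e-4, 4.8e-4, 9.9e-4])  # hypothetical rate constants

slope, intercept = np.polyfit(1.0 / T, np.log(k), 1)
Ea = -slope * R        # activation energy, kJ/kmol
A = np.exp(intercept)  # pre-exponential factor

print(f"Ea = {Ea:.1f} kJ/kmol, A = {A:.3e}")
```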
Data mining plays a major role in healthcare for discovering hidden relationships in large datasets, especially in the diagnosis of breast cancer, one of the leading causes of death in the world. In this paper, two algorithms, decision tree and K-Nearest Neighbor, are applied to diagnose breast cancer grade in order to reduce its risk to patients. For the decision tree with feature selection, the Gini index gives an accuracy of 87.83%, while entropy with feature selection gives an accuracy of 86.77%. In both cases, Age emerged as the most influential parameter, particularly when Age < 49.5, with Ki67 as the second most influential parameter. Furthermore, K-Nearest Neighbor is based on the minimu
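A minimal sketch of the two classifiers compared above, using scikit-learn; the built-in Wisconsin breast cancer dataset is used as a stand-in, since the study's own dataset (with Age and Ki67) is not available here, and the feature-selection step is omitted.

```python
# Decision tree (Gini and entropy splitting criteria) vs. K-Nearest Neighbors
# on a stand-in dataset; accuracies will differ from the study's figures.
from sklearn.datasets import load_breast_cancer
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X, y = load_breast_cancer(return_X_y=True)  # stand-in data, not the study's
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, clf in [
    ("Decision tree (Gini)",    DecisionTreeClassifier(criterion="gini")),
    ("Decision tree (entropy)", DecisionTreeClassifier(criterion="entropy")),
    ("K-Nearest Neighbors",     KNeighborsClassifier(n_neighbors=5)),
]:
    clf.fit(X_tr, y_tr)
    print(name, accuracy_score(y_te, clf.predict(X_te)))
```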
In digital images, protecting sensitive visual information against unauthorized access is a critical issue, and robust encryption methods are the best way to preserve such information. This paper introduces a model designed to enhance the performance of the Tiny Encryption Algorithm (TEA) for image encryption. Two approaches are suggested as preprocessing steps before applying TEA; this step aims to de-correlate and weaken the relationships between adjacent pixel values in preparation for encryption. The first approach applies an affine transformation to the image in two layers, using a different key set for each layer. Th
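The sketch below illustrates the two ingredients named above: an invertible affine transform over the pixel bytes as a de-correlation step, followed by the standard 32-round TEA block cipher. The affine parameters and key are illustrative, and only a single affine layer is shown rather than the paper's two-layer, two-key-set scheme.

```python
# Affine preprocessing of pixel bytes, then standard TEA on one 64-bit block.
import struct

def affine(pixels, a=171, b=55):
    """y = (a*x + b) mod 256; a must be odd so the map is invertible mod 256."""
    return bytes((a * x + b) % 256 for x in pixels)

def tea_encrypt_block(v0, v1, key):
    """Standard 32-round TEA on one 64-bit block (two 32-bit words)."""
    delta, s, mask = 0x9E3779B9, 0, 0xFFFFFFFF
    k0, k1, k2, k3 = key
    for _ in range(32):
        s = (s + delta) & mask
        v0 = (v0 + (((v1 << 4) + k0) ^ (v1 + s) ^ ((v1 >> 5) + k1))) & mask
        v1 = (v1 + (((v0 << 4) + k2) ^ (v0 + s) ^ ((v0 >> 5) + k3))) & mask
    return v0, v1

# Encrypt 8 hypothetical pixel bytes: affine step first, then TEA.
key = (0x01234567, 0x89ABCDEF, 0xFEDCBA98, 0x76543210)  # illustrative 128-bit key
block = affine(bytes(range(8)))
v0, v1 = struct.unpack(">2I", block)
print(tea_encrypt_block(v0, v1, key))
```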
Assessing the accuracy of classification algorithms is paramount, as it provides insight into their reliability and effectiveness in solving real-world problems. Accuracy examination is essential in any remote sensing-based classification exercise, given that classification maps consistently contain misclassified pixels and classification errors. In this study, two satellite images of Duhok province, Iraq, captured at regular intervals, were analyzed using spatial analysis tools to produce supervised classifications. Several processes, such as smoothing, were applied to enhance the categorization. The classification results indicate that Duhok province is divided into four classes: vegetation cover, buildings,
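A minimal sketch of a standard post-classification accuracy assessment, of the kind the abstract calls for: a confusion matrix of reference versus classified pixels, overall accuracy, and the kappa coefficient. The pixel labels are hypothetical, and the last two class names are assumptions, since the abstract is truncated after "buildings".

```python
# Confusion matrix, overall accuracy, and kappa from paired pixel labels.
import numpy as np

classes = ["vegetation", "buildings", "water", "bare soil"]  # last two assumed
reference  = np.array([0, 0, 1, 2, 3, 1, 0, 2, 3, 3])  # ground-truth class indices
classified = np.array([0, 1, 1, 2, 3, 1, 0, 2, 2, 3])  # map class indices

n = len(classes)
cm = np.zeros((n, n), dtype=int)
for r, c in zip(reference, classified):
    cm[r, c] += 1

overall = np.trace(cm) / cm.sum()
expected = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / cm.sum() ** 2
kappa = (overall - expected) / (1 - expected)
print(f"overall accuracy = {overall:.2f}, kappa = {kappa:.2f}")
```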
Text-based image clustering (TBIC) is an insufficient approach for clustering related web images, and it is a challenging task to abstract the visual features of images with the support of textual information in a database. In content-based image clustering (CBIC), image data are clustered on the basis of specific features such as texture, color, boundaries, and shape. In this paper, an effective CBIC technique is presented that uses texture and statistical features of the images. The statistical features, or color moments (mean, skewness, standard deviation, kurtosis, and variance), are extracted from the images. These features are collected into a one-dimensional array, and a genetic algorithm (GA) is then applied for image clustering.
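A minimal sketch of the color-moment feature extraction described above: five statistical moments per channel, flattened into a one-dimensional feature array. The GA clustering stage is omitted, and SciPy is assumed for the higher-order moments.

```python
# Color-moment feature vector: 5 moments x 3 channels, as a 1-D array.
import numpy as np
from scipy.stats import skew, kurtosis

def color_moments(image):
    """image: H x W x 3 array; returns a 15-element feature vector."""
    feats = []
    for ch in range(image.shape[2]):
        x = image[:, :, ch].ravel().astype(float)
        feats += [x.mean(), x.std(), x.var(), skew(x), kurtosis(x)]
    return np.array(feats)

# Example on a random stand-in image:
rng = np.random.default_rng(0)
print(color_moments(rng.integers(0, 256, size=(64, 64, 3))))
```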
Experimental and numerical analyses were performed on pipes undergoing large plastic deformation, expanded using rigid conical mandrels with three different cone angles (15°, 25°, 35°) and diameters (15, 17, 20) mm. The experimental tests investigated the strains in the expanded areas. A numerical solution of the pipe expansion process was also obtained using the commercial finite element software ANSYS. The strains were measured experimentally for each case by stamping a mesh on the pipe and measuring it after expansion, then compared with the ANSYS results. No cracks were generated during the process at the selected angles. It can be concluded that the strain decreased with greater angles of con
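For illustration, strain measurement from a stamped mesh reduces to engineering strain over each grid cell; the sketch below computes it from hypothetical gauge lengths measured before and after expansion (the study's actual measurements and the ANSYS comparison are not reproduced).

```python
# Engineering strain from stamped-mesh gauge lengths before and after expansion.
def engineering_strain(l0, l1):
    """Strain from initial gauge length l0 and deformed length l1 (both in mm)."""
    return (l1 - l0) / l0

# Hypothetical mesh cell widths (mm) at several points along the pipe:
before = [2.00, 2.00, 2.00]
after  = [2.31, 2.24, 2.12]
print([round(engineering_strain(l0, l1), 3) for l0, l1 in zip(before, after)])
```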
This research was prepared to isolate and identify one of the most important essential oils of the medicinal clove plant, the well-known eugenol oil, which is used in many pharmaceuticals. The isolation was carried out using ultrasonic extraction and simple distillation techniques.
Regression models are among the most important models used in modern studies, especially research and health studies, because of the important results they achieve. Two regression models were used: the Poisson regression model and the Conway-Maxwell-Poisson model. This study aimed to compare the two models and choose the better of them using simulation, with different sample sizes (n = 25, 50, 100) and r = 1000 replications. MATLAB was adopted to conduct the simulation experiment. The results showed the superiority of the Poisson model according to the mean squared error (MSE) criterion and also according to the Akaike information criterion (AIC) for the same distribution.
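A minimal sketch of the simulation design described above, written in Python rather than the study's MATLAB: r = 1000 replications at each sample size, a Poisson fit by maximum likelihood, and MSE and AIC as scoring criteria. The Conway-Maxwell-Poisson fit is omitted here because it requires a specialized likelihood, and the data-generating rate is a hypothetical choice.

```python
# Monte Carlo comparison skeleton: Poisson MLE scored by MSE and AIC.
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(0)
true_lam, r = 4.0, 1000  # hypothetical true rate, 1000 replications

for n in (25, 50, 100):
    mse = aic = 0.0
    for _ in range(r):
        y = rng.poisson(true_lam, size=n)
        lam_hat = y.mean()                 # Poisson MLE of the rate
        mse += (lam_hat - true_lam) ** 2
        loglik = poisson.logpmf(y, lam_hat).sum()
        aic += 2 * 1 - 2 * loglik          # one fitted parameter
    print(f"n={n}: MSE={mse / r:.4f}, mean AIC={aic / r:.2f}")
```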
With the escalation of cybercriminal activities, the demand for forensic investigations into these crimes has grown significantly. However, the concept of systematic pre-preparation for potential forensic examinations during the software design phase, known as forensic readiness, has only recently gained attention. Against the backdrop of surging urban crime rates, this study aims to conduct a rigorous and precise analysis and forecast of crime rates in Los Angeles, employing advanced Artificial Intelligence (AI) technologies. This research amalgamates diverse datasets encompassing crime history, various socio-economic indicators, and geographical locations to attain a comprehensive understanding of how crimes manifest within the city. Lev
A liquid-solid chromatography study of bovine serum albumin (BSA) on DEAE-cellulose (diethylaminoethyl-cellulose) adsorbent was carried out experimentally, to study the effect of changing the influent concentration (0.125, 0.25, 0.5, and 1 mg/ml) at a constant volumetric flow rate Q = 1 ml/min, and the effect of changing the volumetric flow rate (1, 3, 5, and 10 ml/min) at a constant influent concentration Co = 0.125 mg/ml. A glass column of 1.5 cm I.D. and 50 cm length was used, packed with DEAE-cellulose adsorbent to a height of 7 cm. The influent was introduced into the column using a peristaltic pump, and the effluent concentration was measured with a UV spectrophotometer at 30°C and a wavelength of 280 nm. A spread (steeper) breakthrough curve is obtained
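For illustration, the sketch below summarizes a breakthrough curve of the kind described above: normalized effluent concentration C/C0 against time, with the breakthrough time taken where C/C0 first exceeds a chosen threshold (5% here, an arbitrary choice). The concentration readings are hypothetical, not the measured BSA data.

```python
# Breakthrough-curve summary: C/C0 vs. time and the 5% breakthrough time.
import numpy as np

t = np.array([0, 5, 10, 15, 20, 25, 30, 35])  # time, min
c = np.array([0.0, 0.001, 0.004, 0.012, 0.04, 0.07, 0.10, 0.12])  # effluent, mg/ml
c0 = 0.125                                     # influent concentration, mg/ml

ratio = c / c0
breakthrough = t[np.argmax(ratio >= 0.05)]     # first time C/C0 >= 5%
print(f"breakthrough at ~{breakthrough} min; final C/C0 = {ratio[-1]:.2f}")
```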