The flexible job-shop scheduling problem (FJSP) arises in flexible manufacturing systems and is considered very complex to control; generating a control system for this problem domain is therefore difficult. FJSP inherits the characteristics of the job-shop scheduling problem but adds a decision level to the sequencing one: each operation may be processed on any machine among a set of available machines at a facility. In this article, we present an Artificial Fish Swarm Algorithm with Harmony Search for solving the flexible job-shop scheduling problem. It is based on new harmonies improvised from the solutions obtained by the artificial fish swarm algorithm. Each improvised solution is compared with the overall best solution; when it is the better one, it replaces the artificial fish swarm solution from which it was improvised. Meanwhile, the best improvised solutions are carried over to the Harmony Memory. The objective is to minimize the total completion time (makespan) and to make the proposed approach a component of an expert and intelligent scheduling system for remanufacturing decision support. The harmony search algorithm has proved to be an efficient, simple, and robust optimization algorithm, and exploration ability is one of the key properties of any optimization algorithm. The obtained optimization results show that the proposed algorithm provides better exploitation ability and converges quickly to the optimum solution. Comparisons with the original artificial fish swarm algorithm also demonstrate improved efficiency.
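The harmony-improvisation step described above can be sketched in a few lines. This is a minimal illustration, not the paper's method: the solution encoding, the toy objective, and all parameter values (`hmcr`, `par`, the replacement-of-worst rule) are assumptions standing in for the real FJSP makespan and the paper's replacement rule.

```python
import random

def improvise(memory, hmcr=0.9, par=0.3, bw=1):
    """Improvise a new harmony from a memory of candidate solutions.
    Each solution is a list of machine indices (toy encoding, not the
    paper's FJSP representation)."""
    new = []
    for d in range(len(memory[0])):
        if random.random() < hmcr:            # draw the value from memory
            v = random.choice(memory)[d]
            if random.random() < par:         # pitch adjustment
                v = max(0, v + random.choice([-bw, bw]))
        else:                                 # random re-selection
            v = random.randrange(3)
        new.append(v)
    return new

def toy_makespan(sol):
    # Placeholder objective, NOT a real FJSP makespan: cheaper machines
    # (lower index) yield shorter "processing times".
    return sum((m + 1) * (i % 3 + 1) for i, m in enumerate(sol))

random.seed(1)
memory = [[random.randrange(3) for _ in range(6)] for _ in range(5)]
best = min(memory, key=toy_makespan)
for _ in range(200):
    cand = improvise(memory)
    if toy_makespan(cand) < toy_makespan(best):
        # simplified replacement: the improved harmony displaces the
        # worst member of the memory (the paper replaces the fish-swarm
        # solution it was improvised from)
        worst = max(memory, key=toy_makespan)
        memory[memory.index(worst)] = cand
        best = cand
print(toy_makespan(best))
```

With this toy objective the minimum achievable value is 12 (all operations on machine 0), and the loop steadily drives the memory toward it.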
The current research aims to analyze the relationship and the level of influence of labor relations in reducing cases of job withdrawal in private colleges in Baghdad. The population comprised (265) individuals, including department heads, university professors, and teaching staff in (7) private colleges located in the capital, Baghdad. Based on the Stephen Thompson equation for small samples, the sample size was determined as (157) teachers. A questionnaire was adopted as the main tool for collecting data and information after ensuring the validity and reliability of its contents. To test the relationships of influence, correlation, and interaction between the research variables, two main hypotheses were formulated, from which (5) sub-hypotheses emanated.
This study aimed to provide new indications that may clarify the relationships between the total and standard lengths and the length of the otolith, as well as the thickness and weight of these bones compared with the body weights of two different species of invasive fish in the Iraqi aquatic environment, the common carp…
Nowadays, information systems constitute a crucial part of organizations; by losing security, these organizations also lose plenty of competitive advantages. The core of information security (InfoSec) is risk management. A great deal of research and many standards address information security risk management (ISRM), including NIST SP 800-30 and ISO/IEC 27005. However, only a few research works focus on InfoSec risk reduction, while the standards explain general principles and guidelines without providing implementation details for ISRM; as such, reducing InfoSec risks in uncertain environments is painstaking. Thus, this paper applies a genetic algorithm (GA) to InfoSec risk reduction under uncertainty. Finally, the ef…
Implementation of the TSFS (Transposition, Substitution, Folding, and Shifting) algorithm as an encryption algorithm in database security had limitations in its character set and in the number of keys used. The proposed cryptosystem is based on enhancing the phases of the TSFS encryption algorithm by computing the determinant of the key matrices, which affects the implementation of the algorithm's phases. These changes showed high security of the database against different types of security attacks by achieving both goals of confusion and diffusion.
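The abstract does not say how the determinant enters the enhanced phases, so no attempt is made to reproduce that here. What can be illustrated is the standard validity check implied whenever matrix keys are involved: the key matrix must be invertible modulo the alphabet size, i.e. its determinant must be coprime with the modulus (as in the Hill cipher). The 3×3 matrix size and the modulus 256 below are assumptions, not the paper's parameters.

```python
from math import gcd

def det3(m):
    """Determinant of a 3x3 integer matrix by cofactor expansion."""
    a, b, c = m[0]
    d, e, f = m[1]
    g, h, i = m[2]
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

def key_matrix_valid(m, modulus=256):
    """A matrix key is usable iff det(m) is invertible mod `modulus`,
    i.e. gcd(det, modulus) == 1 (hypothetical check, not TSFS-specific)."""
    return gcd(det3(m) % modulus, modulus) == 1

good = [[1, 2, 3], [0, 1, 4], [5, 6, 0]]   # det = 1, invertible mod 256
bad  = [[2, 4, 6], [0, 2, 4], [2, 2, 2]]   # det = 0, not invertible
print(key_matrix_valid(good), key_matrix_valid(bad))  # → True False
```

Rejecting non-invertible key matrices up front is what guarantees decryption is well defined for every ciphertext block.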
In this paper, the botnet detection problem is defined as a feature selection problem, and the genetic algorithm (GA) is used to search for the most significant combination of features in the entire search space of the feature set. Furthermore, the Decision Tree (DT) classifier is used as an objective function to direct the proposed GA toward the combination of features that can correctly classify activities into normal traffic and botnet attacks. Two datasets, the UNSW-NB15 and the Canadian Institute for Cybersecurity Intrusion Detection System 2017 (CICIDS2017), are used as evaluation datasets. The results reveal that the proposed DT-aware GA can effectively find the relevant features from…
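The wrapper scheme described above (GA searching over feature bitmasks, classifier accuracy as fitness) can be sketched as follows. This is an illustration only: the stub `fitness` stands in for training a Decision Tree on UNSW-NB15/CICIDS2017, and the "relevant" feature indices, population size, and operator rates are all invented for the example.

```python
import random

# Chromosomes are bitmasks over the feature set; fitness would normally be
# DT accuracy on the selected features. A stub rewards a hypothetical set
# of relevant features and penalizes mask size (mimicking a wrapper score).
N_FEATURES = 10
RELEVANT = {1, 4, 7}          # hypothetical truly useful features

def fitness(mask):
    hits = sum(1 for i in RELEVANT if mask[i])
    return hits - 0.05 * sum(mask)   # accuracy proxy minus size penalty

def crossover(a, b):
    cut = random.randrange(1, N_FEATURES)   # one-point crossover
    return a[:cut] + b[cut:]

def mutate(mask, rate=0.1):
    return [bit ^ (random.random() < rate) for bit in mask]

random.seed(42)
pop = [[random.randint(0, 1) for _ in range(N_FEATURES)] for _ in range(20)]
for _ in range(60):
    pop.sort(key=fitness, reverse=True)
    elite = pop[:10]                        # truncation selection + elitism
    pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                   for _ in range(10)]
best = max(pop, key=fitness)
print(sorted(i for i, bit in enumerate(best) if bit))
```

In the real system the stub would be replaced by cross-validated DT accuracy, which is exactly what makes the GA "DT-aware": the search is guided by the classifier it ultimately serves.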
Objective(s): This study aimed to evaluate job satisfaction among nurses working at primary health care centers in Samawa City.
Methodology: A descriptive evaluation study was carried out during the period from 1 February 2022 to 1 June 2022. A non-probability (convenience) sample of (200) nurses was selected from different educational levels. A questionnaire was developed for the purpose of fulfilling the objectives of the study, and its content validity and reliability were determined. Data were analyzed using IBM SPSS version 19 software (2010).
Results: The findings indicate that 52% of nurses show a high level of job satisfaction…
Seven fish species were collected from the drainage network at Al-Madaen region, south of Baghdad, with the aid of a cast net during the period from March to August 1993. These fishes were infected with 22 parasite species (seven sporozoans, three ciliated protozoans, seven monogeneans, two nematodes, one acanthocephalan and two crustaceans) and one fungus species. Among these parasites, Chloromyxum wardi and Cystidicola sp. are reported here for the first time in Iraq. In addition, 11 new host records are added to the list of parasites of fishes of Iraq.
This paper presents the application of a framework for fast and efficient compressive sampling based on the concept of random sampling of a sparse audio signal. It provides four important features. (i) It is universal across a variety of sparse signals. (ii) The number of measurements required for exact reconstruction is nearly optimal and much less than the sampling frequency, below the Nyquist rate. (iii) It has very low complexity and fast computation. (iv) It is developed on a provable mathematical model from which we are able to quantify trade-offs among streaming capability, computation/memory requirements, and the quality of reconstruction of the audio signal. Compressed sensing (CS) is an attractive compression scheme due to its universality…
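The core CS idea above, far fewer random measurements than Nyquist samples, with exact recovery for sparse signals, can be shown on a toy case. Everything here is an assumption for illustration (signal length, measurement count, the ±1 sensing matrix, a 1-sparse signal): it is not the paper's sampling framework, and one greedy matching-pursuit step suffices only because the signal has a single nonzero.

```python
import random

# A length-64 signal with one nonzero is recovered from 16 random
# +/-1 measurements instead of 64 Nyquist-rate samples.
random.seed(0)
n, m = 64, 16
k_pos, k_val = 17, 3.5            # unknown sparse signal: x[17] = 3.5
x = [0.0] * n
x[k_pos] = k_val

# Random sensing matrix Phi (m x n) and measurements y = Phi x
Phi = [[random.choice((-1.0, 1.0)) for _ in range(n)] for _ in range(m)]
y = [sum(Phi[i][j] * x[j] for j in range(n)) for i in range(m)]

# One matching-pursuit step: pick the column most correlated with y,
# then estimate its coefficient by least squares on that column.
corr = [sum(Phi[i][j] * y[i] for i in range(m)) for j in range(n)]
j_hat = max(range(n), key=lambda j: abs(corr[j]))
a_hat = corr[j_hat] / sum(Phi[i][j_hat] ** 2 for i in range(m))
print(j_hat, a_hat)
```

With high probability the random ±1 columns are incoherent enough that the true support index wins the correlation step, which is the intuition behind the near-optimal measurement counts the abstract cites; k-sparse signals need an iterated version (OMP) rather than this single step.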