The flexible job-shop scheduling problem (FJSP) arises in flexible manufacturing systems and is considered very complex to control, which makes designing a control system for this problem domain difficult. FJSP inherits the characteristics of the job-shop scheduling problem and adds a decision level to the sequencing one: each operation may be processed on any machine from a set of available machines at a facility. In this article, we present an Artificial Fish Swarm Algorithm combined with Harmony Search for solving the flexible job-shop scheduling problem. The approach improvises a new harmony from the results obtained by the artificial fish swarm algorithm. The improvised solution is compared with the overall best solution; when it is better, it replaces the artificial fish swarm solution from which it was improvised. Meanwhile, the best improvised solutions are carried over to the Harmony Memory. The objective is to minimize the total completion time (makespan) and to make the proposed approach part of an expert, intelligent scheduling system for remanufacturing decision support. The harmony search algorithm has proven to be an efficient, simple, and robust optimizer, and exploration ability is one of the key properties of any optimization algorithm. The optimization results obtained show that the proposed algorithm provides better exploitation ability and converges quickly to the optimum solution. Comparisons with the original artificial fish swarm algorithm also demonstrate improved efficiency.
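The improvisation-and-replacement loop described in the abstract can be sketched in a simplified, hypothetical form. The code below is not the authors' implementation: it shows generic Harmony Search mechanics (memory consideration rate `hmcr`, pitch adjustment rate `par`, bandwidth `bw`) applied to a harmony memory seeded with stand-in random vectors rather than real fish-swarm schedules, and a toy sphere objective stands in for the makespan.

```python
import random

def improvise(memory, hmcr=0.9, par=0.3, bw=0.1, bounds=(-5.0, 5.0)):
    """Build one new harmony component-by-component from the harmony memory."""
    dim = len(memory[0])
    new = []
    for i in range(dim):
        if random.random() < hmcr:                  # memory consideration
            x = random.choice(memory)[i]
            if random.random() < par:               # pitch adjustment
                x += random.uniform(-bw, bw)
        else:                                       # random selection
            x = random.uniform(*bounds)
        new.append(min(max(x, bounds[0]), bounds[1]))
    return new

def sphere(x):
    """Toy objective standing in for the makespan."""
    return sum(v * v for v in x)

random.seed(1)
# Harmony memory seeded with stand-in "fish swarm" solutions.
hm = [[random.uniform(-5, 5) for _ in range(4)] for _ in range(6)]
for _ in range(500):
    cand = improvise(hm)
    worst = max(range(len(hm)), key=lambda j: sphere(hm[j]))
    if sphere(cand) < sphere(hm[worst]):            # keep improvement in HM
        hm[worst] = cand

best = min(hm, key=sphere)
print(sphere(best))  # best objective value found
```

In the hybrid described above, the seeding and replacement would operate on fish-swarm schedules and their makespans instead of random vectors, but the improvisation rule is the same.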
This paper presents a novel method for solving nonlinear optimal control problems of regular type via their equivalent two-point boundary value problems using the non-classical
The research aims to identify decent work and its impact on enhancing job immersion. A questionnaire was adopted as the tool for analyzing the responses of a sample of (81) workers, representing an estimated response rate of (88 per cent) out of the total population of (92) individuals. The research adopted a descriptive-analytical approach; reliability calculation, arithmetic means, standard deviations, relative importance, and regression analysis were performed in SPSS v.25. The conclusion shows a medium correlation between decent work and job immersion and a low impact of decent work, with its dimensions, on job immersion, and extracts the most important acceptable components of the job from the sample's point of view about the o
This study aimed to choose top stocks through technical analysis tools, especially the indicator called (ratio of William index), and to test the ability of technical analysis tools to build an efficient share portfolio in comparison with the market portfolio. This technical tool was used to build one portfolio from 21 companies under specific screening conditions, and 10 companies were chosen for the period from (March 2015) to (June 2017). The applied results of the research showed that the portfolio yield for the companies selected according to the ratio of William index indicator was (0.0406) that
In this paper, a method for hiding ciphertext in an image file is introduced. The proposed method hides the ciphertext message in the frequency domain of the image and consists of two phases: an embedding phase and an extraction phase. In the embedding phase, the image is transformed from the spatial domain to the frequency domain using the discrete wavelet decomposition technique (Haar). The text message is encrypted using the RSA algorithm; the Least Significant Bit (LSB) algorithm is then used to hide the secret message in the high-frequency coefficients. The proposed method was tested on different images and succeeded in hiding information according to the Peak Signal to Noise Ratio (PSNR) measure of the original ima
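The embedding and extraction phases described in this abstract can be illustrated with a simplified sketch. The code below is not the paper's implementation: it omits the RSA encryption step, uses a one-level reversible integer Haar transform (S-transform) on consecutive pixel pairs instead of a full 2-D wavelet decomposition, and embeds message bits in the LSBs of the detail (high-frequency) coefficients; the cover "image" is a toy list of 8-bit values.

```python
import math

def haar_fwd(pixels):
    """One-level integer Haar (S-transform): averages + differences of pixel pairs."""
    s = [(a + b) >> 1 for a, b in zip(pixels[0::2], pixels[1::2])]  # low-frequency
    d = [a - b for a, b in zip(pixels[0::2], pixels[1::2])]         # high-frequency
    return s, d

def haar_inv(s, d):
    """Exact inverse of haar_fwd (lossless integer reconstruction)."""
    out = []
    for si, di in zip(s, d):
        a = si + ((di + 1) >> 1)
        out.extend((a, a - di))
    return out

def embed(pixels, bits):
    """Hide bits in the LSBs of the detail coefficients, then invert the transform."""
    s, d = haar_fwd(pixels)
    assert len(bits) <= len(d), "message too long for cover"
    for i, bit in enumerate(bits):
        d[i] = (d[i] & ~1) | bit
    return haar_inv(s, d)

def extract(pixels, nbits):
    """Re-apply the forward transform and read the LSBs of the detail band."""
    _, d = haar_fwd(pixels)
    return [di & 1 for di in d[:nbits]]

def psnr(orig, stego):
    mse = sum((a - b) ** 2 for a, b in zip(orig, stego)) / len(orig)
    return float("inf") if mse == 0 else 10 * math.log10(255 ** 2 / mse)

cover = [52, 55, 61, 59, 79, 61, 76, 61, 73, 70, 69, 68]  # toy 8-bit "image"
msg = [1, 0, 1, 1, 0, 1]
stego = embed(cover, msg)
print(extract(stego, len(msg)))          # recovered message bits
print(round(psnr(cover, stego), 1))      # high PSNR: stego stays close to cover
```

Because the integer S-transform is exactly invertible, the forward transform of the stego pixels reproduces the modified detail coefficients, so the hidden bits survive the round trip; in the paper's full method the bits would come from an RSA-encrypted message and a 2-D Haar decomposition.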
Spraying pesticides is one of the most common procedures conducted to control pests. However, excessive use of these chemicals adversely affects the surrounding environment, including the soil, plants, animals, and the operator themselves. Therefore, researchers have been encouraged to...
Biometric-based key generation uses features extracted from human anatomical (physiological) traits, such as a fingerprint or retina, or behavioral traits, such as a signature. The retina biometric has inherent robustness and is therefore capable of generating random keys with a higher security level than other biometric traits. In this paper, an effective system for generating secure, robust, and unique random keys based on retina features is proposed for cryptographic applications. The retina features are extracted using the glowworm swarm optimization (GSO) algorithm, which provides promising results in experiments on standard retina databases. Additionally, in order t
It is an established fact that substantial amounts of oil usually remain in a reservoir after primary and secondary recovery processes; therefore, there is an ongoing effort to sweep that remaining oil. Field optimization includes many techniques, and horizontal wells are one of the most motivating factors for field optimization. The selection of new horizontal wells must be accompanied by the right selection of well locations. However, modeling horizontal well locations by trial and error is time consuming. Therefore, an Artificial Neural Network (ANN) method has been employed to predict the optimum performance of proposed new well locations by incorporatin
For several applications, it is very important to have an edge detection technique that matches human visual contour perception and is less sensitive to noise. The edge detection algorithm described in this paper is based on the results obtained by Maximum a Posteriori (MAP) and Maximum Entropy (ME) deblurring algorithms. The technique makes a trade-off between sharpening and smoothing the noisy image. One advantage of the described algorithm is that it is less sensitive to noise than the Marr and Geuen techniques, which are considered among the best edge detection algorithms in terms of matching human visual contour perception.