The emphasis of Master Production Scheduling (MPS), or tactical planning, is on the temporal and spatial disaggregation of aggregate planning targets and forecasts, along with the provision and forecasting of the required resources. This procedure becomes considerably more difficult and slow as the number of resources, products, and periods considered increases. A number of studies have been carried out to understand these impediments and formulate algorithms to optimise the production planning problem, or more specifically the master production scheduling (MPS) problem. These algorithms include an evolutionary algorithm, the Genetic Algorithm (GA); two swarm-intelligence methodologies, the Gravitational Search Algorithm (GSA) and the Bat Algorithm (BAT); the Teaching-Learning-Based Optimisation algorithm (TLBO); and the Harmony Search algorithm (HS). This segment of the research consists of two parts: the first examines the outcomes of the five simulation algorithms and identifies the best of them, while the second compares that best algorithm with the search-algorithm work of 2009, discusses the applicability of the combined Gravitational Search and Bat Algorithm (GSA-BAT) to planning problems and the technique used for calculating the master production schedule, and closes with the results and recommendations for future studies.
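As a hedged illustration of how one of the named metaheuristics can be applied to MPS, the sketch below runs a plain Genetic Algorithm over a toy master production schedule. The chromosome encoding, demand figures, capacity limit, and penalty weights are all assumptions for illustration, not the study's actual formulation.

```python
import random

# A minimal sketch (all parameters are illustrative, not from the paper):
# a chromosome is a flat list of production quantities, one per
# (product, period) pair, and fitness penalizes deviation from demand.
PRODUCTS, PERIODS, POP, GENS = 3, 6, 40, 200
DEMAND = [[random.randint(50, 150) for _ in range(PERIODS)] for _ in range(PRODUCTS)]
CAPACITY = 350  # total units producible per period (assumed)

def fitness(chrom):
    """Lower is better: deviation from forecast plus capacity overruns."""
    cost = 0
    for t in range(PERIODS):
        period_total = 0
        for p in range(PRODUCTS):
            q = chrom[p * PERIODS + t]
            cost += abs(q - DEMAND[p][t])              # deviation from forecast
            period_total += q
        cost += 10 * max(0, period_total - CAPACITY)   # capacity penalty
    return cost

def crossover(a, b):
    cut = random.randrange(1, len(a))                  # single-point crossover
    return a[:cut] + b[cut:]

def mutate(chrom, rate=0.05):
    return [random.randint(0, 200) if random.random() < rate else g for g in chrom]

pop = [[random.randint(0, 200) for _ in range(PRODUCTS * PERIODS)] for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness)
    elite = pop[: POP // 2]                            # truncation selection
    pop = elite + [mutate(crossover(*random.sample(elite, 2)))
                   for _ in range(POP - len(elite))]

best = min(pop, key=fitness)
print("best fitness:", fitness(best))
```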
In recent years, the performance of Spatial Data Infrastructures (SDIs) for governments and companies has gained considerable attention. Different categories of geospatial data, such as digital maps, coordinates, web maps, and aerial and satellite images, are required to realize the geospatial data components of an SDI. In general, two distinct types of geospatial data sources exist over the Internet: formal and informal. Despite the growth of informal geospatial data sources, integration between the different free sources has not been achieved effectively; addressing this task can be considered the main contribution of this research. This article addresses the research question of how the …
Crime is considered an unlawful activity of all kinds, and it is punished by law. Crime affects a society's quality of life and economic development. With the large rise in crime globally, there is a need to analyze crime data to bring down the crime rate, which encourages the police and the public to take the required measures and restrict crimes more effectively. The purpose of this research is to develop predictive models that can aid in crime-pattern analysis and thus support the Boston police department's crime-prevention efforts. The geographical location factor has been adopted in our model because it is an influential factor in several situations, whether travelling to a specific area or living …
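As a rough sketch of a location-aware predictive model of this kind, the following trains a random-forest classifier over latitude, longitude, and time-of-occurrence features. The file name and column names mirror the publicly available Boston crime dataset but are assumptions here, as the paper's exact features and model are not given.

```python
# A minimal sketch of a location-aware crime classifier; the CSV file and
# column names are assumptions, not the paper's actual dataset or model.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

df = pd.read_csv("boston_crime.csv")          # hypothetical extract of the Boston data
features = df[["Lat", "Long", "HOUR", "DAY_OF_WEEK", "MONTH"]].copy()
features["DAY_OF_WEEK"] = features["DAY_OF_WEEK"].astype("category").cat.codes
target = df["OFFENSE_CODE_GROUP"]             # crime category to predict

X_train, X_test, y_train, y_test = train_test_split(
    features, target, test_size=0.2, random_state=42)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```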
Color image compression is a good way to encode digital images by decreasing the number of bits needed to store the image. The main objective is to reduce storage space, reduce transmission costs, and maintain good quality. In the current work, a simple, effective methodology is proposed for compressing color art digital images and obtaining a low bit rate: the matrix resulting from the scalar quantization process (reducing the pixel depth from 24 to 8 bits) is compressed using displacement coding, and the remainder is then compressed using the Lempel–Ziv–Welch (LZW) algorithm. The proposed methodology maintains the quality of the reconstructed image. Macroscopic and …
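A minimal sketch of the two stages the abstract names follows: scalar quantization from 24 to 8 bits per pixel, then LZW coding of the quantized bytes. The 3-3-2 RGB bit allocation and the plain LZW coder are assumptions standing in for the paper's exact scheme.

```python
import numpy as np

# Stage 1 (assumed bit allocation): 24-bit RGB -> one byte per pixel,
# 3 bits red, 3 bits green, 2 bits blue.
def quantize_332(rgb):
    r, g, b = rgb[..., 0] >> 5, rgb[..., 1] >> 5, rgb[..., 2] >> 6
    return (r << 5 | g << 2 | b).astype(np.uint8)

# Stage 2: plain LZW over the quantized byte stream; returns integer codes.
def lzw_encode(data):
    table = {bytes([i]): i for i in range(256)}
    w, out = b"", []
    for byte in data:
        wc = w + bytes([byte])
        if wc in table:
            w = wc
        else:
            out.append(table[w])
            table[wc] = len(table)   # grow the dictionary with the new phrase
            w = bytes([byte])
    if w:
        out.append(table[w])
    return out

image = np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8)  # stand-in image
codes = lzw_encode(quantize_332(image).tobytes())
print(f"{image.nbytes} raw bytes -> {len(codes)} LZW codes")
```

Quantization is the lossy step that fixes the 24-to-8-bit rate; LZW is lossless, so any further saving it achieves costs no additional quality.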
Estimating the semantic similarity between short texts plays an increasingly prominent role in many fields related to text mining and natural language processing, especially with the large daily increase in the volume of textual data produced. Traditional approaches that calculate the degree of similarity between two texts from the words they share do not perform well with short texts, because two similar texts may be written with different terms through the use of synonyms. As a result, short texts should be compared semantically. In this paper, a semantic similarity measurement method between texts is presented that combines knowledge-based and corpus-based semantic information to build a semantic network that represents …
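As a small illustration of mixing knowledge-based lexical information into a text-similarity score, the sketch below aggregates WordNet path similarities between word pairs via NLTK. The aggregation scheme is an assumption, not the paper's method; it only shows why synonym-aware comparison helps where word overlap fails.

```python
# A minimal sketch: knowledge-based word similarity (WordNet path similarity)
# aggregated into a symmetric text score. Requires: nltk.download("wordnet")
from nltk.corpus import wordnet as wn

def word_sim(w1, w2):
    """Best WordNet path similarity over all synset pairs; 0 if none exist."""
    scores = [s1.path_similarity(s2) or 0.0
              for s1 in wn.synsets(w1) for s2 in wn.synsets(w2)]
    return max(scores, default=0.0)

def text_sim(t1, t2):
    """For each word, take its best match in the other text; average both ways."""
    a, b = t1.lower().split(), t2.lower().split()
    s_ab = sum(max(word_sim(w, v) for v in b) for w in a) / len(a)
    s_ba = sum(max(word_sim(w, v) for v in a) for w in b) / len(b)
    return (s_ab + s_ba) / 2

# "doctor"/"physician" share no surface form but score highly via WordNet.
print(text_sim("the doctor examined the patient",
               "a physician checked the sick man"))
```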
Background: The risk of antibiotic resistance (AR) increases with the excessive use of antibiotics, whether driven by the healthcare provider or by the patients.
Objective: To assess the practice of self-medication with over-the-counter (OTC) drugs and other prescription drugs and its associated risk factors.
Subjects and Methods: Study design: A descriptive study was conducted from 20 December 2019 to 8 January 2021. A pre-validated, structured questionnaire, prepared in English and Urdu to avoid language barriers, covered personal details, the reasons for and sources of self-medication, and knowledge about over-the-counter drugs and antibiotics. The study sample was selected randomly.
Robots have become an essential part of modern industries in welding departments, increasing the accuracy and rate of production. Intelligent detection of the welding line's edges, so that the weld starts in the proper position, is very important. This work introduces a new approach that uses image processing to detect welding lines by tracking the edges of plates, at the required speed, with a three-degree-of-freedom robotic arm. The two algorithms in the developed approach are edge detection and the top-hat transformation. An adaptive neuro-fuzzy inference system (ANFIS) was used to choose the best forward and inverse kinematics of the robot. MIG welding at the end-effector was applied as a tool in this system, and the wel…
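A minimal sketch of the two image-processing steps the abstract names, using OpenCV: a morphological top-hat transform followed by edge detection, with a crude line fit as a stand-in for the seam estimate passed to the robot. The file name, kernel size, and Canny thresholds are assumptions.

```python
import cv2
import numpy as np

# Top-hat transform suppresses uneven illumination so the bright seam
# stands out against the plate; Canny then extracts its edges.
gray = cv2.imread("plate.png", cv2.IMREAD_GRAYSCALE)  # hypothetical plate image
assert gray is not None, "plate.png not found"

kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (15, 15))
tophat = cv2.morphologyEx(gray, cv2.MORPH_TOPHAT, kernel)

edges = cv2.Canny(tophat, 50, 150)                    # binary edge map
ys, xs = np.nonzero(edges)

# Fit a line through the detected edge pixels as a crude seam estimate
# for the robot controller to track.
if len(xs) > 1:
    slope, intercept = np.polyfit(xs, ys, 1)
    print(f"estimated seam: y = {slope:.3f} x + {intercept:.1f}")
```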
With regard to computer system security, intrusion detection systems are fundamental components for discriminating attacks at an early stage. They monitor and analyze network traffic, looking for abnormal behaviors or attack signatures in order to detect intrusions early. However, many challenges arise in developing a flexible and efficient network intrusion detection system (NIDS) that handles unforeseen attacks with a high detection rate. In this paper, a deep neural network (DNN) approach is proposed for an anomaly-detection NIDS. Dropout is the regularization technique used with the DNN model to reduce overfitting. The experiments were applied to the NSL-KDD dataset. A SoftMax output layer has been used with a cross-entropy loss function…
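A minimal sketch of the described architecture in Keras follows: a fully connected DNN with dropout for regularization, a softmax output layer, and a cross-entropy loss. The layer sizes, dropout rate, and the NSL-KDD feature and class counts used here are assumptions, and real dataset loading is omitted.

```python
import numpy as np
from tensorflow import keras

# Assumed shapes: ~122 features after one-hot encoding NSL-KDD, and
# 5 traffic classes (normal plus four attack families).
n_features, n_classes = 122, 5

model = keras.Sequential([
    keras.layers.Input(shape=(n_features,)),
    keras.layers.Dense(128, activation="relu"),
    keras.layers.Dropout(0.5),                 # dropout against overfitting
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dropout(0.5),
    keras.layers.Dense(n_classes, activation="softmax"),  # softmax output layer
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",     # cross-entropy loss
              metrics=["accuracy"])

# Stand-in data with the assumed shapes; replace with the real NSL-KDD arrays.
X = np.random.rand(1000, n_features).astype("float32")
y = np.random.randint(0, n_classes, 1000)
model.fit(X, y, epochs=3, batch_size=64, validation_split=0.2)
```

Dropout randomly zeroes activations during training only, which is what lets a network this wide fit NSL-KDD without memorizing it.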
The evolution of information technology and the use of computer systems have led to increased attention to the use of modern techniques in the auditing process, as they can overcome some human shortcomings in the exercise of professional judgment and thereby improve the efficiency and effectiveness of the audit. The new audit methodologies espouse the concept of risk, which includes a strategic dimension concerning the capacity of the entity to achieve its goals; this requires auditors to rely on advanced technology that can identify the factors preventing the entity from achieving its objectives. The idea of this research is to prepare an electronic program for all audit work, from planning through sampling and document…
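As a small, hypothetical illustration of one step such a program could automate, the sketch below draws a reproducible simple random sample of transactions for document testing; the population, identifiers, and sample size are invented for illustration.

```python
import random

# A reproducible simple random sample for audit document testing.
random.seed(42)                     # fixed seed so the selection can be re-audited
population = [f"INV-{i:05d}" for i in range(1, 5001)]   # hypothetical invoice IDs
sample_size = 60                    # e.g. set by the auditor's risk assessment
sample = random.sample(population, sample_size)
print(sorted(sample)[:10])
```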