Scheduling timetables for courses in large university departments is a hard problem that has been tackled by many previous works, although the results are often only partially optimal. This work applies an evolutionary algorithm, based on genetic principles, to the timetabling problem in order to generate randomized, fully optimized timetables, with the ability to produce multiple alternative timetables for each stage in the college. The main idea is to generate course timetables automatically while exploring the constraint space, yielding an optimal and flexible schedule with no redundancy through modification of a viable course timetable. The main contribution of this work is increased flexibility: the system generates several copies of optimized timetable schedules, raising the probability of finding the best schedule for each stage on campus, and allows a timetable to be replaced when needed. The evolutionary algorithm (EA) used in this paper is the genetic algorithm (GA), a popular population-based, multi-solution metaheuristic search that can be applied to complex combinatorial problems such as timetabling. In this work, all inputs (courses, teachers, and time slots) are represented by a single array to enable local search, and this representation is combined with a heuristic crossover to ensure that the essential constraints are not violated. The result is a flexible scheduling system that exhibits the diversity of all possible timetables that can be created according to user conditions and needs.
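The GA-with-heuristic-crossover idea above can be sketched as follows. This is a minimal illustration, not the paper's actual encoding: the `COURSES`, `TEACHERS`, and `SLOTS` data, the clash-counting fitness, and all parameter values are hypothetical assumptions.

```python
import random

# Hypothetical toy data; the paper's real courses, teachers, and slots differ.
COURSES = ["C1", "C2", "C3", "C4"]
TEACHERS = {"C1": "T1", "C2": "T1", "C3": "T2", "C4": "T2"}
SLOTS = [0, 1, 2]

def random_timetable():
    # One chromosome: a time-slot assignment for every course.
    return {c: random.choice(SLOTS) for c in COURSES}

def clashes(tt):
    # Hard constraint: a teacher cannot teach two courses in the same slot.
    seen, n = set(), 0
    for c, s in tt.items():
        key = (TEACHERS[c], s)
        n += key in seen
        seen.add(key)
    return n

def crossover(a, b):
    # Heuristic crossover: for each course, keep the parent's gene that
    # adds fewer clashes to the partially built child, so the essential
    # constraints tend not to be broken.
    child = {}
    for c in COURSES:
        ca, cb = dict(child), dict(child)
        ca[c], cb[c] = a[c], b[c]
        child[c] = a[c] if clashes(ca) <= clashes(cb) else b[c]
    return child

def evolve(pop_size=20, generations=200):
    pop = [random_timetable() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=clashes)
        if clashes(pop[0]) == 0:
            break                       # a clash-free timetable was found
        parents = pop[: pop_size // 2]
        children = [crossover(random.choice(parents), random.choice(parents))
                    for _ in range(pop_size - len(parents))]
        for ch in children:             # mutation keeps the population diverse,
            if random.random() < 0.2:   # enabling alternative timetables
                ch[random.choice(COURSES)] = random.choice(SLOTS)
        pop = parents + children
    return min(pop, key=clashes)
```

Because the search is randomized, repeated calls to `evolve()` can yield different clash-free timetables, which mirrors the multi-solution behaviour described above.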
The main aim of this paper is to use the notion introduced in [1] to offer new classes of separation axioms in ideal spaces. We also offer new types of notions of convergence in ideal spaces via this set, and we explain the relations among the several types of separation axioms offered.
The estimation of the reliability function depends on the accuracy of the data used to estimate the parameters of the probability distribution. Because some data sets are skewed, estimating the parameters and calculating the reliability function in the presence of skewness requires a distribution flexible enough to handle such data. In the data of Diyala Company for Electrical Industries, a positive skew was observed in the data collected from the Power and Machinery Department, which required a distribution suited to those data and a search for methods that accommodate this problem and lead to accurate estimates of the reliability function.
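To make the reliability function concrete: for a fitted lifetime distribution, reliability is the survival probability R(t) = 1 - F(t). The abstract does not name its distribution, so as an illustrative assumption only, the sketch below uses a two-parameter Weibull model, a common choice for positively skewed lifetime data.

```python
import math

def weibull_reliability(t, shape_k, scale_lam):
    # Reliability (survival) function of a Weibull(k, lambda) lifetime:
    # R(t) = exp(-(t / lambda)^k).  Illustrative only: the paper's
    # actual distribution and fitted parameters are not given here.
    return math.exp(-((t / scale_lam) ** shape_k))
```

With shape k = 1 this reduces to the exponential case, R(t) = exp(-t/lambda).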
This study applies non-parametric methods to estimate the conditional survival function with the Beran estimator, using both Nadaraya-Watson and Priestley-Chao weights, on interval-censored and right-censored breast cancer data under two types of treatment, chemotherapy and radiotherapy, with age treated as a continuous variable. The computations were carried out in MATLAB, and the mean squared error (MSE) was used to compare the weights. The results showed that the Nadaraya-Watson weights were superior in estimating the conditional survival function for both chemotherapy and radiotherapy.
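The Beran estimator with Nadaraya-Watson weights can be sketched as follows. This is a generic sketch (Gaussian kernel, hypothetical data), not the study's MATLAB implementation: the weight w_i(x) is the kernel smoother K((x - x_i)/h) normalized over the sample, and the survival estimate is a weighted Kaplan-Meier-style product over uncensored event times.

```python
import math

def gaussian_kernel(u):
    return math.exp(-0.5 * u * u) / math.sqrt(2 * math.pi)

def nadaraya_watson_weights(x0, xs, h):
    # w_i(x0) = K((x0 - x_i)/h) / sum_j K((x0 - x_j)/h)
    raw = [gaussian_kernel((x0 - xi) / h) for xi in xs]
    total = sum(raw)
    return [r / total for r in raw]

def beran_survival(t, x0, times, deltas, xs, h):
    # Beran estimator of S(t | x0): product over uncensored times T_i <= t of
    # (1 - w_i / sum_{j: T_j >= T_i} w_j), with delta_i = 1 for events
    # and 0 for right-censored observations.
    w = nadaraya_watson_weights(x0, xs, h)
    surv = 1.0
    for i, (ti, di) in enumerate(zip(times, deltas)):
        if ti <= t and di == 1:
            at_risk = sum(w[j] for j in range(len(times)) if times[j] >= ti)
            surv *= 1.0 - w[i] / at_risk
    return surv
```

When all covariate values coincide, the weights are uniform and the estimator reduces to the ordinary Kaplan-Meier product, which is a useful sanity check.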
A condensed study was carried out to compare ordinary estimators, in particular the maximum likelihood estimator and a robust estimator, for the parameters of the first-order mixed model, namely the ARMA(1,1) model.
A simulation study was performed for several variants of the model using small, moderate, and large sample sizes, and some new results were obtained. The MAPE was used as the statistical criterion for comparison.
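The comparison criterion can be stated precisely: MAPE is the mean of |actual - predicted| / |actual|, usually reported in percent. A minimal sketch (the data values are hypothetical):

```python
def mape(actual, predicted):
    # Mean Absolute Percentage Error, in percent.
    # Actual values must be nonzero, or the ratio is undefined.
    n = len(actual)
    return 100.0 / n * sum(abs((a - p) / a) for a, p in zip(actual, predicted))
```

For example, predictions of 110 and 180 against actual values 100 and 200 are each 10% off, giving a MAPE of 10%.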
Many key stream generators used in practice are LFSR-based, in the sense that they produce the key stream according to a rule y = C(L(x)), where L(x) denotes an internal linear bit stream produced by a small number of parallel linear feedback shift registers (LFSRs), and C denotes some nonlinear compression function. In this paper we combine the output sequences of the linear feedback shift registers with the sequences produced by a nonlinear key generator to obtain a very strong final key sequence.
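A minimal sketch of the two ingredients named above, an LFSR producing the linear stream L(x) and a nonlinear combiner playing the role of C. The tap positions and the particular combining function below are illustrative assumptions, not the paper's actual design.

```python
def lfsr_stream(state, taps, n):
    # Fibonacci LFSR: 'state' is a list of bits; each step outputs the last
    # bit, XORs the tapped bits into a feedback bit, and shifts it in.
    out, state = [], list(state)
    for _ in range(n):
        out.append(state[-1])
        fb = 0
        for t in taps:
            fb ^= state[t]
        state = [fb] + state[:-1]
    return out

def nonlinear_combine(a, b):
    # An illustrative nonlinear compression C: (a AND b) XOR a, applied
    # bitwise to two streams.  The paper's actual combiner differs.
    return [(x & y) ^ x for x, y in zip(a, b)]
```

With taps chosen from a primitive feedback polynomial, a 4-bit register cycles through all 15 nonzero states, so its output stream has the maximal period 2^4 - 1 = 15.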
The effect of varying the concentration in mixtures of methane with argon and neon gas is studied through the change in the electron energy distribution function and, consequently, the change in the electron transport parameters, including the drift velocity, mean energy, characteristic energy, and diffusion coefficient. In the present work, a recently developed computer simulation program known as Bolsig+ is used to calculate the electron transport parameters.
In this paper, we used four classification methods to classify objects and compared these methods: K-Nearest Neighbors (KNN), Stochastic Gradient Descent learning (SGD), Logistic Regression (LR), and the Multi-Layer Perceptron (MLP). We used the MS COCO dataset for classifying and detecting objects; the dataset images were randomly divided into training and testing sets at a ratio of 7:3. The randomly selected training and testing images were converted from color to gray level, the gray images were then enhanced using histogram equalization, and each image was resized to 20 x 20. Principal component analysis (PCA) was used for feature extraction, and finally the four classification methods were applied.
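Two of the steps above, the 7:3 random split and the KNN classifier, can be sketched in a few lines. The feature vectors and labels below are hypothetical toy stand-ins for the PCA-reduced image features, not data from the paper.

```python
import math
import random

def split_70_30(items, seed=0):
    # The 7:3 random train/test split described above.
    items = list(items)
    random.Random(seed).shuffle(items)
    cut = int(0.7 * len(items))
    return items[:cut], items[cut:]

def knn_predict(train, labels, query, k=3):
    # Plain k-nearest-neighbours: majority vote among the k training
    # vectors closest to the query under Euclidean distance.
    order = sorted(range(len(train)), key=lambda i: math.dist(train[i], query))
    votes = [labels[i] for i in order[:k]]
    return max(set(votes), key=votes.count)

# Hypothetical toy features standing in for PCA-reduced 20x20 images.
features = [[0.0, 0.0], [0.1, 0.2], [0.9, 1.0], [1.0, 0.8]]
labels = ["cat", "cat", "dog", "dog"]
```

A query near one cluster is assigned that cluster's label by majority vote among its three nearest neighbours.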
An experimental study was conducted to measure the surface roughness produced by the magnetic abrasive finishing (MAF) technique on a brass plate, a material that is very difficult to polish by conventional machining, where the cost is high and the surface is much more susceptible to damage than in other materials. Four operating parameters were studied: the gap between the workpiece and the electromagnetic inductor, the current that generates the flux, the rotational spindle speed, and the abrasive powder size, with a constant linear feed movement between the machine head and the workpiece. An adaptive neuro-fuzzy inference system (ANFIS) was implemented for the evaluation.
Essential approaches involving photons are among the most common uses of parallel optical computation due to their recent development, ease of production, and low cost; as a result, most researchers have concentrated their efforts on them. The Basic Arithmetic Unit (BAU) is built using a three-step approach that uses three-state optical gates to configure the circuitry for addition, subtraction, and multiplication. This is a new optical computing method based on a radix-2 number system with signed digits, the binary signed-digit (BSD) system, which includes the digits -1, 0, and 1. Light with horizontal polarization (LHP) (↔), light with no intensity (LNI) (⥀), and light with vertical polarization (LVP) (↨) represent these three states.
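The radix-2 signed-digit system above is redundant: the same integer can have several BSD representations, which is what makes carry-limited optical arithmetic possible. A minimal sketch of evaluating a BSD digit string (least-significant digit first; the helper name is illustrative):

```python
def bsd_value(digits):
    # Value of a radix-2 binary signed-digit (BSD) number.
    # Digits are drawn from {-1, 0, 1}, least-significant first:
    # value = sum_i d_i * 2^i.
    assert all(d in (-1, 0, 1) for d in digits)
    return sum(d * (2 ** i) for i, d in enumerate(digits))
```

For example, both [1, 0, 0] and [-1, 1, 0] denote the integer 1 (1 = -1 + 2), illustrating the redundancy of the representation.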
Efficient sequencing techniques have significantly increased the number of genomes now available, including the genome of the crenarchaeon Sulfolobus solfataricus P2. The genome-scale metabolic pathways of Sulfolobus solfataricus P2 were predicted by running the "Pathway Tools" software with the MetaCyc database as the reference knowledge base, and a Pathway/Genome Database (PGDB) specific to Sulfolobus solfataricus P2 was created. A curation approach was carried out for all the amino acid biosynthetic pathways: experimental literature as well as homology-, orthology-, and context-based protein function prediction methods were followed in the curation process. The "PathoLogic" component was used for the initial pathway prediction.