Scheduling timetables for courses in large university departments is a very hard problem that has often been tackled in previous work, although the results are usually only partially optimal. This work applies the principle of an evolutionary algorithm, using genetic theory, to solve the timetabling problem and obtain a randomized, fully optimal timetable, with the ability to generate multiple alternative timetables for each stage in the college. The main idea is to generate course timetables automatically while exploring the constraint space, yielding an optimal and flexible schedule with no redundancy through variation of a viable course timetable. The main contribution of this work is increased flexibility in generating optimal timetable schedules in different copies, which raises the probability of producing the best schedule for each stage on campus and allows a timetable to be replaced when needed. The Evolutionary Algorithm (EA) utilized in this paper is the Genetic Algorithm (GA), a common multi-solution, population-based metaheuristic search that can be applied to complex combinatorial problems such as timetabling. In this work, all inputs (courses, teachers, and times) are represented by a single array to enable local search, and this representation of the timetable is recombined using a heuristic crossover that ensures the essential constraints are not broken. The result of this work is a flexible scheduling system that exhibits the diversity of all possible timetables that can be created depending on user conditions and needs.
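A minimal GA sketch of this encoding, assuming a toy course/teacher list, a clash-counting fitness, and a conflict-aware crossover (the paper's exact data, fitness weights, and operators are not given in the abstract):

```python
import random

# Toy inputs: (course, teacher) pairs and a pool of time slots -- illustrative only.
COURSES = [("Math", "T1"), ("Physics", "T1"), ("Chem", "T2"),
           ("Bio", "T2"), ("CS", "T3"), ("Stats", "T3")]
N_SLOTS, POP_SIZE, GENERATIONS = 4, 30, 200

def fitness(chrom):
    """Negative count of hard-constraint violations (same teacher, same slot)."""
    clashes = sum(1 for i in range(len(chrom)) for j in range(i + 1, len(chrom))
                  if chrom[i] == chrom[j] and COURSES[i][1] == COURSES[j][1])
    return -clashes          # 0 means a feasible timetable

def heuristic_crossover(a, b):
    """Per course, keep the parent's slot that clashes less with genes placed so far."""
    child = []
    for i in range(len(a)):
        def clashes_if(slot):
            return sum(1 for j, s in enumerate(child)
                       if s == slot and COURSES[j][1] == COURSES[i][1])
        child.append(min((a[i], b[i]), key=clashes_if))
    return child

def mutate(chrom, rate=0.1):
    return [random.randrange(N_SLOTS) if random.random() < rate else g for g in chrom]

population = [[random.randrange(N_SLOTS) for _ in COURSES] for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)
    if fitness(population[0]) == 0:          # feasible timetable found
        break
    parents = population[:POP_SIZE // 2]     # elitist selection
    children = [mutate(heuristic_crossover(*random.sample(parents, 2)))
                for _ in range(POP_SIZE - len(parents))]
    population = parents + children

best = max(population, key=fitness)
print({COURSES[i][0]: f"slot {best[i]}" for i in range(len(best))})
```

Re-running the loop from different random seeds yields the multiple alternative timetables the abstract describes.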
The basic concepts of near open subgraphs and of near rough, near exact, and near fuzzy graphs are introduced and sufficiently illustrated. The Gm-closure space induced by closure operators is used to generalize the basic rough graph concepts. We introduce near exactness and near roughness by applying the near concepts to improve the accuracy of the definability of graphs. We give a new definition of a membership function to find near interior, near boundary, and near exterior vertices. Moreover, proved results, examples, and counterexamples are provided. The Gm-closure structure suggested in this paper opens the way to applying a rich body of topological facts and methods in the process of granular computing.
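For reference, these are the classical rough-set approximations that such closure-based structures generalize (standard definitions, not the paper's Gm-notation): lower and upper approximations of a vertex set under an equivalence relation, and the boundary that separates exact from rough sets.

```latex
% Classical rough-set approximations of X \subseteq U under an equivalence
% relation R (standard definitions; the paper replaces [x]_R by closure
% operators on graphs).
\[
\underline{R}(X) = \{\, x \in U : [x]_R \subseteq X \,\}, \qquad
\overline{R}(X) = \{\, x \in U : [x]_R \cap X \neq \emptyset \,\},
\]
\[
\mathrm{BND}_R(X) = \overline{R}(X) \setminus \underline{R}(X), \qquad
X \text{ is exact iff } \mathrm{BND}_R(X) = \emptyset,\ \text{rough otherwise}.
\]
```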
This paper deals with the modeling of a preventive maintenance strategy applied to a single-unit system subject to random failures. According to this policy, the system undergoes imperfect periodic preventive maintenance that restores it to 'as good as new' with probability p and leaves it in the state 'as bad as old' with probability q. Imperfect repairs are performed following failures occurring between consecutive preventive maintenance actions, i.e., the times between failures follow a decreasing quasi-renewal process with parameter a. Considering the average durations of the preventive and corrective maintenance actions, a …
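A Monte Carlo sketch of this policy, assuming an exponential base failure law and illustrative values for the PM period T, the quasi-renewal parameter a, and the probability p (none of these figures come from the paper):

```python
import random

def simulate(horizon=1000.0, T=50.0, a=0.9, p=0.7, mean_x1=60.0):
    """One sample path: quasi-renewal failures + imperfect periodic PM."""
    t, k = 0.0, 0                     # k = degradation index (number of repairs)
    failures, pms, next_pm = 0, 0, T
    while t < horizon:
        # decreasing quasi-renewal gap: X_n = a**(n-1) * X_1, here exponential X_1
        x = (a ** k) * random.expovariate(1.0 / mean_x1)
        if t + x < next_pm:           # failure occurs before the next PM
            t += x
            k += 1                    # imperfect repair: unit is worse than before
            failures += 1
        else:                         # periodic PM action reached first
            t, next_pm, pms = next_pm, next_pm + T, pms + 1
            if random.random() < p:   # 'as good as new' with probability p
                k = 0                 # otherwise 'as bad as old': keep k unchanged
    return failures, pms

runs = [simulate()[0] for _ in range(2000)]
print("mean number of failures over the horizon:", sum(runs) / len(runs))
```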
This research applies non-parametric methods, namely the Turnbull method and a generalization of Turnbull's method, to estimate the conditional survival function, using interval-censored breast cancer data with two types of treatment, chemotherapy and radiotherapy, and with age as a continuous variable. The estimation algorithms were implemented in MATLAB, and the mean square error (MSE) was then used as a measure for comparing the estimates. The results showed the advantage of the generalization of Turnbull's method in estimating the conditional survival function for both treatments; the estimated survival of the patients does not show very large differences.
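A sketch of the self-consistency (EM) iteration underlying Turnbull-type estimators for interval-censored data; it uses the pooled interval endpoints as the support grid rather than the exact Turnbull innermost intervals, and the toy data are placeholders:

```python
import numpy as np

def turnbull(L, R, n_iter=500):
    """Self-consistency iteration for interval-censored observations (L_i, R_i]."""
    L, R = np.asarray(L, float), np.asarray(R, float)
    s = np.unique(np.concatenate([L, R]))            # candidate support points
    A = (s[None, :] > L[:, None]) & (s[None, :] <= R[:, None])
    p = np.full(s.size, 1.0 / s.size)                # initial uniform mass
    for _ in range(n_iter):
        denom = A @ p                                # mass in each obs. interval
        p = (A * p).T @ (1.0 / denom) / len(L)       # expected mass per point
    return s, 1.0 - np.cumsum(p)                     # S(t) just after each s_j

# toy interval-censored follow-up times (e.g. months); illustrative only
L = [0, 2, 4, 6, 1]
R = [3, 5, 8, 10, 4]
for t, st in zip(*turnbull(L, R)):
    print(f"S({t:g}) = {st:.3f}")
```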
The main aim of this paper is to use the notion introduced in [1] to offer new classes of separation axioms in ideal spaces. We also offer new types of notions of convergence in ideal spaces via this set. Relations among the several types of separation axioms offered are explained.
The estimation of the reliability function depends on the accuracy of the data used to estimate the parameters of the probability distribution. Because some data sets are skewed, estimating the parameters and calculating the reliability function in the presence of such skewness requires a distribution flexible enough to deal with the data. This is the case for the data of Diyala Company for Electrical Industries: a positive skewness was observed in the data collected from the Power and Machinery Department, which calls for a distribution that handles such data and for methods that accommodate this problem and lead to accurate estimates of the reliability function.
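As one illustration of the idea (the paper's chosen distribution is not named in this excerpt), a Weibull fit is a common flexible choice for right-skewed lifetime data, with the reliability function read off as R(t) = P(T > t); the data here are synthetic:

```python
import numpy as np
from scipy import stats

# synthetic right-skewed failure times standing in for the company data
times = stats.weibull_min.rvs(c=1.8, scale=120.0, size=200, random_state=0)

# maximum likelihood fit with the location fixed at zero
c, loc, scale = stats.weibull_min.fit(times, floc=0)

# reliability function R(t) = P(T > t) = survival function of the fitted model
R = lambda t: stats.weibull_min.sf(t, c, loc=loc, scale=scale)
print(f"shape={c:.2f}, scale={scale:.1f}, R(100)={R(100):.3f}")
```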
A condensed study was conducted to compare ordinary estimators, in particular the maximum likelihood estimator, with a robust estimator for estimating the parameters of the mixed model of order one, namely the ARMA(1,1) model. A simulation study was carried out for varieties of the model using small, moderate, and large sample sizes, and some new results were obtained. MAPE was used as the statistical criterion for comparison.
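A sketch of such a simulation protocol, assuming illustrative true parameters and statsmodels' maximum likelihood fit (the paper's robust estimator is not specified in this excerpt, so only the MLE side is shown):

```python
import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess
from statsmodels.tsa.arima.model import ARIMA

true_phi, true_theta = 0.6, 0.3                      # assumed ARMA(1,1) parameters
process = ArmaProcess(ar=[1, -true_phi], ma=[1, true_theta])

for n in (30, 100, 500):                             # small, moderate, large samples
    mape = []
    for rep in range(20):
        y = process.generate_sample(nsample=n)
        res = ARIMA(y, order=(1, 0, 1), trend="n").fit()   # maximum likelihood
        phi_hat, theta_hat = res.arparams[0], res.maparams[0]
        mape.append(np.mean([abs((phi_hat - true_phi) / true_phi),
                             abs((theta_hat - true_theta) / true_theta)]))
    print(f"n={n}: MAPE={np.mean(mape):.3f}")
```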
Many of the key stream generators used in practice are LFSR-based in the sense that they produce the key stream according to a rule y = C(L(x)), where L(x) denotes an internal linear bit stream, produced by a small number of parallel linear feedback shift registers (LFSRs), and C denotes some nonlinear compression function. In this paper we combine the output sequences of the linear feedback shift registers with the sequences from a nonlinear key generator to obtain a final, very strong key sequence.
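A minimal sketch of the y = C(L(x)) construction: three small LFSRs run in parallel and a nonlinear compression function C, here the classic majority combiner; register lengths, taps, and seeds are illustrative, not the paper's design:

```python
def lfsr(state, taps):
    """Fibonacci LFSR: yields one output bit per step."""
    while True:
        yield state[-1]                      # output bit
        fb = 0
        for t in taps:                       # feedback = XOR of tapped cells
            fb ^= state[t]
        state = [fb] + state[:-1]            # shift feedback in at the front

# three parallel registers of coprime-ish lengths (illustrative seeds/taps)
r1 = lfsr([1, 0, 1, 1, 0], taps=[0, 2])                 # 5-bit register
r2 = lfsr([0, 1, 1, 0, 1, 1, 0], taps=[0, 1])           # 7-bit register
r3 = lfsr([1, 1, 0, 1, 0, 0, 1, 0, 1], taps=[0, 4])     # 9-bit register

def keystream(n):
    """Nonlinear compression C = majority(a, b, c) over the linear streams."""
    for _ in range(n):
        a, b, c = next(r1), next(r2), next(r3)
        yield (a & b) ^ (b & c) ^ (a & c)

print("".join(str(bit) for bit in keystream(32)))
```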
Varying the concentrations in mixtures of methane with argon and neon gas is used to study the change in the electron energy distribution function and, in turn, the change in the electron transport parameters, including the drift velocity, the mean energy, the characteristic energy, and the diffusion coefficient. In the present work, a recently developed computer simulation program known as Bolsig+ is used to calculate the electron transport parameters.
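For context, these are the standard two-term Boltzmann expressions for the transport parameters as functions of the EEDF, in the Hagelaar-Pitchford formulation that Bolsig+ implements (notation may differ from the paper):

```latex
% Two-term expressions evaluated by Bolsig+; F_0 is the EEDF normalized so
% that \int_0^\infty \sqrt{\varepsilon}\, F_0\, d\varepsilon = 1, with
% \gamma = (2e/m_e)^{1/2} and \tilde\sigma_m the effective momentum-transfer
% cross section.
\[
\langle\varepsilon\rangle
  = \int_0^\infty \varepsilon^{3/2} F_0 \, d\varepsilon,
\qquad
w = -\frac{\gamma}{3}\,\frac{E}{N}\int_0^\infty
      \frac{\varepsilon}{\tilde\sigma_m}\,
      \frac{\partial F_0}{\partial \varepsilon}\, d\varepsilon,
\]
\[
D N = \frac{\gamma}{3}\int_0^\infty
        \frac{\varepsilon}{\tilde\sigma_m}\, F_0 \, d\varepsilon,
\qquad
\varepsilon_k = \frac{e D}{\mu}
\quad\text{(characteristic energy).}
\]
```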
In this paper, we use four classification methods to classify objects and compare among them: K-Nearest Neighbors (KNN), Stochastic Gradient Descent learning (SGD), Logistic Regression (LR), and Multi-Layer Perceptron (MLP). We used the MS COCO dataset for classifying and detecting the objects; the dataset images were randomly divided into training and testing sets at a ratio of 7:3, respectively. The randomly selected training and testing images were converted from color to gray level, the gray images were then enhanced using the histogram equalization method, and each image was resized to 20 x 20. Principal component analysis (PCA) was used for feature extraction, and finally the four classification methods were applied.
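A sketch of this pipeline with scikit-learn and scikit-image on placeholder arrays (loading MS COCO itself and the detection step are omitted; the PCA component count and classifier settings are assumptions):

```python
import numpy as np
from skimage.color import rgb2gray
from skimage.exposure import equalize_hist
from skimage.transform import resize
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import SGDClassifier, LogisticRegression
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
images = rng.random((200, 64, 64, 3))          # stand-in for COCO image crops
labels = rng.integers(0, 4, size=200)          # stand-in class ids

def preprocess(img):
    g = equalize_hist(rgb2gray(img))           # gray level + histogram equalization
    return resize(g, (20, 20)).ravel()         # resize to 20x20 -> 400-dim vector

X = np.array([preprocess(im) for im in images])
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.3,
                                          random_state=0)   # 7:3 random split

pca = PCA(n_components=50).fit(X_tr)           # feature extraction
X_tr, X_te = pca.transform(X_tr), pca.transform(X_te)

for name, clf in [("KNN", KNeighborsClassifier()),
                  ("SGD", SGDClassifier()),
                  ("LR",  LogisticRegression(max_iter=1000)),
                  ("MLP", MLPClassifier(max_iter=500))]:
    clf.fit(X_tr, y_tr)
    print(name, clf.score(X_te, y_te))
```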
This study applies non-parametric methods to estimate the conditional survival function via the Beran method, using both the Nadaraya-Watson and the Priestley-Chao weights, with interval-censored and right-censored breast cancer data and two types of treatment, chemotherapy and radiotherapy, treating age as a continuous variable. The estimators were implemented in MATLAB, and the MSE was used to compare the weights. The results showed the superiority of the Nadaraya-Watson weight in estimating the conditional survival function for both chemotherapy and radiotherapy.
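For reference, Beran's conditional product-limit estimator with Nadaraya-Watson weights takes the following standard form for right-censored data (the Priestley-Chao variant replaces the weight definition; notation may differ from the paper):

```latex
% Beran's conditional product-limit (conditional Kaplan--Meier) estimator
% with Nadaraya--Watson kernel weights; K is a kernel, h a bandwidth.
\[
W_i(x) = \frac{K\!\left((x - X_i)/h\right)}
              {\sum_{j=1}^{n} K\!\left((x - X_j)/h\right)},
\qquad
\hat S(t \mid x) = \prod_{i:\, T_{(i)} \le t}
  \left( 1 - \frac{W_{(i)}(x)}{1 - \sum_{j < i} W_{(j)}(x)} \right)^{\delta_{(i)}},
\]
where $T_{(1)} \le \dots \le T_{(n)}$ are the ordered observed times and
$\delta_{(i)}$ the corresponding censoring indicators.
```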