The objective of this study is to determine which has the better predictive ability, the logistic regression model or the linear discriminant function, first using the original data and then using the principal components to reduce the dimensionality of the variables. The data come from a socio-economic survey of families in Baghdad province in 2012 and comprise a sample of 615 observations with 13 variables, 12 of which are explanatory variables; the dependent variable is the number of workers and unemployed.
A comparison of the two methods was conducted, and it showed that the logistic regression model is better than the linear discriminant function.
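The comparison described above can be sketched as follows. This is a minimal illustration using scikit-learn on synthetic data, since the study's survey data are not available; the sample size and number of explanatory variables mirror those stated in the abstract, but everything else (test split, accuracy metric) is an assumption.

```python
# Hedged sketch: comparing the predictive accuracy of logistic regression
# and linear discriminant analysis. make_classification stands in for the
# Baghdad survey data, which is not available here.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# 615 observations with 12 explanatory variables, mirroring the abstract
X, y = make_classification(n_samples=615, n_features=12, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

lr = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
lda = LinearDiscriminantAnalysis().fit(X_tr, y_tr)

acc_lr = accuracy_score(y_te, lr.predict(X_te))
acc_lda = accuracy_score(y_te, lda.predict(X_te))
print(f"logistic regression: {acc_lr:.3f}, LDA: {acc_lda:.3f}")
```

On real data the same pattern applies: fit both models on a training split and compare held-out classification accuracy.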
Modern systems based on hash functions are more suitable than conventional systems; however, the complicated algorithms for generating invertible functions are highly time-consuming. Using genetic algorithms (GAs), the key strength is enhanced, ultimately making the entire algorithm sufficient. Initially, key generation is performed using the solution of the n-queens problem obtained by a genetic algorithm, with a random number generator and the GA operations. Ultimately, the data is encrypted using the Modified Reverse Encryption Algorithm (MREA). It was noticed that the
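The key-generation step can be sketched as below: a simple genetic algorithm solves the n-queens problem and the solution permutation is turned into key bytes. The fitness function, GA parameters, and key-derivation step are illustrative assumptions, not the paper's exact construction, and MREA itself is not reproduced.

```python
# Hedged sketch: solve n-queens with a genetic algorithm and derive key
# bytes from the solution (illustrative parameters, not the paper's).
import random

N = 8  # board size / key length

def fitness(board):
    # count non-attacking queen pairs; permutation encoding means
    # only diagonal attacks are possible
    clashes = sum(1 for i in range(N) for j in range(i + 1, N)
                  if abs(board[i] - board[j]) == j - i)
    return (N * (N - 1)) // 2 - clashes

def crossover(a, b):
    # order crossover that keeps the child a valid permutation
    cut = random.randrange(1, N)
    return a[:cut] + [g for g in b if g not in a[:cut]]

def mutate(board):
    i, j = random.sample(range(N), 2)
    board[i], board[j] = board[j], board[i]

def solve_nqueens(pop_size=100, generations=2000, seed=42):
    random.seed(seed)
    pop = [random.sample(range(N), N) for _ in range(pop_size)]
    perfect = (N * (N - 1)) // 2
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        if fitness(pop[0]) == perfect:
            return pop[0]
        survivors = pop[:pop_size // 2]
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = random.sample(survivors, 2)
            c = crossover(a, b)
            if random.random() < 0.3:
                mutate(c)
            children.append(c)
        pop = survivors + children
    return max(pop, key=fitness)

solution = solve_nqueens()
key = bytes(solution)  # one byte per queen position
print(solution, fitness(solution))
```

A real key-derivation step would typically pass these bytes through a hash or key-derivation function rather than use them directly.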
Background: This study aimed to determine the cephalometric values of tetragon analysis in a sample of Iraqi adults with normal occlusion. Materials and methods: Forty digital true lateral cephalometric radiographs, belonging to 20 males and 20 females with normal dental relations, were analyzed using the AutoCAD 2009 program. Descriptive statistics and a comparison of the sample with Fastlicht's norms were obtained. Results: The maxillary and mandibular incisors were more proclined and the maxillary/mandibular planes angle was lower in the Iraqi sample than in the Caucasian sample. Conclusion: It is recommended to use the results of this study when applying tetragon analysis to Iraqis, to obtain more accurate results.
Abstract
This research compares two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the Downhill Simplex algorithm. Two data cases were considered: the first assumed the original (uncontaminated) data, while the second assumed data contamination. Simulation experiments were conducted for different sample sizes, initial parameter values, and levels of contamination. The Downhill Simplex algorithm was found to be the best method for estimating the parameters, the probability function, and the reliability function of the compound distribution in the cases of both natural and contaminated data.
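The estimation idea can be sketched as follows: minimize the negative log-likelihood with the Downhill Simplex (Nelder-Mead) method. A two-parameter Weibull stands in for the four-parameter compound exponential Weibull-Poisson distribution, whose density the abstract does not give; the sample size and starting point are assumptions.

```python
# Hedged sketch: parameter estimation by Nelder-Mead (Downhill Simplex)
# on a negative log-likelihood. Weibull stands in for the paper's
# four-parameter compound distribution.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

data = weibull_min.rvs(c=1.5, scale=2.0, size=500, random_state=0)

def neg_log_lik(params):
    shape, scale = params
    if shape <= 0 or scale <= 0:   # keep the simplex in the valid region
        return np.inf
    return -np.sum(weibull_min.logpdf(data, c=shape, scale=scale))

res = minimize(neg_log_lik, x0=[1.0, 1.0], method="Nelder-Mead")
shape_hat, scale_hat = res.x
print(res.x)
```

Contamination studies like the one described would repeat this fit after replacing a fraction of `data` with outliers and compare the resulting estimates.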
Suppose that G is a finite group and S is a non-empty subset of G such that S = S⁻¹ and the identity element is not in S. The Cayley graph Cay(G, S) is the graph whose vertices are the elements of G, with two vertices x and y adjacent if and only if xy⁻¹ ∈ S. In this paper, we introduce a generalized Cayley graph whose vertex set consists of all column matrices with all components in G; adjacency is defined by a condition involving the column matrix whose entries are the inverses of the corresponding entries of a vertex, its transpose, and a matrix with all entries in S. We clarify some basic properties of the new graph and determine its structure when the underlying graph is the complete graph, the complete bipartite graph, and the complete
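The classical construction that the paper generalizes can be sketched concretely. The example below builds Cay(G, S) for the additive group Z_n, where the adjacency condition xy⁻¹ ∈ S becomes (x − y) mod n ∈ S; the choice of group and connection set is illustrative, and the paper's matrix-vertex generalization is not reproduced.

```python
# Hedged sketch: the classical Cayley graph Cay(Z_n, S).
# S must be symmetric (closed under inverses) and exclude the identity 0.
def cayley_graph_zn(n, S):
    assert 0 not in S and all((-s) % n in S for s in S)
    vertices = list(range(n))
    edges = {frozenset((x, y)) for x in vertices for y in vertices
             if (x - y) % n in S}
    return vertices, edges

# Cay(Z_6, {1, 5}) is the 6-cycle: each vertex is joined to its neighbours.
verts, edges = cayley_graph_zn(6, {1, 5})
print(len(verts), len(edges))  # → 6 6
```

The symmetry condition S = S⁻¹ is exactly what makes the adjacency relation symmetric, so the result is an undirected graph.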
Storing, transferring, and processing high-dimensional electroencephalogram (EEG) signals is a critical challenge. The goal of EEG compression is to remove redundant data in EEG signals. Medical signals such as EEG must be of high quality for medical diagnosis. This paper uses a compression system with near-zero Mean Squared Error (MSE) based on the Discrete Cosine Transform (DCT) and double shift coding for fast and efficient EEG data compression. The paper investigates and compares the use or non-use of delta modulation, which is applied to the transformed and quantized input signal. Double shift coding is applied after mapping the output to positive values as a final step. The system performance is tested using EEG data files from the C
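The transform stage of such a pipeline can be sketched as follows: DCT, coarse quantization, and reconstruction, with the MSE reported. Delta modulation and the double shift coder are omitted, and a synthetic sine-plus-noise signal stands in for the EEG records; the quantization step is an assumed value.

```python
# Hedged sketch of the transform stage: DCT -> quantize -> inverse DCT,
# measuring the reconstruction MSE. The quantized integers are what an
# entropy coder (e.g. double shift coding) would compress.
import numpy as np
from scipy.fft import dct, idct

rng = np.random.default_rng(1)
signal = np.sin(np.linspace(0, 8 * np.pi, 1024)) + 0.05 * rng.standard_normal(1024)

coeffs = dct(signal, norm="ortho")
step = 0.05                                          # assumed quantization step
quantized = np.round(coeffs / step).astype(np.int32)  # integer symbols to encode
reconstructed = idct(quantized * step, norm="ortho")

mse = np.mean((signal - reconstructed) ** 2)
print(f"MSE = {mse:.6f}")
```

Because the orthonormal DCT preserves energy, the reconstruction MSE is bounded by the quantization error, which is why a small step yields near-zero MSE.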
Rivest Cipher 4 (RC4) is an efficient stream cipher that is commonly used in internet protocols. However, there are several flaws in the key scheduling algorithm (KSA) of RC4. The contribution of this paper is to overcome some of these weaknesses by proposing a new version of the KSA, coined modified KSA. In it, the initial state of the array is suggested to contain random values instead of the identity permutation. Moreover, the permutation of the array is modified to depend on the key value itself. The performance of the proposal is assessed in terms of cipher secrecy, randomness tests, and time, in a set of experiments with variable key sizes and different plaintext sizes. The results show that RC4 with the modified KSA improves the randomness and secrecy with
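For reference, the standard RC4 algorithm being modified looks like this. The identity-permutation initialization in the KSA below is exactly what the paper proposes to replace with key-derived random values; that variant is only indicated by the comment, since its exact construction is not given in the abstract.

```python
# Standard RC4: key-scheduling algorithm (KSA) plus keystream generator
# (PRGA). The paper's modified KSA would change the initialization line.
def rc4_ksa(key):
    S = list(range(256))   # identity permutation; the modified KSA starts
                           # from key-derived random values instead
    j = 0
    for i in range(256):
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    return S

def rc4_prga(S, n):
    # generate n keystream bytes from the scheduled state
    S = S.copy()
    i = j = 0
    out = []
    for _ in range(n):
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(S[(S[i] + S[j]) % 256])
    return bytes(out)

state = rc4_ksa(b"Key")
keystream = rc4_prga(state, 8)
print(keystream.hex())  # → eb9f7781b734ca72
```

Encryption is then simply XOR of the plaintext with this keystream, which is why KSA biases translate directly into keystream weaknesses.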
In this research, we propose using two local search methods (LSMs), Particle Swarm Optimization (PSO) and the Bees Algorithm (BA), to solve the Multi-Criteria Travelling Salesman Problem (MCTSP) and obtain the best efficient solutions. The population of the proposed LSMs may be generated randomly or by adding some initial solutions obtained from efficient heuristic methods. The solutions obtained by PSO and BA are compared with the solutions of exact methods (complete enumeration and branch and bound) and some heuristic methods. The results proved the efficiency of the PSO and BA methods for a large number of nodes ( ). The proposed LSMs give the best efficient solutions for the MCTSP for
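The local-search idea underlying such methods can be sketched with a simple 2-opt improvement of a random tour on one distance criterion. This stands in for the PSO and BA neighbourhood searches, whose update rules and multi-criteria handling the abstract does not specify; the instance is randomly generated.

```python
# Hedged sketch: single-criterion TSP local search via 2-opt moves,
# standing in for the paper's PSO/BA searches on the MCTSP.
import random, math

def tour_length(tour, pts):
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def two_opt(tour, pts):
    # repeatedly reverse a segment whenever that shortens the tour
    improved = True
    while improved:
        improved = False
        for i in range(1, len(tour) - 1):
            for j in range(i + 1, len(tour)):
                candidate = tour[:i] + tour[i:j][::-1] + tour[j:]
                if tour_length(candidate, pts) < tour_length(tour, pts):
                    tour, improved = candidate, True
    return tour

random.seed(3)
pts = [(random.random(), random.random()) for _ in range(20)]
start = list(range(20))
random.shuffle(start)
best = two_opt(start, pts)
print(tour_length(best, pts))
```

In a population method such as PSO or BA, many tours would be improved in parallel, with the best tours guiding where new candidates are generated.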
Many consumers of electric power exceed the permissible consumption limit set by electrical power distribution stations. We therefore propose a validation approach that works intelligently by applying machine learning (ML) technology to teach electrical consumers how to consume properly without wasting energy. The validation approach is one of a large family of intelligent processes related to energy consumption, called efficient energy consumption management (EECM) approaches; it is connected with Internet of Things (IoT) technology and linked to the Google Firebase Cloud, where a utility center is used to check whether the energy consumption is s
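The core validation check can be sketched as below: flag readings that exceed the permissible limit. The limit and readings are illustrative values; the paper's ML model and the Firebase/IoT link are not reproduced here.

```python
# Hedged sketch: flag consumption readings above the permissible limit.
# The limit is an assumed value, not one from the paper.
PERMISSIBLE_LIMIT_KWH = 30.0   # assumed daily limit

def validate_consumption(readings_kwh, limit=PERMISSIBLE_LIMIT_KWH):
    # return the indices of readings that exceed the limit
    return [i for i, r in enumerate(readings_kwh) if r > limit]

daily_readings = [12.5, 28.0, 31.2, 45.9, 22.1]
excess = validate_consumption(daily_readings)
print(excess)  # → [2, 3]
```

In the full system, such flags would be pushed to the cloud backend so the utility center can notify the consumer.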