Feature extraction provides a quick way to extract objects from remote sensing data (images), saving urban planners and GIS users from digitizing hundreds of features by hand. In the present work, manual, rule-based, and classification methods were applied, using an object-based approach to classify the imagery. The results show that the suitability of each method depends on the properties of the target object: the manual method, for example, is convenient for objects that are clearly visible and of sufficient area, and the choice of scale and merge level has a significant effect on the classification process and on the accuracy of object extraction. The results also indicate that the rule-based method is the most suitable for extracting most features, since it draws on multiple variables that characterize the objects.
Conditional logistic regression is often used to study the relationship between event outcomes and specific prognostic factors, and its predictive capabilities extend naturally to environmental studies. This research seeks to demonstrate a novel approach to implementing conditional logistic regression in environmental research through inference methods based on longitudinal data. Statistical analysis of longitudinal data requires methods that properly account for the within-subject correlation of the response measurements; if this correlation is ignored, inferences such as statistical tests and confidence intervals can be grossly invalid.
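As an illustrative sketch only (not the paper's actual model or data), conditional logistic regression for 1:1 matched pairs reduces to a logistic likelihood on within-pair covariate differences with no intercept; the pair data, single covariate, and grid-search fitting below are all assumptions for demonstration:

```python
import math

def matched_pair_loglik(beta, pairs):
    """Conditional log-likelihood for 1:1 matched case-control pairs with a
    single covariate: each pair contributes
    log[ exp(beta*x_case) / (exp(beta*x_case) + exp(beta*x_control)) ].
    The stratum (pair) effect cancels out, which is the point of conditioning."""
    ll = 0.0
    for x_case, x_control in pairs:
        diff = beta * (x_case - x_control)
        ll += -math.log1p(math.exp(-diff))  # stable log-sigmoid
    return ll

def fit_beta(pairs, lo=-10.0, hi=10.0, steps=2000):
    """Crude grid search for the conditional MLE of beta (illustration only)."""
    grid = [lo + i * (hi - lo) / steps for i in range(steps + 1)]
    return max(grid, key=lambda b: matched_pair_loglik(b, pairs))

# hypothetical pairs: (covariate of case, covariate of matched control)
pairs = [(2.0, 1.0), (1.5, 0.5), (0.8, 1.2)]
beta_hat = fit_beta(pairs)
```

Because only within-pair differences enter the likelihood, any correlation induced by the shared pair-level (subject-level) effects is handled by design rather than ignored.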
This research applies an artificial intelligence algorithm drawn from biological systems: gene regulatory networks (GRNs), a dynamical system over a set of variables representing state space evolving in time. The biological system is constructed using ordinary differential equations (ODEs), and the stationarity of the model is analyzed using Euler's method. To capture the factors that affect gene expression, namely the inhibition and activation of transcription on DNA, transcription factors (TFs) are used. The current research aims to apply up-to-date artificial intelligence methods; to implement the gene regulatory networks (GRNs), we used a progr…
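As a minimal sketch of the ingredients named above (not the paper's actual GRN), the example below models one gene whose transcription is activated by a TF through a Hill function, writes the dynamics as an ODE, and integrates it with Euler's method; the Hill form and all parameter values are assumptions:

```python
def hill_activation(tf, k, n):
    """Hill function: fraction of promoter activity driven by a TF at level tf."""
    return tf**n / (k**n + tf**n)

def simulate_gene(x0, tf_level, alpha=1.0, beta=0.5, k=0.8, n=2,
                  dt=0.01, steps=2000):
    """Euler integration of dx/dt = alpha * hill(tf) - beta * x,
    where x is the expression level of a single gene."""
    x = x0
    for _ in range(steps):
        dxdt = alpha * hill_activation(tf_level, k, n) - beta * x
        x += dt * dxdt
    return x

# after a long horizon the trajectory settles at the stationary point
# x* = alpha * hill(tf) / beta, which Euler's method lets us check numerically
x_final = simulate_gene(x0=0.0, tf_level=1.0)
```

Inhibition can be sketched the same way by replacing the Hill term with `k**n / (k**n + tf**n)`.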
The purpose of this study is to diagnose the factors that affect behavioral intention to use the internet in Thi-Qar. A sample of 127 internet users from university staff was taken, and the data were analyzed using path analysis. The study concluded that there is a set of significant relationships: the exogenous variables (gender, income, perceived fun, perceived usefulness, image, and ease of use) have a significant effect on the endogenous variable (behavioral intention). The analysis indicated that image ranked first among users, followed by ease of use, then perceived fun and perceived usefulness, with respect to the dependent variables (daily internet usage and diversity of internet usage). Implications of these results are discussed. The st…
In this paper, two local search algorithms, a genetic algorithm and particle swarm optimization, are used to schedule a number of products (n jobs) on a single machine to minimize a multi-objective function comprising total completion time, total tardiness, total earliness, and total late work. A branch and bound (BAB) method is used to benchmark the results for n jobs ranging from 5 to 18. The results show that the two algorithms find optimal or near-optimal solutions in reasonable time.
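The objective that either heuristic would evaluate for each candidate sequence can be sketched as below; the job data are hypothetical, and late work is taken in its standard form V_j = min(T_j, p_j), which is an assumption about the paper's definition:

```python
import itertools

def schedule_cost(sequence, p, d):
    """Sum of total completion time, total tardiness, total earliness,
    and total late work for one job order on a single machine.
    p[j]: processing time of job j; d[j]: due date of job j."""
    t = 0
    total_C = total_T = total_E = total_V = 0
    for j in sequence:
        t += p[j]                    # completion time C_j
        total_C += t
        tard = max(0, t - d[j])      # tardiness T_j
        total_T += tard
        total_E += max(0, d[j] - t)  # earliness E_j
        total_V += min(tard, p[j])   # late work V_j
    return total_C + total_T + total_E + total_V

# tiny hypothetical instance; exhaustive search stands in for BAB at this size
p = [3, 1, 2]
d = [4, 2, 6]
best = min(itertools.permutations(range(3)), key=lambda s: schedule_cost(s, p, d))
```

A GA or PSO would replace the exhaustive `min` with a population of sequences evolved against `schedule_cost`, which is what makes n up to 18 tractable.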
In this study, we derive the estimator of the reliability of the Exponential distribution using the Bayesian approach, in which the parameter of the distribution is assumed to be a random variable. We derive the posterior distribution of the parameter under four prior distributions for the scale parameter of the Exponential distribution: the inverse chi-square distribution, the inverted gamma distribution, an improper prior, and a non-informative prior. The reliability estimators are obtained using the two loss functions proposed in this study, both based on the natural logarithm of the reliability function. We used a simulation technique to compare the…
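As a hedged sketch of the general recipe (one conjugate prior case, and squared-error loss rather than the paper's log-reliability loss functions), the Bayes estimate of R(t) = exp(-lambda*t) has a closed form: with a Gamma(a, b) prior on the rate lambda (equivalently, an inverted-gamma prior on the scale), the posterior is Gamma(a+n, b+sum(x)) and the posterior mean of exp(-lambda*t) follows from the gamma MGF:

```python
def bayes_reliability(data, t, a=1.0, b=1.0):
    """Posterior-mean (squared-error loss) estimate of R(t) = exp(-lambda*t)
    for Exp(lambda) data, under a Gamma(a, b) prior on the rate lambda.
    Posterior: lambda | data ~ Gamma(a + n, rate = b + sum(data)), so
    E[exp(-lambda*t)] = ((b + S) / (b + S + t)) ** (a + n)."""
    n = len(data)
    s = sum(data)
    return ((b + s) / (b + s + t)) ** (a + n)

def mle_reliability(data, t):
    """Classical MLE plug-in estimate exp(-t / xbar), for comparison."""
    import math
    return math.exp(-t * len(data) / sum(data))
```

The prior hyperparameters `a`, `b` here are placeholders, not values from the study; the paper's log-based loss functions would change the functional of the posterior being reported, not the posterior itself.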
A comparison of double informative and non-informative priors for the parameter of the Rayleigh distribution is considered. Three different sets of double priors are examined for the single unknown parameter of the Rayleigh distribution: the square-root inverted gamma (SRIG) with the natural conjugate family of priors, the square-root inverted gamma with the non-informative prior, and the natural conjugate family of priors with the non-informative prior. Data are generated from the Rayleigh distribution for three cases of sample size (small, medium, and large), and Bayes estimators of the parameter are derived under a squared error…
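For one of the conjugate cases above, the Bayes estimator under squared-error loss has a simple closed form; the sketch below (an illustration, with assumed hyperparameters, not the paper's double-prior construction) uses an inverted-gamma(a, b) prior on sigma^2, since the Rayleigh likelihood is proportional to (sigma^2)^(-n) * exp(-sum(x^2) / (2*sigma^2)):

```python
def rayleigh_sigma2_bayes(data, a=2.0, b=1.0):
    """Posterior mean of sigma^2 for Rayleigh(sigma) data under an
    inverted-gamma(a, b) prior on sigma^2 and squared-error loss.
    Posterior: sigma^2 | data ~ IG(a + n, b + sum(x_i^2)/2),
    whose mean is (b + sum(x_i^2)/2) / (a + n - 1)."""
    n = len(data)
    t = sum(x * x for x in data) / 2.0
    return (b + t) / (a + n - 1.0)

def rayleigh_sigma2_mle(data):
    """Classical MLE, sum(x_i^2) / (2n), for comparison."""
    return sum(x * x for x in data) / (2.0 * len(data))
```

As the sample grows, the Bayes estimate converges to the MLE, which is why the small/medium/large sample-size cases are the interesting comparison.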
In this paper, we estimate the parameters and the related probability functions, the survival function, the cumulative distribution function, the hazard (failure-rate) function, and the failure (death) probability density function (pdf), for the two-parameter Birnbaum-Saunders distribution, which fits the complete data for patients with lymph-gland cancer. The parameters (shape and scale) are estimated using maximum likelihood, regression quantile, and shrinkage methods, and the values of the mentioned probability functions are then computed from a sample of real data describing the survival time of patients suffering from lymph-gland cancer, measured from the diagnosis of the disease or the admission of patients to hospital for a perio…
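The Birnbaum-Saunders survival and hazard functions have closed forms in the standard normal CDF, and a common closed-form estimator (the modified-moment estimator, used here in place of the paper's MLE/quantile/shrinkage methods, which need numerical optimization) can be sketched directly; the sample data below are hypothetical:

```python
import math

def bs_sf(t, alpha, beta):
    """Birnbaum-Saunders survival function S(t) = 1 - Phi(xi(t)),
    xi(t) = (sqrt(t/beta) - sqrt(beta/t)) / alpha."""
    xi = (math.sqrt(t / beta) - math.sqrt(beta / t)) / alpha
    return 0.5 * math.erfc(xi / math.sqrt(2))

def bs_hazard(t, alpha, beta):
    """Failure rate h(t) = f(t) / S(t); f(t) = phi(xi(t)) * xi'(t)."""
    xi = (math.sqrt(t / beta) - math.sqrt(beta / t)) / alpha
    dxi = (math.sqrt(t / beta) + math.sqrt(beta / t)) / (2 * alpha * t)
    pdf = math.exp(-0.5 * xi * xi) / math.sqrt(2 * math.pi) * dxi
    return pdf / bs_sf(t, alpha, beta)

def bs_moment_estimates(data):
    """Modified-moment estimators from the arithmetic mean s and
    harmonic mean r: beta = sqrt(s*r), alpha = sqrt(2*(sqrt(s/r) - 1))."""
    n = len(data)
    s = sum(data) / n
    r = n / sum(1.0 / x for x in data)
    return math.sqrt(2 * (math.sqrt(s / r) - 1)), math.sqrt(s * r)
```

A useful sanity check: the scale parameter beta is the distribution's median, so S(beta) = 0.5 exactly.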
Transforming the common normal distribution through the Kummer Beta generator into the Kummer Beta Generalized Normal Distribution (KBGND) is achieved. The distribution parameters and hazard function are then estimated using the maximum likelihood (MLE) method, and these estimates are improved by employing a genetic algorithm. Simulation is used, assuming a number of models and different sample sizes. The main finding was that the common maximum likelihood (MLE) method is the best in estimating the parameters of the KBGND, according to the Mean Squared Error (MSE) and Integrated Mean Squared Error (IMSE) criteria, in estimating the hazard function. While the pr…
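The step of refining likelihood estimates with a genetic algorithm can be sketched generically: seed a population around an initial (e.g. MLE) estimate, mutate, and keep the fittest. This toy version (operators, population size, and the quadratic test loss are all assumptions, not the paper's configuration) minimizes any loss over a parameter vector:

```python
import random

def refine_by_ga(loss, init, sigma=0.1, pop=30, gens=60, seed=1):
    """Toy genetic algorithm: gaussian mutation around elite candidates,
    truncation selection; `init` plays the role of an initial MLE."""
    rng = random.Random(seed)
    population = [[g + rng.gauss(0, sigma) for g in init] for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=loss)
        elite = population[: pop // 3]          # keep the top third
        population = elite + [
            [g + rng.gauss(0, sigma) for g in rng.choice(elite)]
            for _ in range(pop - len(elite))
        ]
    return min(population, key=loss)

# in the paper the loss would be the negative KBGND log-likelihood;
# a quadratic bowl stands in for it here
best = refine_by_ga(lambda v: (v[0] - 2.0) ** 2 + (v[1] + 1.0) ** 2, [0.0, 0.0])
```

Crossover is omitted for brevity; elitism guarantees the best candidate never gets worse between generations.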
The main problem when dealing with fuzzy data is that a model representing the data cannot be formed by the Fuzzy Least Squares Estimator (FLSE) method, which gives invalid estimates when the problem of multicollinearity is present. To overcome this problem, the Fuzzy Bridge Regression Estimator (FBRE) method is relied upon to estimate a fuzzy linear regression model with triangular fuzzy numbers. Moreover, multicollinearity in the fuzzy data can be detected using the Variance Inflation Factor when the input variables of the model are crisp and the output variable and parameters are fuzzy. The results were compared usin…
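Since the input variables are crisp, the VIF diagnostic works exactly as in ordinary regression; for the two-predictor case it reduces to 1/(1 - r^2), where r is the correlation between the predictors. A minimal sketch with made-up data (not the paper's dataset):

```python
import math

def pearson_r(x, y):
    """Sample Pearson correlation between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def vif_two_predictors(x1, x2):
    """VIF for either of two crisp predictors: 1 / (1 - r^2).
    Values above ~10 are the conventional flag for severe multicollinearity."""
    r = pearson_r(x1, x2)
    return 1.0 / (1.0 - r * r)

# nearly collinear predictors inflate the VIF well past 10
vif_bad = vif_two_predictors([1, 2, 3, 4], [1, 2, 3, 5])
```

With more than two predictors, VIF_j is computed the same way from the R^2 of regressing predictor j on all the others.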