F. G. Mohammed, H. M. Al-Dabbas, Iraqi Journal of Science, 2018
In this article, we develop a new loss function, obtained by weighting the linear exponential (LINEX) loss function. We derive estimators of the scale parameter, the reliability function, and the hazard function based on upper record values from the Lomax distribution (LD). To study the small-sample behavior of the proposed loss function, a Monte Carlo simulation compares the maximum likelihood estimator, the Bayesian estimator under the LINEX loss function, and the Bayesian estimator under the squared error (SE) loss function. The results show that the modified method performs best for estimating the scale parameter, reliability function, and hazard function.
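As background, the standard LINEX loss and its Bayes estimator are reproduced below; the weighted form is only a generic sketch, since the paper's specific weight function is not given here.

```latex
% Standard LINEX loss for estimation error \Delta = \hat{\theta} - \theta:
L(\Delta) = e^{a\Delta} - a\Delta - 1, \qquad a \neq 0
% A weighted LINEX loss multiplies this by a weight function w(\theta)
% (generic form; the paper's exact choice of w is not reproduced here):
L_w(\Delta) = w(\theta)\left(e^{a\Delta} - a\Delta - 1\right)
% Bayes estimator under the (unweighted) LINEX loss:
\hat{\theta}_{\mathrm{LINEX}} = -\frac{1}{a}\,\ln \mathbb{E}\left[e^{-a\theta}\mid \text{data}\right]
```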
Crime is unlawful activity of any kind and is punishable by law. Crime affects a society's quality of life and economic development. With crime rising sharply worldwide, there is a need to analyze crime data in order to bring down the crime rate; such analysis helps the police and the public take the required measures and restrict crime more effectively. The purpose of this research is to develop predictive models that can aid crime pattern analysis and thus support the Boston Police Department's crime prevention efforts. The geographical location factor is adopted in our model because it is influential in several situations, whether one is traveling to or living in a particular area.
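A minimal sketch of a location-aware crime-category classifier in this spirit is shown below; the file name and column names (lat, long, hour, day_of_week, offense_group) are hypothetical placeholders, not the actual Boston dataset schema or the paper's model.

```python
# Hedged sketch: predict the offense category from location and time features.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

df = pd.read_csv("boston_crime.csv")                 # assumed local file
X = df[["lat", "long", "hour", "day_of_week"]]       # hypothetical feature columns
y = df["offense_group"]                              # hypothetical label column

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```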
Nonlinear time series analysis is one of the most complex problems, especially for the nonlinear autoregressive model with exogenous variable (NARX); model identification and determination of the correct orders are among the most important problems in time series analysis. In this paper, we propose a splines estimation method for model identification and use three criteria to determine the correct orders. The proposed method estimates additive splines for model identification, and the order determination relies on the additive structure to avoid the curse of dimensionality. The proposed method is nonparametric, and the simulation results give a …
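A generic illustration of an additive-spline NARX fit with order selection by an information criterion is sketched below; it is not the paper's exact estimator or criteria, and the lag orders, knot count, and AIC choice are assumptions for the example.

```python
# Sketch: additive cubic-spline NARX fit, orders (p, q) chosen by AIC.
import numpy as np
from sklearn.preprocessing import SplineTransformer
from sklearn.linear_model import LinearRegression

def fit_narx(y, x, p, q, n_knots=6):
    """Fit y_t = sum_i f_i(y_{t-i}) + sum_j g_j(x_{t-j}) + e_t with cubic splines."""
    m = max(p, q)
    cols = [y[m - i:len(y) - i] for i in range(1, p + 1)]
    cols += [x[m - j:len(x) - j] for j in range(1, q + 1)]
    lags = np.column_stack(cols)                    # one column per lag (additive structure)
    basis = SplineTransformer(degree=3, n_knots=n_knots).fit_transform(lags)
    target = y[m:]
    model = LinearRegression().fit(basis, target)
    resid = target - model.predict(basis)
    k = basis.shape[1] + 1                          # number of fitted coefficients
    aic = len(target) * np.log(np.mean(resid ** 2)) + 2 * k
    return model, aic

# toy NARX(1,1) data, then pick the orders that minimise AIC over a small grid
rng = np.random.default_rng(0)
x = rng.normal(size=500)
y = np.zeros(500)
for t in range(2, 500):
    y[t] = 0.5 * np.sin(y[t - 1]) + 0.3 * x[t - 1] + 0.1 * rng.normal()
best = min(((p, q, fit_narx(y, x, p, q)[1]) for p in (1, 2, 3) for q in (1, 2)),
           key=lambda r: r[2])
print("selected orders (p, q):", best[:2])
```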
In this research, the Artificial Neural Network (ANN) technique was applied to predict water levels and some water quality parameters of the Tigris River at five sites in Wasit Governorate. These predictions are useful for planning, management, and evaluation of the water resources in the area. Spatial data collected along a river system, or at different locations in a catchment area, usually contain missing measurements, so an accurate prediction model to fill these missing values is essential.
The sites selected for water quality data prediction were the Sewera, Numania, Kut u/s, Kut d/s, and Garaf observation sites. Models were built at these five sites to predict the water level and the water quality parameters.
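A minimal sketch of the imputation idea is given below: a small neural network predicts the missing reading at one site from the measurements at the other sites. The file name and column names are illustrative assumptions, and an MLP regressor stands in for the paper's ANN architecture.

```python
# Hedged sketch: impute missing water-level readings at one site from the others.
import pandas as pd
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

df = pd.read_csv("tigris_sites.csv")                            # assumed local file
features = ["level_sewera", "level_numania", "level_kut_us", "level_garaf"]
target = "level_kut_ds"                                         # hypothetical column names

known = df.dropna(subset=features + [target])
X_train, X_test, y_train, y_test = train_test_split(
    known[features], known[target], test_size=0.2, random_state=0)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0))
model.fit(X_train, y_train)
print("R^2 on held-out data:", model.score(X_test, y_test))

# fill rows where the target is missing but the other sites were measured
missing = df[df[target].isna() & df[features].notna().all(axis=1)]
df.loc[missing.index, target] = model.predict(missing[features])
```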
IJPAM, Volume 116, No. 3 (2017)
Image enhancement techniques have recently become one of the most significant topics in digital image processing. The basic problem in enhancement is how to remove noise and improve the details of a digital image. In the current research, a method for digital image de-noising and detail sharpening/highlighting is proposed. The proposed approach uses a fuzzy logic technique to process each pixel in the image and then decides whether it is noisy or needs further processing for highlighting. This decision is made by examining the pixel's degree of association with its neighboring elements based on a fuzzy algorithm. The proposed de-noising approach was evaluated on several standard images after corrupting them with impulse noise.
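The sketch below illustrates the general idea of a fuzzy impulse-noise filter of this kind: each pixel receives a fuzzy "noisiness" degree from its disagreement with its 3x3 neighborhood and is pulled toward the local median by that degree. The membership thresholds are illustrative assumptions, not the paper's values or exact rule set.

```python
# Hedged sketch of a fuzzy impulse-noise filter for a grayscale image.
import numpy as np
from scipy.ndimage import median_filter

def fuzzy_denoise(img, t1=20.0, t2=60.0):
    img = img.astype(float)
    med = median_filter(img, size=3)                 # 3x3 neighbourhood median
    diff = np.abs(img - med)                         # disagreement with neighbours
    # ramp membership: 0 below t1 (keep pixel), 1 above t2 (treat as impulse)
    mu = np.clip((diff - t1) / (t2 - t1), 0.0, 1.0)
    return (1.0 - mu) * img + mu * med               # blend toward the median
```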
In this paper, an algorithm that can embed more data than regular spatial-domain methods is introduced. The secret data are first compressed with Huffman coding, and the compressed data are then embedded using a Laplacian sharpening method. Laplace filters are applied to determine the effective hiding places: based on a threshold value, the pixels with the highest filter responses are selected for embedding the watermark. The aim of this work is to increase the capacity of the embedded information through Huffman coding and, at the same time, to increase the security of the algorithm by hiding the data at the locations with the highest edge values, where changes are less noticeable. The perform…
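A minimal sketch of the location-selection idea is shown below: pixels with the strongest Laplacian response are used as hiding places. The payload is assumed to be already Huffman-compressed to bytes, and writing its bits into the least-significant bits of the chosen pixels is an assumption for illustration, not necessarily the paper's exact "Laplacian sharpening" embedding rule; the threshold value is likewise illustrative.

```python
# Hedged sketch: Laplacian-based selection of hiding places + LSB embedding.
import numpy as np
from scipy.ndimage import laplace

def embed(cover, payload_bytes, threshold=30.0):
    """cover: 2-D uint8 grayscale array; payload_bytes: Huffman-compressed payload."""
    img = cover.copy()
    strength = np.abs(laplace(cover.astype(float)))        # edge/detail response map
    order = np.argsort(strength, axis=None)[::-1]          # strongest response first
    candidates = order[strength.ravel()[order] > threshold]
    bits = np.unpackbits(np.frombuffer(payload_bytes, dtype=np.uint8))
    if len(bits) > len(candidates):
        raise ValueError("payload too large for this cover image / threshold")
    idx = candidates[:len(bits)]
    flat = img.reshape(-1)
    flat[idx] = (flat[idx] & ~np.uint8(1)) | bits          # write payload bits into LSBs
    return img, idx                                        # idx (or a shared key) is needed to extract

def extract(stego, idx, n_bytes):
    bits = stego.reshape(-1)[idx[:n_bytes * 8]] & 1
    return np.packbits(bits).tobytes()
```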