In this paper we propose a new method for selecting the smoothing parameter of a kernel estimator used to estimate a nonparametric regression function in the presence of missing values. The proposed method is based on the golden ratio and on Surah AL-E-Imran in the Qur'an. Simulation experiments were conducted to study small-sample behavior. The results demonstrated the superiority of the proposed method over the competing method for selecting the smoothing parameter.
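The abstract does not reproduce the selection rule itself; as a minimal sketch, the golden ratio can drive bandwidth selection via a golden-section search over a leave-one-out cross-validation score for a Nadaraya-Watson estimator fit on the complete cases (the helpers `nw_estimate` and `cv_score`, the Gaussian kernel, and the search interval are all illustrative assumptions, not the authors' method):

```python
import numpy as np

def nw_estimate(x0, x, y, h):
    """Nadaraya-Watson estimate at x0 with a Gaussian kernel and bandwidth h."""
    w = np.exp(-0.5 * ((x0 - x) / h) ** 2)
    return np.sum(w * y) / np.sum(w)

def cv_score(x, y, h):
    """Leave-one-out cross-validation score for bandwidth h."""
    n = len(x)
    errs = [(y[i] - nw_estimate(x[i], np.delete(x, i), np.delete(y, i), h)) ** 2
            for i in range(n)]
    return np.mean(errs)

def golden_section_bandwidth(x, y, a=0.01, b=2.0, tol=1e-4):
    """Golden-section search for the bandwidth minimizing the CV score."""
    phi = (np.sqrt(5) - 1) / 2                  # golden ratio conjugate, ~0.618
    c, d = b - phi * (b - a), a + phi * (b - a)
    while abs(b - a) > tol:
        if cv_score(x, y, c) < cv_score(x, y, d):
            b, d = d, c                         # minimum lies in [a, d]
            c = b - phi * (b - a)
        else:
            a, c = c, d                         # minimum lies in [c, b]
            d = a + phi * (b - a)
    return (a + b) / 2

# Complete-case selection: drop observations with missing responses.
rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 148)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.3, 148)
y[rng.random(148) < 0.1] = np.nan               # inject missing values
obs = ~np.isnan(y)
print(f"selected bandwidth: {golden_section_bandwidth(x[obs], y[obs]):.4f}")
```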
The problem of missing data represents a major obstacle for researchers in the process of data analysis, since it is a recurrent problem in all fields of study, including social, medical, astronomical, and clinical experiments.
The presence of such a problem within the data under study may negatively affect the analysis and may lead to misleading conclusions, since those conclusions can carry great bias caused by the missingness. Despite the efficiency of wavelet methods, they too are affected by missing data, which in turn harms the accuracy of estimation ...
The key objective of the study is to understand the best processes currently used in managing talent in Australian higher education (AHE) and to design a quantitative measurement of talent management processes (TMPs) for the higher education (HE) sector.
Three qualitative multi-method studies commonly used in empirical research were considered, namely brainstorming, focus group discussions, and semi-structured individual interviews. Twenty ...
In this paper, we describe a new method for image denoising. We analyze properties of the multiwavelet coefficients of natural images and suggest a method for computing the multiwavelet transform using a first-order approximation. The paper describes a simple and effective model for noise removal, proposing a new technique for retrieving the image by estimating it from the noisy image. The proposed algorithm mixes soft-thresholding with a mean filter, applied concurrently to the noisy image after dividing it into blocks of equal size; the concurrent processing increases the performance of the enhancement process and decreases the time needed to apply the proposed algorithm.
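As a minimal sketch of the mixing idea, the block below soft-thresholds wavelet detail coefficients (an ordinary discrete wavelet transform stands in for the paper's multiwavelet transform) and blends the result with a 3x3 mean filter, block by block; PyWavelets, the universal threshold, and the blending weight `alpha` are illustrative assumptions:

```python
import numpy as np
import pywt
from scipy.ndimage import uniform_filter

def soft_threshold_denoise(block, wavelet="db2", level=2):
    """Soft-threshold the detail coefficients with the universal threshold."""
    coeffs = pywt.wavedec2(block, wavelet, level=level)
    # Noise estimate from the finest diagonal detail band.
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    thr = sigma * np.sqrt(2 * np.log(block.size))
    new_coeffs = [coeffs[0]] + [
        tuple(pywt.threshold(c, thr, mode="soft") for c in band)
        for band in coeffs[1:]
    ]
    return pywt.waverec2(new_coeffs, wavelet)

def blockwise_denoise(noisy, block=64, alpha=0.7):
    """Split the image into equal-sized blocks (each block is independent,
    so this loop parallelizes trivially) and mix the soft-threshold result
    with a 3x3 mean filter."""
    out = np.empty_like(noisy, dtype=float)
    H, W = noisy.shape
    for r in range(0, H, block):
        for c in range(0, W, block):
            blk = noisy[r:r + block, c:c + block].astype(float)
            wav = soft_threshold_denoise(blk)[:blk.shape[0], :blk.shape[1]]
            mean = uniform_filter(blk, size=3)
            out[r:r + block, c:c + block] = alpha * wav + (1 - alpha) * mean
    return out
```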
In this research, fuzzy nonparametric methods based on smoothing techniques were applied to real data from the Iraqi stock market, specifically data on the Baghdad Company for Soft Drinks for the year 2016 (the period 1/1/2016-31/12/2016). A sample of 148 observations was obtained in order to construct a model of the relationship between the stock prices (low, high, modal) and the traded value. Comparing the results of the goodness-of-fit (G.O.F.) criterion for the three techniques, we note that the lowest value of this criterion was achieved by the K-Nearest Neighbor technique with the Gaussian function.
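A crisp (non-fuzzy) sketch of the winning technique, assuming a K-nearest-neighbor smoother whose neighbors are weighted by a Gaussian function and a mean-squared-error goodness-of-fit criterion; the local bandwidth rule is an illustrative assumption:

```python
import numpy as np

def knn_gaussian_smoother(x_train, y_train, x_eval, k=10):
    """Predict y at each evaluation point from its k nearest neighbors,
    weighted by a Gaussian function scaled to the k-th neighbor distance."""
    preds = []
    for x0 in x_eval:
        d = np.abs(x_train - x0)
        idx = np.argsort(d)[:k]
        h = d[idx].max() + 1e-12       # local bandwidth: distance to k-th neighbor
        w = np.exp(-0.5 * (d[idx] / h) ** 2)
        preds.append(np.sum(w * y_train[idx]) / np.sum(w))
    return np.array(preds)

def goodness_of_fit(y, y_hat):
    """Illustrative G.O.F. criterion (mean squared error): lower is better."""
    return np.mean((y - y_hat) ** 2)
```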
Survival analysis is widely applied to data describing the lifetime of an item until the occurrence of an event of interest, such as death or another event under study. The purpose of this paper is to use a dynamic approach within the deep-learning neural-network method: a dynamic neural network suited to the nature of discrete survival data and time-varying effects. This neural network is trained with the Levenberg-Marquardt (L-M) algorithm, and the method is called the Proposed Dynamic Artificial Neural Network (PDANN). A comparison was then made with another method that relies entirely on Bayesian methodology, called the Maximum A Posteriori (MAP) method, which was carried out using numerical algorithms ...
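The PDANN itself is not specified in the abstract; as a rough sketch of Levenberg-Marquardt training, the block below applies damped Gauss-Newton updates to a tiny one-hidden-layer network with a finite-difference Jacobian (the architecture, damping schedule, and helper names are illustrative assumptions, not the paper's network):

```python
import numpy as np

N_HIDDEN = 5  # assumed tiny architecture: one tanh hidden layer

def residuals(theta, X, y):
    """Residuals y - f(X; theta) for a one-hidden-layer network."""
    p = X.shape[1]
    W1 = theta[:p * N_HIDDEN].reshape(p, N_HIDDEN)
    b1 = theta[p * N_HIDDEN:p * N_HIDDEN + N_HIDDEN]
    w2 = theta[p * N_HIDDEN + N_HIDDEN:-1]
    b2 = theta[-1]
    return y - (np.tanh(X @ W1 + b1) @ w2 + b2)

def lm_step(theta, X, y, lam, eps=1e-6):
    """One Levenberg-Marquardt update: solve (J'J + lam*I) d = J'r."""
    r = residuals(theta, X, y)
    J = np.empty((len(r), len(theta)))
    for j in range(len(theta)):           # finite-difference Jacobian of r
        t = theta.copy()
        t[j] += eps
        J[:, j] = (residuals(t, X, y) - r) / eps
    d = np.linalg.solve(J.T @ J + lam * np.eye(len(theta)), J.T @ r)
    return theta - d

def train_lm(X, y, iters=50, seed=1):
    n_params = X.shape[1] * N_HIDDEN + 2 * N_HIDDEN + 1
    theta = np.random.default_rng(seed).normal(0, 0.1, n_params)
    lam = 1e-2
    for _ in range(iters):
        new = lm_step(theta, X, y, lam)
        if np.sum(residuals(new, X, y) ** 2) < np.sum(residuals(theta, X, y) ** 2):
            theta, lam = new, lam * 0.7   # accept step, relax damping
        else:
            lam *= 2.0                    # reject step, increase damping
    return theta
```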
The median filter is adopted to match the noise statistics of the degradation, seeking good-quality smoothed images. Two methods are suggested in this paper (a Pentagonal-Hexagonal mask and a Scan Window mask): the study involved a modified median filter for improved noise suppression, with the modification directed toward more reliable results. The modified median filter (Pentagonal-Hexagonal mask) was found to give better results (qualitatively and quantitatively) than classical median filters and the other suggested method (Scan Window mask), though at the expense of the time required. However, when the noise is of line type, the cross 3x3 filter is sometimes preferred over the Pentagonal-Hexagonal mask, with little variation. The Scan Window mask gave better ...
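A minimal sketch of median filtering with non-rectangular masks, using `scipy.ndimage.median_filter` with boolean footprints; the exact Pentagonal-Hexagonal shape is not given in the abstract, so the 5x5 footprint below is an illustrative approximation:

```python
import numpy as np
from scipy.ndimage import median_filter

# Cross 3x3 footprint (the shape the paper prefers for line-type noise).
cross_3x3 = np.array([[0, 1, 0],
                      [1, 1, 1],
                      [0, 1, 0]], dtype=bool)

# A 5x5 hexagon-like footprint standing in for the Pentagonal-Hexagonal
# mask; the exact shape used in the paper is an assumption here.
hex_5x5 = np.array([[0, 1, 1, 1, 0],
                    [1, 1, 1, 1, 1],
                    [1, 1, 1, 1, 1],
                    [1, 1, 1, 1, 1],
                    [0, 1, 1, 1, 0]], dtype=bool)

def masked_median(img, footprint):
    """Median filter restricted to the given neighborhood mask."""
    return median_filter(img, footprint=footprint)
```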
Concurrently with the technological development the world is witnessing, the crime of money laundering has evolved faster and through multiple methods, and its economic, political, and social impacts have increased. Given the danger of this phenomenon, the international community in recent years has been keen to treat combating money laundering as a general indicator by which the compliance of states and their banks and financial institutions with the international requirements mandated in this area is verified. Hence the increasing interest of governments in the laws and procedures that contribute to reducing the phenomenon of money laundering and shielding the economy and the banking and financial sectors ...
The issue of image captioning, which comprises automatic text generation to understand an image's visual information, has become feasible with the developments in object recognition and image classification. Deep learning has received much interest from the scientific community and can be very useful in real-world applications. The proposed image-captioning approach uses Convolutional Neural Network (CNN) pre-trained models combined with Long Short-Term Memory (LSTM) to generate image captions. The process includes two stages: the first entails training the CNN-LSTM models using baseline hyper-parameters, and the second encompasses training the CNN-LSTM models by optimizing and adjusting the hyper-parameters of ...
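A minimal Keras skeleton of the CNN-LSTM idea: pre-extracted CNN image features are merged with an LSTM encoding of the partial caption to predict the next word (the vocabulary size, feature dimension, and merge architecture are illustrative assumptions, not the paper's exact models):

```python
from tensorflow.keras import Model, layers

VOCAB, MAX_LEN, FEAT_DIM = 5000, 30, 2048   # assumed sizes

# Image branch: a feature vector from a pre-trained CNN
# (e.g. globally pooled ResNet features extracted offline).
img_in = layers.Input(shape=(FEAT_DIM,))
img_vec = layers.Dense(256, activation="relu")(layers.Dropout(0.5)(img_in))

# Text branch: embedding + LSTM over the partial caption generated so far.
txt_in = layers.Input(shape=(MAX_LEN,))
emb = layers.Embedding(VOCAB, 256, mask_zero=True)(txt_in)
txt_vec = layers.LSTM(256)(layers.Dropout(0.5)(emb))

# Merge both branches and predict the next caption word.
merged = layers.add([img_vec, txt_vec])
out = layers.Dense(VOCAB, activation="softmax")(
    layers.Dense(256, activation="relu")(merged))

model = Model(inputs=[img_in, txt_in], outputs=out)
model.compile(loss="sparse_categorical_crossentropy", optimizer="adam")
model.summary()
```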
An effective decision-making process is the basis for successfully solving any engineering problem. Many decisions taken in construction projects differ in their nature due to the complex nature of such projects. One of the most crucial decisions, and one that may result in numerous issues over the course of a construction project, is the selection of the contractor. This study aims to use the ordinal priority approach (OPA) for the contractor selection process in the construction industry. The proposed model involves two computer programs: the first will be used to evaluate the decision-makers/experts in construction projects, while the second will be used to formulate ...
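As a minimal single-expert sketch of the ordinal priority approach, the linear program below maximizes Z subject to Z <= r*(w_(r) - w_(r+1)) for consecutive ranks and Z <= m*w_(m), with the weights summing to one; the single-expert, single-criterion simplification and the SciPy solver are assumptions, not the study's two-program model:

```python
import numpy as np
from scipy.optimize import linprog

def opa_weights(ranking):
    """Single-expert OPA: `ranking` lists alternative indices from best
    to worst; returns one weight per alternative."""
    m = len(ranking)
    # Decision variables: w_0..w_{m-1} (in ranked order), then Z.
    A_ub, b_ub = [], []
    for r in range(1, m):                # Z - r*(w_{r-1} - w_r) <= 0
        row = np.zeros(m + 1)
        row[r - 1], row[r], row[m] = -r, r, 1.0
        A_ub.append(row)
        b_ub.append(0.0)
    last = np.zeros(m + 1)               # Z - m*w_{m-1} <= 0
    last[m - 1], last[m] = -m, 1.0
    A_ub.append(last)
    b_ub.append(0.0)
    A_eq = [np.append(np.ones(m), 0.0)]  # weights sum to one
    c = np.zeros(m + 1)
    c[m] = -1.0                          # linprog minimizes, so maximize Z
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (m + 1))
    weights = np.empty(m)
    weights[list(ranking)] = res.x[:m]   # map ranked weights back to ids
    return weights

# Example: contractor 2 ranked best, then contractor 0, then contractor 1.
print(opa_weights([2, 0, 1]))            # ~[0.278, 0.111, 0.611]
```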