This research deals with a shrinkage method for principal components similar to the one used in multiple regression, the Least Absolute Shrinkage and Selection Operator (LASSO). The goal is to build uncorrelated linear combinations from only a subset of the explanatory variables, which may suffer from multicollinearity, instead of taking all K of them. The shrinkage forces some coefficients to equal zero by restricting them with a tuning parameter t, which balances the amounts of bias and variance on one side while keeping the percentage of explained variance of the components at an acceptable level on the other. This is shown through the MSE criterion in the regression case and the percentage of explained variance in the principal component case.
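As an illustrative analogue (not the paper's exact procedure), the sketch below uses scikit-learn's SparsePCA, whose L1 penalty alpha plays a role similar to the tuning parameter t: some loadings are shrunk exactly to zero, and the explained variance of the ordinary components is printed for comparison. The simulated correlated predictors are an assumption, not the study's data.

```python
import numpy as np
from sklearn.decomposition import PCA, SparsePCA

rng = np.random.default_rng(4)
# K = 8 explanatory variables built from 2 latent factors, so they are
# strongly correlated (a multicollinearity problem by construction).
n, K = 200, 8
latent = rng.normal(size=(n, 2))
X = latent @ rng.normal(size=(2, K)) + 0.3 * rng.normal(size=(n, K))

pca = PCA(n_components=2).fit(X)
spca = SparsePCA(n_components=2, alpha=1.0, random_state=0).fit(X)

print("ordinary PCA loadings:\n", np.round(pca.components_, 2))
print("sparse loadings (some coefficients forced to zero):\n",
      np.round(spca.components_, 2))
# Explained variance of the ordinary components, to judge how much is
# sacrificed when loadings are shrunk to zero.
print("PCA explained variance ratio:", np.round(pca.explained_variance_ratio_, 3))
```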
Cancer is one of the most dangerous diseases afflicting humans, injuring the cells and tissues of the body; a person is vulnerable to it at any age, and it is not easy to control once it multiplies between cells and spreads through the body. Despite the great progress in medical studies devoted to this field, the options for those with this disease are few and difficult, as they require significant financial costs for health services and for treatment that is hard to provide.
This study dealt with the determinants of liver cancer, relying on cancerous tumour data taken from the Iraqi Center for Oncology at the Ministry of Health in 2017. Survival analysis has been used as a m…
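Because the excerpt is cut off before the method is fully named, the following is only a minimal sketch of a standard survival-analysis workflow (a Kaplan-Meier estimate plus a Cox proportional-hazards fit) using the lifelines library; its bundled Rossi recidivism dataset stands in for the oncology data, which are not available here.

```python
from lifelines import CoxPHFitter, KaplanMeierFitter
from lifelines.datasets import load_rossi

df = load_rossi()                      # stand-in dataset: weeks to event + covariates

# Non-parametric survival curve.
kmf = KaplanMeierFitter().fit(df["week"], event_observed=df["arrest"])
print("median survival time:", kmf.median_survival_time_)

# Semi-parametric Cox model relating covariates to the hazard.
cph = CoxPHFitter()
cph.fit(df, duration_col="week", event_col="arrest")
cph.print_summary()                    # hazard ratios and p-values per covariate
```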
The advancements in Information and Communication Technology (ICT) over the previous decades have significantly changed how people transmit or store their information over the Internet and networks, so one of the main challenges is to keep this information safe against attacks. Many researchers and institutions have realized the importance and benefits of cryptography in achieving efficient and effective secure communication. This work adopts a novel technique for a secure data cryptosystem based on chaos theory. The proposed algorithm generates a 2-dimensional key matrix with the same dimensions as the original image, containing random numbers obtained from the 1-dimensional logistic chaotic map for given con…
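A minimal sketch of the stated idea: iterate the 1-D logistic map x_{n+1} = r·x_n·(1 − x_n) to fill a key matrix with the same dimensions as the image, then combine it with the image. The XOR combining step, the control parameters r and x0, and the random stand-in image are assumptions here, since the excerpt is truncated before those details.

```python
import numpy as np

def logistic_key_matrix(shape, x0=0.3456, r=3.9999):
    """Key matrix with the image's dimensions, filled by iterating the
    1-D logistic map x_{n+1} = r * x_n * (1 - x_n) and scaling to bytes."""
    n = int(np.prod(shape))
    stream = np.empty(n)
    x = x0
    for i in range(n):
        x = r * x * (1.0 - x)
        stream[i] = x
    return (stream * 256).astype(np.uint8).reshape(shape)

def xor_cipher(image, key):
    """Encrypt (or decrypt) by XOR-ing image bytes with the key matrix."""
    return np.bitwise_xor(image, key)

# Hypothetical 64x64 grayscale image standing in for a real one.
img = np.random.default_rng(3).integers(0, 256, (64, 64), dtype=np.uint8)
key = logistic_key_matrix(img.shape)
cipher = xor_cipher(img, key)
assert np.array_equal(xor_cipher(cipher, key), img)   # decryption restores the image
```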
This paper deals with the prediction of random spatial data involving two kinds of properties: the first are called primary variables and the second secondary variables. The method used in the prediction process for this type of data is the co-kriging technique. It is usually applied when the number of measurements of the primary variable, which we intend to predict at a particular location, is small (because of the cost or difficulty of obtaining them) compared with the secondary variable, whose measurements are abundant and highly correlated with the primary variable, as was the…
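For concreteness, here is a small from-scratch sketch of ordinary co-kriging in one dimension under a single-structure linear model of coregionalization. The sample locations, values, sills, and range are hypothetical; a real study would first fit variogram and cross-variogram models to the data.

```python
import numpy as np

# Sparse, expensive primary samples Z and dense secondary samples Y (hypothetical).
xz = np.array([0.0, 3.0, 7.0])
z = np.array([1.2, 0.7, 1.5])
xy = np.arange(0.0, 9.0)
y = np.array([2.1, 2.0, 1.6, 1.4, 1.5, 1.9, 2.4, 2.6, 2.5])
x0 = 4.5                                            # prediction location

# Linear model of coregionalization: one exponential structure shared by all terms.
sig_z, sig_y, sig_zy, a = 1.0, 0.8, 0.7, 3.0        # sills and range (assumed)
rho = lambda h: np.exp(-np.abs(h) / a)
Czz = lambda h: sig_z * rho(h)
Cyy = lambda h: sig_y * rho(h)
Czy = lambda h: sig_zy * rho(h)

nz, ny = len(xz), len(xy)
n = nz + ny
A = np.zeros((n + 2, n + 2))                        # system with 2 Lagrange multipliers
A[:nz, :nz]   = Czz(xz[:, None] - xz[None, :])
A[:nz, nz:n]  = Czy(xz[:, None] - xy[None, :])
A[nz:n, :nz]  = Czy(xy[:, None] - xz[None, :])
A[nz:n, nz:n] = Cyy(xy[:, None] - xy[None, :])
A[:nz, n] = A[n, :nz] = 1.0                         # primary weights sum to 1
A[nz:n, n + 1] = A[n + 1, nz:n] = 1.0               # secondary weights sum to 0
b = np.zeros(n + 2)
b[:nz] = Czz(xz - x0)                               # covariances data-to-target
b[nz:n] = Czy(xy - x0)
b[n] = 1.0

w = np.linalg.solve(A, b)
z_hat = w[:nz] @ z + w[nz:n] @ y
print("ordinary co-kriging estimate of Z at x0:", round(float(z_hat), 3))
```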
Background: Machine learning relies on a hybrid of analytics, including regression analyses. There have been no attempts to deploy a sinusoidal transformation of data to enhance linear regression models.
Objectives: We aim to optimize linear models by implementing a sinusoidal transformation that minimizes the sum of squared errors.
Methods: We implemented non-Bayesian statistics using SPSS and MATLAB. We used Excel to generate 30 trials of linear regression models, each with 1,000 observations. We utilized SPSS linear regression, the Wilcoxon signed-rank test, and Cronbach's alpha statistics to evaluate the performance of the optimization model.
Results: The sinusoidal…
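The excerpt does not spell out the exact sinusoidal transformation, so the sketch below takes one plausible reading: augment the design matrix of an ordinary linear regression with a sin(x) term and compare the sum of squared errors against the plain linear fit on the same synthetic data.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)
n = 1_000
x = rng.uniform(0, 10, n)
# Synthetic response with a mild periodic component (illustrative only).
y = 3.0 + 1.5 * x + 2.0 * np.sin(x) + rng.normal(0, 1.0, n)

designs = {
    "plain linear":     x.reshape(-1, 1),
    "with sin(x) term": np.column_stack([x, np.sin(x)]),   # sinusoidally transformed design
}
for name, X in designs.items():
    model = LinearRegression().fit(X, y)
    sse = np.sum((y - model.predict(X)) ** 2)
    print(f"{name:>17}: SSE = {sse:,.1f}")
```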
A new approach for estimating the baud time (or baud rate) of a random binary signal is presented. This approach utilizes the spectrum of the signal after nonlinear processing, in such a way that the estimation error can be reduced simply by increasing the number of processed samples instead of increasing the sampling rate. The spectrum of the new signal is shown to give an accurate estimate of the baud time when there is no a priori information and no restricting assumptions. The performance of the estimator for random binary square waves perturbed by white Gaussian noise and ISI is evaluated and compared with that of the conventional zero-crossing detector estimator.
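A minimal sketch of the general idea (not necessarily the paper's exact nonlinearity): rectifying the derivative of a noisy random NRZ square wave turns every symbol transition into a pulse, so the spectrum of the processed signal carries lines at multiples of the baud rate; the fundamental is picked out by checking integer sub-multiples of the strongest line.

```python
import numpy as np

rng = np.random.default_rng(1)
fs, baud, n_sym = 10_000, 400, 500          # sampling rate, true baud rate, symbol count
sps = fs // baud
bits = rng.integers(0, 2, n_sym) * 2 - 1    # random +/-1 symbols
x = np.repeat(bits, sps).astype(float)      # NRZ square wave
x += 0.2 * rng.standard_normal(x.size)      # additive white Gaussian noise

# Nonlinear processing: differentiate and rectify, producing a pulse at each transition.
y = np.abs(np.diff(x))
Y = np.abs(np.fft.rfft(y - y.mean()))
freqs = np.fft.rfftfreq(y.size, d=1 / fs)

# The strongest spectral line may sit on a harmonic k/T; take the smallest
# integer sub-multiple of it that still carries a strong line.
peak_f = freqs[np.argmax(Y)]
est = peak_f
for k in range(2, 6):
    idx = np.argmin(np.abs(freqs - peak_f / k))
    if Y[idx] > 0.5 * Y.max():
        est = freqs[idx]
print(f"estimated baud rate ~ {est:.1f} Hz (true {baud} Hz)")
```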
The following research is titled "Social intelligence and its role in demonstrating the potential abilities of individuals." The discussion deals with contemporary concepts of great importance because of their significant role in influencing the work of the organization; it links the concept of social intelligence with potential abilities, using the first to reveal the second. The research hypotheses were tested in three health institutions in the city of Mosul; the research community comprised Al-Salam Hospital, the General Hospital, and Ibn al-Atheer Hospital, while the sample consisted of the leaders of these institutio…
This paper is concerned with pre-test single and double stage shrunken estimators for the mean (μ) of a normal distribution when a prior estimate (μ0) of the actual value (μ) is available, using specified shrinkage weight factors ψ(·) as well as a pre-test region (R). Expressions for the bias [B(·)], mean squared error [MSE(·)], efficiency [EFF(·)] and expected sample size [E(n/μ)] of the proposed estimators are derived. Numerical results and conclusions are drawn about the selection of the different constants included in these expressions. Comparisons of the suggested estimators with the classical estimators, in terms of bias and relative efficiency, are given. Furthermore, comparisons with earlier existing works are drawn.
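As a hedged illustration of the single-stage case, the Monte Carlo sketch below estimates the bias, MSE and relative efficiency of a pre-test shrunken estimator that pulls the sample mean toward the prior guess μ0 only when it falls inside the pre-test region R; the weight ψ, the region R, and the simulation settings are arbitrary choices, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, mu0, sigma, n = 5.0, 4.8, 2.0, 10     # true mean, prior guess, sd, sample size
psi, R = 0.3, 1.5                          # weight on the sample mean; pre-test half-width

reps = 100_000
xbar = rng.normal(mu, sigma / np.sqrt(n), size=reps)     # simulated sample means
# Shrink toward mu0 only when xbar falls in the pre-test region around mu0;
# otherwise keep the classical estimator xbar.
inside = np.abs(xbar - mu0) <= R
est = np.where(inside, psi * xbar + (1 - psi) * mu0, xbar)

bias = est.mean() - mu
mse = np.mean((est - mu) ** 2)
mse_classical = np.mean((xbar - mu) ** 2)
print(f"Bias = {bias:.4f}, MSE = {mse:.4f}, "
      f"relative efficiency vs. classical = {mse_classical / mse:.3f}")
```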
Steganography is a technique for concealing secret data within other quotidian files of the same or a different type. Hiding data has been essential to digital information security. This work aims to design a stego method that can effectively hide a message inside the images of a video file. A video steganography model is proposed by training a model to hide a video (or images) within another video using convolutional neural networks (CNNs). By using a CNN in this approach, two main goals of any steganographic method can be achieved: increased security (it is hard for steganalysis programs to observe and break the hiding), which is achieved in this work since the weights and architecture are randomized. Thus,…
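A compact PyTorch sketch of the hide/reveal idea; the architecture, sizes, and loss weights are illustrative assumptions, not the paper's model. One CNN embeds a secret frame into a cover frame, a second CNN recovers it, and both are trained jointly so the stego frame stays close to the cover while the secret remains recoverable.

```python
import torch
import torch.nn as nn

class HideNet(nn.Module):
    """Takes cover and secret frames stacked on the channel axis (6 channels)
    and produces a 3-channel stego frame."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(6, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1), nn.Sigmoid())
    def forward(self, cover, secret):
        return self.net(torch.cat([cover, secret], dim=1))

class RevealNet(nn.Module):
    """Recovers the secret frame from the stego frame alone."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1), nn.Sigmoid())
    def forward(self, stego):
        return self.net(stego)

hide, reveal = HideNet(), RevealNet()
opt = torch.optim.Adam(list(hide.parameters()) + list(reveal.parameters()), lr=1e-3)
cover, secret = torch.rand(4, 3, 64, 64), torch.rand(4, 3, 64, 64)  # stand-in frames
for step in range(5):                      # trivially short loop, for illustration only
    stego = hide(cover, secret)
    recovered = reveal(stego)
    # Keep the stego frame close to the cover while recovering the secret.
    loss = nn.functional.mse_loss(stego, cover) + nn.functional.mse_loss(recovered, secret)
    opt.zero_grad(); loss.backward(); opt.step()
print("final loss:", float(loss))
```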
The logistic regression model is regarded as one of the important regression models, and it has been among the most interesting subjects in recent studies because it takes on an increasingly advanced role in the process of statistical analysis.
Ordinary estimation methods fail in dealing with data that contain outlier values, since they rely on the absence of such values, which have an undesirable effect on the results.
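The robust estimator studied in the paper is not identified in this excerpt, so the sketch below shows only one generic robustification strategy for comparison: fit an ordinary logistic regression, then iteratively down-weight observations with large Pearson residuals and refit.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(5)
n = 300
x = rng.normal(size=(n, 1))
p = 1 / (1 + np.exp(-(0.5 + 2.0 * x[:, 0])))
y = rng.binomial(1, p)
x[:10] += 8.0
y[:10] = 0                                # contaminate a few observations (outliers)

plain = LogisticRegression().fit(x, y)    # ordinary maximum-likelihood fit

# Generic robustification: down-weight points with large Pearson residuals, refit.
w = np.ones(n)
for _ in range(3):
    robust = LogisticRegression().fit(x, y, sample_weight=w)
    p_hat = robust.predict_proba(x)[:, 1]
    r = (y - p_hat) / np.sqrt(p_hat * (1 - p_hat) + 1e-9)   # Pearson residuals
    w = 1.0 / (1.0 + r ** 2)                                 # smaller weight for outliers

print("ordinary coefficients:  ", plain.intercept_, plain.coef_.ravel())
print("reweighted coefficients:", robust.intercept_, robust.coef_.ravel())
```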
This research aims to analyze and simulate real biochemical test data in order to uncover the relationships among the tests and how each of them impacts the others. The data were acquired from an Iraqi private biochemical laboratory; however, they have many dimensions, a high rate of null values, and a large number of patients. Several experiments were applied to these data, beginning with unsupervised techniques such as hierarchical clustering and k-means, but the results were not clear. A preprocessing step was then performed to make the dataset analyzable by supervised techniques such as Linear Discriminant Analysis (LDA), Classification And Regression Trees (CART), Logistic Regression (LR), K-Nearest Neighbor (K-NN), and Naïve Bayes (NB)…
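A hedged sketch of the supervised stage using scikit-learn: mean imputation of the null values followed by the five classifiers named above, compared by cross-validated accuracy. The public breast-cancer dataset with artificially injected missing values stands in for the laboratory data, which are not available here.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.impute import SimpleImputer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)           # stand-in for the laboratory data
rng = np.random.default_rng(6)
X[rng.random(X.shape) < 0.10] = np.nan                # simulate a high rate of null values

models = {
    "LDA":  LinearDiscriminantAnalysis(),
    "CART": DecisionTreeClassifier(random_state=0),
    "LR":   LogisticRegression(max_iter=2000),
    "K-NN": KNeighborsClassifier(),
    "NB":   GaussianNB(),
}
for name, clf in models.items():
    # Impute nulls, standardize, then classify; evaluate with 5-fold cross-validation.
    pipe = make_pipeline(SimpleImputer(strategy="mean"), StandardScaler(), clf)
    print(f"{name:>4}: mean CV accuracy = {cross_val_score(pipe, X, y, cv=5).mean():.3f}")
```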