In this paper, a new variable selection method is presented for selecting essential variables from large datasets. The new model is a modified version of the Elastic Net model. The modified Elastic Net variable selection model is summarized in an algorithm and applied to a leukemia dataset with 3051 variables (genes) and 72 samples. In practice, working with a dataset of this size is difficult because of the large number of variables. The modified model is compared with several standard variable selection methods and shows the best performance, achieving perfect classification. All calculations for this paper were carried out in R using existing packages.
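The following minimal sketch illustrates Elastic Net-based gene selection of the kind described above. The paper's own analysis was done in R with existing packages; this sketch uses Python with scikit-learn instead, and the data shapes, labels, and penalty settings are illustrative assumptions rather than the authors' configuration.

```python
# Minimal sketch: Elastic Net variable (gene) selection for a two-class
# expression matrix. Placeholder data; not the authors' R implementation.
import numpy as np
from sklearn.linear_model import LogisticRegressionCV

rng = np.random.default_rng(0)
X = rng.normal(size=(72, 3051))      # 72 samples x 3051 genes (placeholder data)
y = rng.integers(0, 2, size=72)      # two leukemia classes (placeholder labels)

# Elastic Net = mixed L1/L2 penalty; l1_ratios sets the mix, Cs the overall strength.
model = LogisticRegressionCV(
    penalty="elasticnet", solver="saga", l1_ratios=[0.5],
    Cs=10, cv=5, max_iter=5000, scoring="accuracy",
).fit(X, y)

selected = np.flatnonzero(model.coef_.ravel() != 0)   # genes with non-zero coefficients
print(f"{selected.size} genes retained out of {X.shape[1]}")
```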
This research studies paired (panel) data models with mixed parameters, which contain two types of parameters: one random and one fixed. The random parameter arises from differences in the marginal slopes across the cross-sections, while the fixed parameter arises from differences in the intercepts, together with the random errors of each section. These random errors exhibit heteroscedasticity as well as first-order serial correlation. The main objective of this research is to use efficient methods suited to paired data with small samples; to achieve this goal, the feasible generalized least squares…
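As a rough illustration of this estimation strategy, the sketch below implements a simple two-step feasible GLS for panel data whose errors show groupwise heteroscedasticity and first-order (AR(1)) serial correlation. The quasi-differencing scheme, weighting, and variable names are illustrative assumptions in Python, not the authors' exact estimator.

```python
# Minimal sketch: two-step feasible GLS for panel data with groupwise
# heteroscedasticity and AR(1) errors (illustrative, not the paper's estimator).
import numpy as np

def fgls_ar1_panel(X, y, groups):
    """X: (n, k) regressors incl. intercept, y: (n,), groups: (n,) section ids."""
    beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)        # step 1: pooled OLS
    resid = y - X @ beta_ols

    Xs, ys = [], []
    for g in np.unique(groups):
        e = resid[groups == g]
        rho = (e[:-1] @ e[1:]) / (e[:-1] @ e[:-1])           # AR(1) coefficient per section
        Xg, yg = X[groups == g], y[groups == g]
        Xt = Xg[1:] - rho * Xg[:-1]                          # quasi-difference transform
        yt = yg[1:] - rho * yg[:-1]
        sigma = np.std(yt - Xt @ beta_ols)                   # rough groupwise error scale
        Xs.append(Xt / sigma)                                # weight out heteroscedasticity
        ys.append(yt / sigma)

    Xw, yw = np.vstack(Xs), np.concatenate(ys)
    beta_fgls, *_ = np.linalg.lstsq(Xw, yw, rcond=None)      # step 2: GLS on transformed data
    return beta_fgls
```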
This research aims to study dimension-reduction methods that overcome the curse of dimensionality, which arises when traditional methods fail to provide good parameter estimates, so the problem must be dealt with directly. Two approaches were used to handle high-dimensional data: the non-classical sliced inverse regression (SIR) method, together with a proposed weighted standard SIR (WSIR) method, and principal component analysis (PCA), the general method used for dimension reduction. Both SIR and PCA are based on forming linear combinations of a subset of the original explanatory variables, which may suffer from the problem of heterogeneity and the problem of linear…
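For reference, the sketch below shows the core of a basic SIR computation: center and whiten the predictors, slice the response, and take the leading eigenvectors of the weighted covariance of the slice means. It is a minimal Python sketch assuming more observations than variables; the number of slices and directions are arbitrary choices, and it does not include the proposed WSIR weighting.

```python
# Minimal sketch: sliced inverse regression (SIR) directions.
import numpy as np

def sir_directions(X, y, n_slices=10, n_dirs=2):
    n, p = X.shape
    Z = X - X.mean(axis=0)
    # Whiten the predictors: Zw = Z * Sigma^(-1/2) (assumes n > p, full-rank covariance).
    cov = np.cov(Z, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Zw = Z @ inv_sqrt

    # Slice the response and average the whitened predictors within each slice.
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    M = np.zeros((p, p))
    for idx in slices:
        m = Zw[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)     # weighted covariance of slice means

    # Leading eigenvectors of M, mapped back to the original scale, span the EDR space.
    _, vecs = np.linalg.eigh(M)
    return inv_sqrt @ vecs[:, -n_dirs:]
```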
This research is concerned with the re-analysis of optical data (the imaginary part of the dielectric function as a function of photon energy E) of a-Si:H films prepared by Jackson et al. and Ferlauto et al. Using nonlinear regression fitting, we estimated the optical energy gap and the deviation from the Tauc model by treating the exponent p, which describes the photon-energy dependence of the momentum matrix element, as a free parameter, assuming a square-root density-of-states distribution. For the films prepared by Jackson et al., it is observed that the value of the parameter p over the studied photon energy range is close to the value assumed by the Cody model, and the optical gap energy is also close to the value…
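The sketch below shows how such a nonlinear fit can be set up with scipy, using an assumed generalized band-edge form ε2(E) = C·E^(p−2)·(E − Eg)², which reduces to the Tauc form for p = 0 and the Cody form for p = 2; the exact parameterization used by the authors may differ, and the data points are placeholders.

```python
# Minimal sketch: nonlinear fit of eps2(E) with the optical gap Eg and the
# matrix-element exponent p as free parameters (assumed functional form).
import numpy as np
from scipy.optimize import curve_fit

def eps2_model(E, C, Eg, p):
    return C * E ** (p - 2.0) * np.clip(E - Eg, 0.0, None) ** 2

# Placeholder data standing in for the digitized eps2(E) spectra.
E = np.linspace(1.6, 3.0, 50)
eps2 = eps2_model(E, 40.0, 1.7, 1.5) + np.random.default_rng(1).normal(0, 0.05, E.size)

(C, Eg, p), _ = curve_fit(eps2_model, E, eps2, p0=[30.0, 1.6, 1.0])
print(f"Eg ≈ {Eg:.3f} eV, p ≈ {p:.2f}")
```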
Abstract
The problem of missing data represents a major obstacle for researchers in data analysis across different fields, since it recurs in all areas of study, including social, medical, astronomical, and clinical experiments.
The presence of this problem in the data under study may negatively influence the analysis and may lead to misleading conclusions, since such conclusions carry a large bias caused by the missingness. Despite the efficiency of wavelet methods, they too are affected by missing data, in addition to the resulting loss of estimation accuracy…
Abstract
This research presents theoretical aspects of one of the most important statistical distributions, the Lomax distribution, which has many applications in several areas. A set of estimation methods (MLE, LSE, GWPM) was used and compared with the (RRE) estimation method in order to find the best estimation method. A set of 36 simulation experiments, each with many replications, was run to obtain the mean square error used for the comparison; the experiments varied the estimation method, the sample size, and the values of the location and shape parameters. The results show that the estimation methods are affected by the simulation factors, and that other estimation methods, such as shrinkage and jackknife, could also be applied…
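The sketch below illustrates a single cell of such a simulation comparison in Python: generate Lomax samples, estimate the parameters by maximum likelihood, and accumulate the mean square error over replications. The parameter values, sample size, and replication count are illustrative assumptions, and only the MLE branch of the comparison is shown.

```python
# Minimal sketch: one simulation cell for MSE of the Lomax MLE (illustrative settings).
import numpy as np
from scipy.stats import lomax

shape_true, scale_true = 3.0, 1.0
n, replications = 50, 1000
rng = np.random.default_rng(42)

sq_errors = []
for _ in range(replications):
    sample = lomax.rvs(shape_true, scale=scale_true, size=n, random_state=rng)
    # MLE of the Lomax parameters (location fixed at 0).
    shape_hat, _, _ = lomax.fit(sample, floc=0)
    sq_errors.append((shape_hat - shape_true) ** 2)

print(f"MSE of the MLE for the shape parameter: {np.mean(sq_errors):.4f}")
```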
Objective: Multiple sclerosis (MS) is a common neurological disease deeply linked with immune-inflammatory disorders; the term "multiple" mostly refers to the multifocal zones of inflammation caused by lymphocyte and macrophage infiltration, along with oligodendrocyte death. Accordingly, the dysfunctional immune system is able to damage myelin, a pivotal component of the central nervous system responsible for communication among neurons. The aim of the present study is to establish a biochemical relationship between MS and thyroid hormones (THs) by highlighting immunological responses, and also to examine the action of the Interferon beta (IFNβ) drug on thyroid hormones (THs) and thyroid-stimulating hormone (TSH). Materials and…
Batch and flow injection (FI) spectrophotometric methods are described for the determination of barbituric acid in aqueous and urine samples. The method is based on the oxidative coupling reaction of barbituric acid with 4-aminoantipyrine and potassium iodate to form a stable, water-soluble purple product measured at λ = 510 nm. Good linearity was obtained for both methods, ranging from 2 to 60 μg mL−1 for the batch technique and 5 to 100 μg mL−1 for the FI technique. Limits of detection (signal/noise = 3) of 0.45 μg mL−1 for the batch method and 0.48 μg mL−1 for FI analysis were obtained. The proposed methods were applied successfully to the determination of barbituric acid in tap water, river water, and urine samples, with good recoveries of 99.92…
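As a worked illustration of how figures of merit like these are derived, the sketch below fits a linear calibration line and computes a 3σ/slope detection limit. The concentration standards, absorbance readings, and blank standard deviation are placeholder values, not the paper's measurements.

```python
# Minimal sketch: linear calibration curve and 3*sigma/slope detection limit
# (placeholder absorbance data, not the reported measurements).
import numpy as np

conc = np.array([2, 10, 20, 30, 40, 50, 60], dtype=float)        # standards, ug/mL
absorbance = np.array([0.031, 0.152, 0.305, 0.451, 0.603, 0.748, 0.901])

slope, intercept = np.polyfit(conc, absorbance, 1)                # linear calibration fit
blank_sd = 0.0022                                                 # sd of repeated blank readings (assumed)

lod = 3 * blank_sd / slope                                        # LOD at signal/noise = 3
print(f"A = {slope:.4f}*C + {intercept:.4f},  LOD ≈ {lod:.2f} ug/mL")
```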