In high-dimensional semiparametric regression, balancing accuracy and interpretability often requires combining dimension reduction with variable selection. This study introduces two novel methods for dimension reduction in additive partial linear models: (i) minimum average variance estimation (MAVE) combined with the adaptive least absolute shrinkage and selection operator (MAVE-ALASSO) and (ii) MAVE with smoothly clipped absolute deviation (MAVE-SCAD). These methods leverage the flexibility of MAVE for sufficient dimension reduction while incorporating adaptive penalties to ensure sparse and interpretable models. The performance of both methods is evaluated through simulations using the mean squared error and variable selection criteria, assessing the correct detection of zero coefficients and the false omission of nonzero coefficients. A practical application involving financial data from the Baghdad Soft Drinks Company demonstrates their utility in identifying key predictors of stock market value. The results indicate that MAVE-SCAD performs well in high-dimensional and complex scenarios, whereas MAVE-ALASSO is better suited to small samples, producing more parsimonious models. These results highlight the effectiveness of these two methods in addressing key challenges in semiparametric modeling.
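As a point of reference for the penalties named above, here is a minimal sketch of the SCAD penalty of Fan and Li that MAVE-SCAD attaches to the MAVE objective. The value a = 3.7 is the conventional recommendation; `lam` is the tuning parameter. This is an illustration of the penalty function only, not the paper's estimation code.

```python
def scad_penalty(theta, lam, a=3.7):
    """SCAD penalty: linear near zero, quadratic blend, then constant,
    so large coefficients are left nearly unpenalized (unlike the lasso)."""
    t = abs(theta)
    if t <= lam:
        return lam * t                                    # lasso-like near zero
    if t <= a * lam:
        return (2 * a * lam * t - t**2 - lam**2) / (2 * (a - 1))  # quadratic transition
    return lam**2 * (a + 1) / 2                           # flat for large |theta|
```

The adaptive lasso used by MAVE-ALASSO instead reweights the L1 penalty by the inverse of an initial estimate, penalizing small preliminary coefficients more heavily.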
In this paper, the theoretical cross section of a pre-equilibrium nuclear reaction has been studied for the reaction at an energy of 22.4 MeV. Ericson's formula for the partial level density (PLD) and its corrections (Williams' correction and the spin correction) have been substituted into the theoretical cross section and compared with the experimental data for the nucleus. It has been found that the theoretical cross section with the one-component PLD from Ericson's formula does not agree with the experimental values; there is slight agreement with the experimental cross section only at the high end of the energy range. The theoretical cross section that depends on the one-component Williams' formula and the one-component PLD corrected for spin...
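For orientation, a hedged sketch of the one-component partial level densities named above, in their standard textbook forms (Ericson's exciton PLD and Williams' Pauli-corrected version). Here `g` is the single-particle level density, `p`/`h` the particle/hole numbers, and `E` the excitation energy in MeV; the parameter values in the test are placeholders, not the paper's.

```python
from math import factorial

def pld_ericson(p, h, E, g):
    """Ericson one-component PLD: g*(g*E)**(n-1) / (p! * h! * (n-1)!), n = p + h."""
    n = p + h
    return g * (g * E) ** (n - 1) / (factorial(p) * factorial(h) * factorial(n - 1))

def pld_williams(p, h, E, g):
    """Williams' correction replaces g*E with g*E - A_ph,
    A_ph = (p**2 + h**2 + p - 3*h) / 4 (the Pauli blocking term)."""
    n = p + h
    A = (p**2 + h**2 + p - 3 * h) / 4.0
    u = g * E - A
    if u <= 0:
        return 0.0
    return g * u ** (n - 1) / (factorial(p) * factorial(h) * factorial(n - 1))
```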
In this paper, a new variable selection method is presented to select essential variables from large datasets. The new model is a modified version of the Elastic Net model. The modified Elastic Net variable selection model has been summarized in an algorithm. It is applied to a leukemia dataset that has 3051 variables (genes) and 72 samples. In practice, working with a dataset of this kind is difficult because of its large size. The modified model is compared to some standard variable selection methods. Perfect classification is achieved by applying the modified Elastic Net model because it has the best performance. All the calculations that have been done for this paper are in
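To make the underlying machinery concrete, here is a minimal coordinate-descent sketch of the standard (unmodified) Elastic Net used for variable selection: coefficients driven to zero by the L1 part mark variables that can be dropped. This is a generic illustration under made-up data, not the paper's modified algorithm.

```python
import numpy as np

def soft(z, t):
    """Soft-thresholding operator used by the L1 part of the penalty."""
    return np.sign(z) * max(abs(z) - t, 0.0)

def elastic_net(X, y, lam=0.1, alpha=0.5, n_iter=200):
    """Minimize (1/2n)||y - X b||^2 + lam*(alpha*||b||_1 + (1-alpha)/2*||b||^2)
    by cyclic coordinate descent."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ beta + X[:, j] * beta[j]          # partial residual w/o feature j
            rho = X[:, j] @ r / n
            beta[j] = soft(rho, lam * alpha) / (col_sq[j] + lam * (1 - alpha))
    return beta
```

Variables with exactly zero (or negligible) coefficients are the ones the selection step discards.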
The experimental proton resonance data for the reaction p + 48Ti have been used to calculate and evaluate the level density by employing the Gaussian Orthogonal Ensemble (GOE) version of random matrix theory (RMT), the Constant Temperature (CT) model, and the Back-Shifted Fermi Gas (BSFG) model at a given spin-parity and at different proton energies. The results of the GOE model are found to be in agreement with the others, while the level density calculated using the BSFG model showed lower values, with more dependence on spin than on parity, owing to the limitations of its parameters (the level density parameter a, the energy shift parameter E1, and the spin cut-off parameter σc). Also, in the CT model the level density results depend mainly on two parameters (T and the ground-state back-shift energy E0), which are app
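For reference, the two closed-form models named above have standard textbook expressions, sketched below with the parameters the abstract lists. The parameter values in the test are illustrative placeholders, not fitted values from the paper.

```python
from math import exp, sqrt

def rho_bsfg(E, a, E1, sigma_c):
    """Back-Shifted Fermi Gas level density:
    exp(2*sqrt(a*U)) / (12*sqrt(2) * sigma_c * a**0.25 * U**1.25), U = E - E1."""
    U = E - E1
    if U <= 0:
        return 0.0
    return exp(2 * sqrt(a * U)) / (12 * sqrt(2) * sigma_c * a**0.25 * U**1.25)

def rho_ct(E, T, E0):
    """Constant Temperature level density: (1/T) * exp((E - E0) / T)."""
    return exp((E - E0) / T) / T
```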
In this research, some probabilistic characteristic functions (the probability density, the characteristic function, the correlation, and the spectral density) are derived based on the smallest variance of the exact solution of an assumed stochastic non-linear Fredholm integral equation of the second kind, found by the Adomian decomposition method (ADM).
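As a toy illustration of the solution machinery (the paper treats a stochastic non-linear equation; shown here is only the deterministic linear case, where the Adomian decomposition reduces to the Neumann series), consider u(x) = f(x) + lam * ∫₀¹ K(x,t) u(t) dt with u₀ = f and u_{k+1} = lam * ∫ K u_k dt. The kernel, f, and lam below are made-up examples.

```python
import numpy as np

def adm_fredholm(f, K, lam, n_terms=25, n_quad=201):
    """Sum the Adomian/Neumann series for a linear Fredholm equation of the
    second kind on [0, 1], using trapezoid quadrature for the integral."""
    t = np.linspace(0.0, 1.0, n_quad)
    w = np.full(n_quad, t[1] - t[0])
    w[0] = w[-1] = (t[1] - t[0]) / 2            # trapezoid weights
    Kmat = K(t[:, None], t[None, :])
    u_k = f(t)                                   # u_0 = f
    u = u_k.copy()
    for _ in range(n_terms):
        u_k = lam * Kmat @ (w * u_k)             # u_{k+1} = lam * integral of K u_k
        u += u_k
    return t, u

# Example: K(x,t) = x*t, f(x) = x, lam = 1/2 has exact solution u(x) = 1.2*x.
t, u = adm_fredholm(lambda x: x, lambda x, tt: x * tt, 0.5)
```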
Heart disease is a significant and impactful health condition that ranks as the leading cause of death in many countries. To aid physicians in diagnosing cardiovascular diseases, clinical datasets are available for reference. However, with the rise of big data and large medical datasets, it has become increasingly challenging for medical practitioners to accurately predict heart disease because of the abundance of irrelevant and redundant features, which increase computational complexity and reduce accuracy. As such, this study aims to identify the most discriminative features within high-dimensional datasets while minimizing complexity and improving accuracy through an Extra Tree-based feature selection technique. The study assesses the efficac
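A minimal sketch of Extra Tree-based feature selection of the kind described above, on synthetic data rather than a clinical dataset: rank features by `ExtraTreesClassifier` impurity importance and keep those above the mean importance. The threshold rule and data are assumptions for illustration, not the paper's exact pipeline.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import ExtraTreesClassifier

# Synthetic stand-in for a high-dimensional clinical dataset.
X, y = make_classification(n_samples=300, n_features=50, n_informative=5,
                           n_redundant=5, random_state=0)

forest = ExtraTreesClassifier(n_estimators=200, random_state=0).fit(X, y)
importances = forest.feature_importances_
keep = importances > importances.mean()        # keep above-average features
X_reduced = X[:, keep]
```

Downstream classifiers are then trained on `X_reduced`, trading a small fit cost for lower dimensionality.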
Mixed-effects conditional logistic regression is evidently more effective for studying qualitative differences in longitudinal pollution data and their implications for heterogeneous subgroups. This study argues that conditional logistic regression is a robust evaluation method for environmental studies, through an analysis of environmental pollution as a function of oil production and environmental factors. Consequently, it has been established theoretically that the primary objective of model selection in this research is to identify the candidate model that is optimal for the conditional design. The candidate model should achieve generalizability, goodness of fit, and parsimony, and strike a balance between bias and variability
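The model selection criteria implied by the passage (trading goodness of fit against parsimony) are conventionally computed as below; this is a generic helper, not code from the study. `loglik` is the fitted model's log-likelihood, `k` its parameter count, and `n` the sample size.

```python
from math import log

def aic(loglik, k):
    """Akaike information criterion: 2k - 2*ln(L); lower is better."""
    return 2 * k - 2 * loglik

def bic(loglik, k, n):
    """Bayesian information criterion: k*ln(n) - 2*ln(L); penalizes
    parameters more heavily than AIC for n > e**2."""
    return k * log(n) - 2 * loglik
```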
In this paper, a new approach to the Gauss-Seidel method is suggested, based on controlling the arrangement of the equations before applying the method in its traditional form. A new arrangement of the equations is obtained after diagnosing the variable that causes fluctuation and slow convergence of the results, and then eliminating this variable's effect. This procedure leads to higher accuracy and fewer steps than the old method. Using the proposed method, it becomes possible to solve many systems with divergent behavior that cannot be solved in the old style.
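One common way to realize such a rearrangement is sketched below (an illustration of the idea, not necessarily the paper's exact procedure): plain Gauss-Seidel diverges when the equations are ordered so that the diagonal is weak, and moving each equation's largest coefficient onto the diagonal restores convergence.

```python
import numpy as np

def reorder_rows(A, b):
    """Send each row to the position of its largest-magnitude coefficient,
    strengthening the diagonal (assumes those positions are all distinct)."""
    order = np.argmax(np.abs(A), axis=1)
    perm = np.empty(len(b), dtype=int)
    perm[order] = np.arange(len(b))
    return A[perm], b[perm]

def gauss_seidel(A, b, n_iter=100):
    """Classic Gauss-Seidel sweep, updating components in place."""
    x = np.zeros(len(b))
    for _ in range(n_iter):
        for i in range(len(b)):
            x[i] = (b[i] - A[i] @ x + A[i, i] * x[i]) / A[i, i]
    return x
```

On the system {x + 4y = 6, 5x + y = 7}, the original ordering diverges while the reordered one converges to (22/19, 23/19).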
Optimizing Access Point (AP) deployment is of great importance in wireless applications, owing to the requirement to provide efficient and cost-effective communication. Quality of Service (QoS), a primary parameter and objective heavily targeted by researchers and industry, must be kept in mind along with AP placement and overall deployment cost. This study proposes and investigates a multi-level optimization algorithm based on Binary Particle Swarm Optimization (BPSO). It aims at an optimal multi-floor AP placement with effective coverage, making it more capable of supporting QoS and cost-effectiveness. Five pairs of (coverage, AP placement) weights, signal threshold
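A toy sketch of BPSO applied to AP placement follows (an illustration of the technique, not the paper's multi-level algorithm). Each bit of a particle marks whether a candidate site gets an AP; velocities are updated as in continuous PSO and mapped to bits through a sigmoid. The line-of-sites geometry, coverage radius, and cost weight are all made up.

```python
import numpy as np

def bpso(fitness, n_bits, n_particles=20, n_iter=80, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Binary PSO: real-valued velocities, sigmoid-probabilistic bit flips."""
    rng = np.random.default_rng(seed)
    x = rng.integers(0, 2, (n_particles, n_bits)).astype(float)
    v = np.zeros((n_particles, n_bits))
    pbest, pfit = x.copy(), np.array([fitness(p) for p in x])
    g = pbest[np.argmax(pfit)].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random(v.shape), rng.random(v.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = (rng.random(v.shape) < 1 / (1 + np.exp(-v))).astype(float)  # sigmoid rule
        fit = np.array([fitness(p) for p in x])
        better = fit > pfit
        pbest[better], pfit[better] = x[better], fit[better]
        g = pbest[np.argmax(pfit)].copy()
    return g, pfit.max()

sites = np.array([0, 2, 4, 6, 8])      # candidate AP positions on a line
points = np.arange(10)                 # demand points to cover
cover = np.abs(points[None, :] - sites[:, None]) <= 1.5

def fitness(x):
    """Covered demand points minus a per-AP deployment cost."""
    covered = cover[x.astype(bool)].any(axis=0).sum() if x.any() else 0
    return covered - 1.2 * x.sum()

best, best_fit = bpso(fitness, len(sites))
```

A real deployment objective would replace `fitness` with the weighted coverage, QoS, and cost terms the study describes, evaluated per floor.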
In this research, a multi-period probabilistic inventory model is applied to the stores of raw materials used in the leather industry at the General Company for Leather Industries. The raw materials are: natural leather, which includes cowhide (whether imported or local), buffalo leather, lamb leather, and goat skin; chamois (raw materials made from natural leather); polished leather (raw materials made from natural leather); artificial leather (skai); accessories, which include cuffs, clocks, hands, and pockets; and threads. This model was built after testing and determining the distribution of demand during the supply period (waiting period) for each material, completely independently of the rest of the materials, as none of the above materials
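A hedged sketch of one building block of such a model: once the demand distribution over the supply (waiting) period has been identified for a material, the reorder point is the quantile of lead-time demand that meets a target service level. The normal distribution and the numbers in the test are illustrative assumptions, not estimates from the company's data.

```python
from statistics import NormalDist

def reorder_point(mean_lt_demand, sd_lt_demand, service_level):
    """Smallest stock level s with P(lead-time demand <= s) >= service_level,
    assuming normally distributed lead-time demand."""
    return NormalDist(mean_lt_demand, sd_lt_demand).inv_cdf(service_level)
```

Because the abstract notes that demands are independent across materials, this computation can be run separately per material.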