The penalized least squares method is a popular approach for high-dimensional data, where the number of explanatory variables is larger than the sample size. Its attractive properties are high prediction accuracy and the ability to perform estimation and variable selection at once. The penalized least squares method yields a sparse model, that is, a model with few variables, which can be interpreted easily. However, penalized least squares is not robust: it is very sensitive to the presence of outlying observations. To deal with this problem, a robust loss function can be used in place of the squared-error loss, giving a robust penalized least squares method and a robust penalized estimator.
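As an illustrative sketch (not the paper's exact estimator), a robust penalized least squares fit can be obtained by replacing the squared-error loss with the Huber loss and adding an L1 penalty, solved here by proximal gradient descent. The function names, step size, and tuning constants `lam` and `delta` are assumptions for this example:

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of the L1 penalty: shrink toward zero by t
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def robust_lasso(X, y, lam=0.1, delta=1.35, n_iter=500):
    """Huber-loss + L1 penalized regression via proximal gradient descent."""
    n, p = X.shape
    # Step size 1/L, where L = ||X||^2 / n bounds the Lipschitz
    # constant of the Huber-loss gradient (psi' <= 1)
    step = n / (np.linalg.norm(X, 2) ** 2)
    beta = np.zeros(p)
    for _ in range(n_iter):
        r = y - X @ beta
        psi = np.clip(r, -delta, delta)   # Huber influence function clips outliers
        grad = -X.T @ psi / n
        beta = soft_threshold(beta - step * grad, step * lam)
    return beta
```

Because the influence function is clipped at `delta`, a few grossly outlying responses contribute only a bounded amount to the gradient, which is the source of the robustness; the L1 proximal step produces the sparse (variable-selecting) solution.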
Analysis of economic and financial phenomena, among others, requires building an appropriate model that represents the causal relations between factors. Building the model depends on capturing the conditions and factors surrounding the phenomenon in a mathematical formula, and the researcher's target is to build that formula appropriately. Classical linear regression models are an important statistical tool, but they are used in a limited way, since they assume that the relationship between the explanatory variables and the response variable is of a known form. To expand the representation of the relationships between the variables that describe the phenomenon under discussion, we used Varying Coefficient Models.
This study discusses a biased estimator of the negative binomial regression model known as the Liu estimator. This estimator is used to reduce variance and overcome the problem of multicollinearity among the explanatory variables. Other estimators, such as the ridge regression and maximum likelihood estimators, were also considered. This research aims at theoretical comparisons between the new estimator (the Liu estimator) and these other estimators.
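For intuition, the linear-regression form of the Liu estimator can be sketched as below; the negative binomial version studied in the abstract applies the same shrinkage idea to the likelihood-based estimator, and the helper name and choice of `d` here are assumptions for illustration:

```python
import numpy as np

def liu_estimator(X, y, d=0.5):
    """Liu estimator for linear regression.

    beta(d) = (X'X + I)^{-1} (X'X + d I) beta_OLS,  0 < d < 1.
    d = 1 recovers ordinary least squares; smaller d shrinks more,
    trading a little bias for lower variance under multicollinearity.
    """
    p = X.shape[1]
    XtX = X.T @ X
    beta_ols = np.linalg.solve(XtX, X.T @ y)
    return np.linalg.solve(XtX + np.eye(p), (XtX + d * np.eye(p)) @ beta_ols)
```

The shrinkage matrix has eigenvalues (lambda_i + d)/(lambda_i + 1) < 1 for d < 1, so the Liu estimate always has smaller norm than the OLS estimate.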
Optimizing system performance in dynamic and heterogeneous environments and managing computational tasks efficiently are crucial. This paper therefore examines task scheduling and resource allocation algorithms in some depth. The work evaluates five algorithms: Genetic Algorithms (GA), Particle Swarm Optimization (PSO), Ant Colony Optimization (ACO), the Firefly Algorithm (FA), and Simulated Annealing (SA), across various workloads obtained by varying the task-to-node ratio. The paper identifies finish time and deadline as two key performance metrics for gauging the efficacy of an algorithm, and a comprehensive investigation of the behavior of these algorithms across different workloads was carried out.
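As a minimal sketch of one of the evaluated approaches, Simulated Annealing can assign tasks to nodes while minimizing the finish time (makespan). The task durations, cooling schedule, and parameter values below are assumptions for illustration, not the paper's experimental setup:

```python
import math
import random

def makespan(assign, durations, n_nodes):
    """Finish time: the load of the busiest node."""
    loads = [0.0] * n_nodes
    for task, node in enumerate(assign):
        loads[node] += durations[task]
    return max(loads)

def sa_schedule(durations, n_nodes, t0=10.0, cooling=0.995, n_iter=5000, seed=0):
    """Simulated annealing: move one task at a time, accept worse moves
    with probability exp(-delta / temperature) as the temperature cools."""
    rng = random.Random(seed)
    assign = [rng.randrange(n_nodes) for _ in durations]
    cur = best = makespan(assign, durations, n_nodes)
    best_assign = assign[:]
    t = t0
    for _ in range(n_iter):
        i = rng.randrange(len(durations))
        old = assign[i]
        assign[i] = rng.randrange(n_nodes)          # propose a reassignment
        new = makespan(assign, durations, n_nodes)
        if new <= cur or rng.random() < math.exp((cur - new) / t):
            cur = new                               # accept (possibly worse) move
            if cur < best:
                best, best_assign = cur, assign[:]
        else:
            assign[i] = old                         # reject: revert
        t *= cooling
    return best_assign, best
```

The early high-temperature phase lets the search escape poor local assignments; as the temperature decays the search becomes effectively greedy on the makespan.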
Different types of injection materials were tried for the injection of soft clay, namely lime (L), silica fume (SF), and leycobond-h (LH). In this study, experiments were carried out to study the effect of injection on the consolidation settlement of soft clay. A sample of natural soft clayey soil was investigated in the laboratory, and the sample was injected with each of the grout materials used: L, SF, L + SF, and L + SF + LH. A volume of 20 cm3 of each slurry grout was injected into the soil, which was compacted in a California Bearing Ratio (CBR) mold and cured for 7 days; the sample was then loaded to 80 N by a circular steel footing 60 mm in diameter, and the settlement was recorded.
Industrial effluents loaded with heavy metals are a source of hazards to humans and other forms of life. Conventional approaches, such as electroplating, ion exchange, and membrane processes, are used for the removal of copper, cadmium, and lead, but they are often cost-prohibitive and show low efficiency at low metal ion concentrations. Biosorption can be considered as an alternative that has proven more efficient and economical for removing these metal ions. Biosorbents that have been used include fungi, yeasts, oil palm shells, coir pith carbon, peanut husks, and olive pulp. Recently, low-cost natural products have also been researched as biosorbents. This paper examines the potential use of Iraqi date pits and Al-Khriet (i.e. substances l
Cloud computing is a solid buzzword in the trade. It is a paradigm in which services can be leveraged over the internet, reducing the cost and complexity for service providers. Cloud computing promises to cut capital and operational expenditure and, more specifically, to let IT departments focus on strategic projects instead of maintaining datacenters. There are several consequences of this arrangement. For one, the cloud provider takes on responsibilities that previously fell to the buyer: the provider takes custody of the servers, carries out software updates, and the user pays
The objective of this work is to design and implement a cryptography system that enables the sender to send a message through any channel (even an insecure one) and the receiver to decrypt the received message, without allowing any intruder to break the system and extract the secret information. In this work, we implement an interaction between a feedforward neural network and a stream cipher, so the secret message is encrypted by an unsupervised neural network method in addition to the first encryption process, which is performed by the stream cipher method. The security of any cipher system depends on the security of the related keys (those used by the encryption and decryption processes) and their corresponding lengths.
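The paper's neural-network stage is not specified here, so the sketch below shows only the generic stream-cipher component: encryption and decryption are the same XOR of the message with a key-derived keystream. The hash-based keystream generator is a stand-in assumption for illustration, not a cryptographically vetted design:

```python
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    """Derive n pseudo-random bytes from the key by iterated SHA-256 (illustrative only)."""
    out = b""
    block = key
    while len(out) < n:
        block = hashlib.sha256(block).digest()
        out += block
    return out[:n]

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Stream cipher core: XOR with the keystream. Applying it twice decrypts."""
    ks = keystream(key, len(data))
    return bytes(d ^ k for d, k in zip(data, ks))
```

Because XOR is its own inverse, `xor_cipher(xor_cipher(m, key), key)` returns the original message, which is why the same routine serves both the sender and the receiver.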