Variable selection is an essential task in statistical modeling. Several studies have tried to develop and standardize the variable selection process, but it is difficult to do so. The first question a researcher needs to ask is: what are the most significant variables for describing a given dataset's response? In this paper, a new method for variable selection using Gibbs sampler techniques is developed. First, the model is defined and the posterior distributions of all the parameters are derived. The new variable selection method is then tested on four simulated datasets and compared with some existing techniques: Ordinary Least Squares (OLS), the Least Absolute Shrinkage and Selection Operator (Lasso), and Tikhonov regularization (Ridge). The simulation studies show that our method outperforms the others in terms of error and time complexity. The methods are also applied to a real dataset, the Rock Strength Dataset, where the new Gibbs-sampler-based approach again proves more powerful and effective than the alternatives. All statistical computations for this paper were done using R version 4.0.3 on a single-processor computer.
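The core computational tool named above is the Gibbs sampler: drawing each parameter in turn from its full conditional distribution. As a hedged illustration only (not the paper's actual variable-selection model), the sketch below runs a minimal Gibbs sampler on a standard bivariate normal, where both conditionals are univariate normals; the correlation `rho`, chain length, and burn-in are arbitrary demo choices.

```python
import math
import random

def gibbs_bivariate_normal(rho, n_samples=5000, burn_in=500, seed=0):
    """Minimal Gibbs sampler for a standard bivariate normal with correlation rho.

    Full conditionals:
        x | y ~ N(rho * y, 1 - rho^2)
        y | x ~ N(rho * x, 1 - rho^2)
    """
    rng = random.Random(seed)
    sd = math.sqrt(1.0 - rho ** 2)
    x = y = 0.0
    draws = []
    for i in range(n_samples + burn_in):
        x = rng.gauss(rho * y, sd)   # draw x from its conditional given y
        y = rng.gauss(rho * x, sd)   # draw y from its conditional given x
        if i >= burn_in:
            draws.append((x, y))
    return draws

def sample_corr(draws):
    """Empirical correlation of the sampled pairs."""
    n = len(draws)
    mx = sum(x for x, _ in draws) / n
    my = sum(y for _, y in draws) / n
    sxy = sum((x - mx) * (y - my) for x, y in draws)
    sxx = sum((x - mx) ** 2 for x, _ in draws)
    syy = sum((y - my) ** 2 for _, y in draws)
    return sxy / math.sqrt(sxx * syy)
```

With enough iterations, the empirical correlation of the chain recovers the target `rho`, which is the sanity check one would run before using such a sampler for posterior inference.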
The aim of this research is to examine multiple intelligence test item selection based on Howard Gardner's MI model using the generalized partial credit model. The researcher adopted Gardner's multiple intelligences scale, which consists of 102 items across eight sub-scales. The sample consisted of 550 students from Baghdad universities (the University of Technology, Al-Mustansiriyah University, and the Iraqi University) for the academic year 2019/2020. The assumptions of unidimensional item response theory (unidimensionality, local independence, item characteristic curves, the speed factor, and application) were verified, and the data were analyzed according to the generalized partial credit model, and limits
Astronomical images are regarded as a main source of information for exploring outer space. To identify the basic content of the Milky Way galaxy, an image was classified using the Variable Precision Rough Sets technique to determine the different regions within the galaxy according to the different colors in the image. From the classified image, the percentage of each class can be determined, along with what that percentage means. This technique yields a well-classified image and requires less time to complete the classification process.
The development of information systems in recent years has contributed various methods of gathering information to evaluate IS performance. The most common approach used to collect information is the survey. This method, however, suffers from one major drawback: decision makers consume considerable time transforming data from survey sheets into analytical programs. As such, this paper proposes a method called 'survey algorithm based on R programming language', or SABR, for transforming survey-sheet data inside the R environment by treating the arrangement of data as a relational format. R and the relational data format provide an excellent opportunity to manage and analyse the accumulated data. Moreover, a survey system
Cloth simulation and animation has been a topic of computer graphics research since the mid-80s. Enforcing incompressibility is very important in real-time simulation. Although there have been great achievements in this regard, it still suffers from unnecessary time consumption in certain steps that are common in real-time applications. This research develops a real-time cloth simulator for a virtual human character (VHC) with wearable clothing. It achieves successful cloth simulation on the VHC by enhancing the position-based dynamics (PBD) framework, computing a series of positional constraints that enforce constant densities. Also, the self-collision and collision with
A botnet is one of many attacks that can execute malicious tasks and that develops continuously. Therefore, the current research introduces a comparison framework, called BotDetectorFW, with classification and complexity improvements for detecting botnet attacks using the CICIDS2017 dataset, a free online dataset consisting of several attacks with high-dimensional features. Feature selection is a significant step for obtaining the fewest features by eliminating irrelevant ones, thereby reducing detection time. This process is implemented inside BotDetectorFW using two steps: data clustering and five distance measure formulas (cosine, Dice, Driver & Kroeber, overlap, and Pearson correlation)
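Two of the five similarity measures named in that abstract, cosine and Dice, have standard closed forms. The sketch below is a generic illustration of those two formulas on feature vectors, not the BotDetectorFW implementation.

```python
import math

def cosine(a, b):
    """Cosine similarity: dot(a, b) / (|a| * |b|)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def dice(a, b):
    """Dice coefficient on binary vectors: 2|A∩B| / (|A| + |B|)."""
    inter = sum(1 for x, y in zip(a, b) if x and y)
    return 2 * inter / (sum(1 for x in a if x) + sum(1 for y in b if y))
```

In a clustering-based feature-selection scheme, such measures score how redundant two features are, so that one of each highly similar pair can be dropped.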
This paper proposes a new encryption method. It combines two cipher algorithms, DES and AES, to generate hybrid keys. This combination strengthens the proposed W-method by generating highly randomized keys. The reliability of any encryption technique can be represented by two points. The first is key generation: our approach merges 64 bits of DES with 64 bits of AES to produce 128 bits as a root key for the 15 remaining keys. This complexity increases the level of the ciphering process; moreover, it shifts the operation only one bit to the right. The second is the nature of the encryption process itself: it includes two keys and mixes one round of DES with one round of AES to reduce the running time. The W-method deals with
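The key-generation step described above, concatenating a 64-bit DES-derived half with a 64-bit AES-derived half into a 128-bit root key and rotating one bit to the right, can be sketched as pure bit manipulation. This is only a schematic of that arithmetic; the function names and the choice to derive each round key by one further rotation are illustrative assumptions, not the W-method itself.

```python
MASK128 = (1 << 128) - 1

def rotr128(v, n=1):
    """Rotate a 128-bit value right by n bits."""
    n %= 128
    return ((v >> n) | (v << (128 - n))) & MASK128

def root_key(des_half, aes_half):
    """Concatenate a 64-bit DES-derived half (high) with a 64-bit AES-derived half (low)."""
    return ((des_half & ((1 << 64) - 1)) << 64) | (aes_half & ((1 << 64) - 1))

def round_keys(root, count=15):
    """Illustrative schedule: each of the 15 keys is the previous one rotated right by 1."""
    keys, k = [], root
    for _ in range(count):
        k = rotr128(k)
        keys.append(k)
    return keys
```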
This research aims to analyze and simulate real biochemical test data to uncover the relationships among the tests and how each impacts the others. The data were acquired from an Iraqi private biochemical laboratory; however, they have many dimensions, a high rate of null values, and a large number of patients. Several experiments were applied to these data, beginning with unsupervised techniques such as hierarchical clustering and k-means, but the results were not clear. A preprocessing step was then performed to make the dataset analyzable by supervised techniques such as Linear Discriminant Analysis (LDA), Classification And Regression Tree (CART), Logistic Regression (LR), K-Nearest Neighbor (K-NN), Naïve Bayes (NB)
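Of the supervised techniques listed, K-NN is the simplest to show concretely: classify a point by majority vote among its k nearest training points. The toy, self-contained version below is a generic sketch, not the study's pipeline; the example data are made up.

```python
import math
from collections import Counter

def knn_predict(train, labels, x, k=3):
    """Classify x by majority vote among the k nearest points in train (Euclidean distance)."""
    nearest = sorted(range(len(train)), key=lambda i: math.dist(train[i], x))[:k]
    votes = Counter(labels[i] for i in nearest)
    return votes.most_common(1)[0][0]
```

Note that K-NN is sensitive to feature scale, which is one reason the preprocessing step mentioned in the abstract matters before applying such classifiers.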
This study aims at identifying the impact of applying IFRS 15, "Revenue from Contracts with Customers", on the quality of financial reporting, through an application to faculty members in the accounting departments of Iraqi universities and to auditors. The problem of the study was the multiplicity of accounting rules and standards dealing with revenue recognition, as well as the inconsistency of most of them with the common framework of financial accounting, which results in low quality of financial reporting in current financial statements. A single hypothesis was formulated: that there is no statistically significant relationship between the application of IFRS 15 "Recognition of rev
There is no doubt that all widespread industry is accompanied by the problem of environmental contamination, which is closely linked to increased industrial activity on the one hand and to the growing volume of industrial waste on the other, and hence to the risk posed to natural resources and ecosystems by development projects (especially industrial ones). There must therefore be a sound basis for siting industrial zones, a review of how well they comply with the conditions appropriate for preserving the environment, and strict laws enacted to achieve this, with preparations made well enough to avoid numerous social, economic, technical, environmental, and health errors.
The surface finish of a machined part is among the most important characteristics of product quality and an indispensable customer requirement. Taguchi robust parameter design was utilized in this paper to optimize surface finish in turning of 7025 Al-alloy using a carbide cutting tool. Three machining variables were used in the experiments: machining speed (1600, 1900, and 2200 rpm), depth of cut (0.25, 0.50, 0.75 mm), and feed rate (0.12, 0.18, 0.24 mm/min). The other variables were held constant. The mean surface finish was used as a measure of surface quality. The results clarified that increasing the speed reduces the surface roughness, while it rises with increasing depth of cut and feed
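Taguchi analysis of surface roughness typically scores each factor setting with the smaller-the-better signal-to-noise ratio, SN = -10·log10(mean(y²)), so that lower roughness yields a higher SN. The sketch below is a generic computation of that ratio; the example roughness values are invented, not the paper's measurements.

```python
import math

def sn_smaller_is_better(values):
    """Taguchi smaller-the-better S/N ratio: -10 * log10(mean of squared responses)."""
    mean_sq = sum(v * v for v in values) / len(values)
    return -10.0 * math.log10(mean_sq)
```

Comparing mean SN ratios across the levels of each factor (speed, depth of cut, feed rate) identifies the level combination that minimizes roughness.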