DEO: A Dynamic Event Order Strategy for t-way Sequence Covering Array Test Data Generation

Sequence covering array (SCA) generation has been an active research area in recent years. Unlike ordinary, sequence-less covering arrays (CA), the order of events matters in the test case generation process. This paper reviews the state of the art of SCA strategies; earlier works reported that finding a minimal test suite is an NP-hard problem. In addition, most existing strategies for SCA generation have a high order of complexity because they generate all combinatorial interactions in a one-test-at-a-time fashion. Reducing this complexity by adopting a one-parameter-at-a-time approach for SCA generation is challenging, and such a reduction also makes higher strengths of coverage easier to support. Motivated by this challenge, this paper proposes a novel SCA strategy called Dynamic Event Order (DEO), in which test cases are generated in a one-parameter-at-a-time fashion. The details of DEO are presented with a step-by-step example to demonstrate its behavior and show the correctness of the proposed strategy. This paper also compares DEO with existing computational strategies. The practical results demonstrate that the proposed DEO strategy outperforms the existing strategies in terms of minimal test size in most cases. Moreover, the significance of DEO increases as the number of sequences and/or the strength of coverage increases. Furthermore, the proposed DEO strategy succeeds in generating SCAs up to t=7 and in finding new upper bounds for SCAs. In fact, the proposed strategy can serve as a research vehicle for future variant implementations.
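
To make the coverage criterion concrete, the sketch below (Python, not the authors' DEO implementation; the event names and example test suite are hypothetical) enumerates the t-way event sequences that a candidate test suite still leaves uncovered.

from itertools import permutations

def is_subsequence(seq, test):
    # True if 'seq' occurs in 'test' in order (not necessarily contiguously)
    it = iter(test)
    return all(event in it for event in seq)

def uncovered_sequences(tests, events, t):
    # All t-way orderings of distinct events not covered by any test
    return [seq for seq in permutations(events, t)
            if not any(is_subsequence(seq, test) for test in tests)]

# Hypothetical example: 4 events, strength t = 3
events = ["a", "b", "c", "d"]
tests = [("a", "b", "c", "d"), ("d", "c", "b", "a")]
print(len(uncovered_sequences(tests, events, 3)))  # number of 3-way sequences still to cover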

Publication Date
Mon Feb 14 2022
Journal Name
Journal Of Educational And Psychological Researches
Comparison between Rush Model Parameters to Completed and Lost Data by Different Methods of Processing Missing Data

The current study aims to compare the estimates of the Rush model's parameters for complete and missing data under various methods of processing missing data. To achieve this aim, the researcher followed these steps: preparing Philip Carter's spatial ability test, which consists of (20) items, and administering it to a group of (250) sixth scientific stage students in the directorates of Baghdad Education at Al-Rusafa (1st, 2nd and 3rd) for the academic year (2018-2019). Then, the researcher relied on a single-parameter model to analyze the data and used the Bilog-mg3 program to check the hypotheses and the fit of the data to the model. In addition
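
For reference, the single-parameter logistic model referred to above is commonly written as follows (a standard formulation, not quoted from the paper), where \theta_i is the ability of examinee i and b_j the difficulty of item j:

P(X_{ij} = 1 \mid \theta_i, b_j) = \frac{\exp(\theta_i - b_j)}{1 + \exp(\theta_i - b_j)}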

Publication Date
Fri Aug 30 2024
Journal Name
Iraqi Journal Of Science
The Dissipation of the Kinetic Energy for 2D Bounded Flow by Using Moment-Based Boundary Conditions with Burnett Order Stress for LBM

In this article, the lattice Boltzmann method with two relaxation times (TRT) for the D2Q9 model is used to obtain numerical results for 2D flow. The problem is set up to show the rate of kinetic energy dissipation and its relationship with the enstrophy growth for a 2D dipole-wall collision. The investigation is carried out for normal collision and for oblique incidence at an angle of . We demonstrate the accuracy of moment-based boundary conditions with slip and Navier-Maxwell slip conditions in simulating this flow. These conditions incorporate Burnett-order stress terms that are consistent with the discrete Boltzmann equation. Stable results are found by using this kind of boundary condition where d
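
For context, the standard identity linking these quantities in 2D incompressible flow (not taken from the article; exact for no-slip walls, with boundary contributions neglected under slip conditions) is

E(t) = \frac{1}{2}\int_{A} |\mathbf{u}|^{2}\, dA, \qquad
\Omega(t) = \frac{1}{2}\int_{A} \omega^{2}\, dA, \qquad
\frac{dE}{dt} = -2\,\nu\,\Omega(t),

where \nu is the kinematic viscosity and \omega the vorticity over the flow domain A.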

Publication Date
Mon May 11 2020
Journal Name
Baghdad Science Journal
Proposing Robust LAD-Atan Penalty of Regression Model Estimation for High Dimensional Data

Penalized regression models have received considerable attention for variable selection and play an essential role in dealing with high-dimensional data. The arctangent (Atan) penalty has recently been used as an efficient method for both estimation and variable selection. However, the Atan penalty is very sensitive to outliers in the response variable or to heavy-tailed error distributions, whereas least absolute deviation (LAD) is a good way to obtain robustness in regression estimation. The specific objective of this research is to propose a robust Atan estimator that combines these two ideas. Simulation experiments and real data applications show that the proposed LAD-Atan estimator
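
As a sketch of the combined objective (notation mine, not quoted from the paper), the LAD-Atan estimator replaces the squared-error loss with the absolute deviation and adds an arctangent-shaped penalty; one commonly used parameterization of the Atan penalty, with tuning parameters \lambda > 0 and \gamma > 0, is

\hat{\beta} = \arg\min_{\beta} \sum_{i=1}^{n} \left| y_i - x_i^{\top}\beta \right|
+ \sum_{j=1}^{p} p_{\lambda,\gamma}(|\beta_j|), \qquad
p_{\lambda,\gamma}(|\beta_j|) = \lambda\left(\gamma + \frac{2}{\pi}\right)\arctan\!\left(\frac{|\beta_j|}{\gamma}\right).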

Publication Date
Sun Apr 30 2023
Journal Name
Iraqi Geological Journal
Evaluating Machine Learning Techniques for Carbonate Formation Permeability Prediction Using Well Log Data

Machine learning offers significant advantages for many difficult problems in the oil and gas industry, especially when it comes to resolving complex challenges in reservoir characterization. Permeability is one of the most difficult petrophysical parameters to predict using conventional logging techniques. The workflow methodology is clarified and comprehensive models are presented in this study. The purpose of this study is to provide a more robust technique for predicting permeability; previous studies on the Bazirgan field have attempted to do so, but their estimates have been vague and their methods are outdated and poorly suited to the actual permeability computation. To
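
A minimal sketch of this kind of workflow (illustrative only, not the study's actual model or data; the file name and log curve mnemonics GR, RHOB, NPHI, DT are assumptions about the data layout):

import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

logs = pd.read_csv("well_logs.csv")            # hypothetical input file
features = logs[["GR", "RHOB", "NPHI", "DT"]]  # gamma ray, density, neutron, sonic
target = logs["CORE_PERM"]                     # core-measured permeability (mD)

X_train, X_test, y_train, y_test = train_test_split(
    features, target, test_size=0.2, random_state=0)

model = RandomForestRegressor(n_estimators=300, random_state=0)
model.fit(X_train, y_train)
print("R^2 on held-out samples:", r2_score(y_test, model.predict(X_test)))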

Publication Date
Sat Aug 01 2015
Journal Name
Journal Of Engineering
Analytical Approach for Load Capacity of Large Diameter Bored Piles Using Field Data

An analytical approach based on field data was used to determine the strength capacity of large-diameter bored piles. Deformations and settlements were also evaluated for both vertical and lateral loadings. The analytical predictions were compared to field data obtained from a prototype test pile used at the Tharthar-Tigris Canal Bridge and were found to be in acceptable agreement, with a deviation of 12%.

Following ASTM standard D1143M-07e1 (2010), a test schedule of five loading cycles was proposed for vertical loads, along with a series of cyclic loads to simulate horizontal loading. The load test results and analytical data of 1.95

Publication Date
Mon Apr 11 2011
Journal Name
Icgst
Employing Neural Network and Naive Bayesian Classifier in Mining Data for Car Evaluation

In data mining, classification is a form of data analysis that can be used to extract models describing important data classes. Two well-known algorithms used in data mining classification are the Backpropagation Neural Network (BNN) and Naïve Bayesian (NB) classifiers. This paper investigates the performance of these two classification methods using the Car Evaluation dataset. A model was built with each algorithm and the results were compared. Our experimental results indicated that the BNN classifier yields higher accuracy than the NB classifier but is less efficient, as it is more time-consuming and harder to analyze due to its black-box nature.
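
A minimal sketch of such a comparison (not the paper's exact setup; the local file name and column names are assumptions about the UCI Car Evaluation data layout):

import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.naive_bayes import MultinomialNB
from sklearn.metrics import accuracy_score

cols = ["buying", "maint", "doors", "persons", "lug_boot", "safety", "class"]
data = pd.read_csv("car.data", names=cols)     # hypothetical local copy of the dataset

X = pd.get_dummies(data[cols[:-1]])            # one-hot encode the categorical features
y = data["class"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)

models = {
    "Backpropagation NN": MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=1),
    "Naive Bayes": MultinomialNB(),
}
for name, clf in models.items():
    clf.fit(X_tr, y_tr)
    print(name, "accuracy:", accuracy_score(y_te, clf.predict(X_te)))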

Publication Date
Fri Sep 30 2022
Journal Name
Journal Of Economics And Administrative Sciences
Semi parametric Estimators for Quantile Model via LASSO and SCAD with Missing Data

In this study, we compared the LASSO and SCAD methods, two penalization methods for partial quantile regression models. The Nadaraya-Watson kernel was used to estimate the nonparametric part; in addition, the rule-of-thumb method was used to estimate the smoothing bandwidth (h). Penalty methods proved to be efficient in estimating the regression coefficients, with the SCAD method being the best according to the mean squared error (MSE) criterion after the missing data were estimated using the mean imputation method.
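
A sketch of the penalized objective underlying both methods (notation mine, not quoted from the paper), where \rho_\tau is the quantile check function, \hat{g} the Nadaraya-Watson estimate of the nonparametric component, and p_\lambda the LASSO or SCAD penalty:

\hat{\beta}_{\tau} = \arg\min_{\beta} \sum_{i=1}^{n} \rho_{\tau}\!\left( y_i - x_i^{\top}\beta - \hat{g}(z_i) \right)
+ \sum_{j=1}^{p} p_{\lambda}(|\beta_j|), \qquad
\rho_{\tau}(u) = u\left( \tau - \mathbf{1}\{u < 0\} \right).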

Publication Date
Wed Jul 01 2020
Journal Name
Indonesian Journal Of Electrical Engineering And Computer Science
Fast and robust approach for data security in communication channel using pascal matrix

This paper presents a fast and robust approach to English text encryption and decryption based on the Pascal matrix. The technique encrypts Arabic or English text, or both, and shows the result of applying the method to plain text (the original message): intelligible plaintext is transformed into unintelligible ciphertext in order to secure information from unauthorized access and theft. An encryption scheme usually uses a pseudo-random encryption key generated by an algorithm; here, all of this is done using the Pascal matrix. Encryption and decryption are implemented in MATLAB, with Notepad++ used to write the input text.
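
A minimal sketch of the idea (in Python rather than MATLAB, and not the paper's exact scheme; the block size and example text are arbitrary): a lower-triangular Pascal matrix of binomial coefficients is an invertible integer transform, so multiplying blocks of character codes by it scrambles the text, and its exact integer inverse recovers the original.

from math import comb

def pascal_lower(n):
    # n x n lower-triangular Pascal matrix of binomial coefficients
    return [[comb(i, j) if j <= i else 0 for j in range(n)] for i in range(n)]

def pascal_lower_inv(n):
    # Exact integer inverse: entries (-1)^(i-j) * C(i, j)
    return [[(-1) ** (i - j) * comb(i, j) if j <= i else 0 for j in range(n)]
            for i in range(n)]

def apply_matrix(matrix, vector):
    # Multiply an n x n integer matrix by an n-vector
    return [sum(m * v for m, v in zip(row, vector)) for row in matrix]

def encrypt(text, n=8):
    codes = [ord(c) for c in text]
    codes += [0] * (-len(codes) % n)                      # pad to block size
    P = pascal_lower(n)
    return [apply_matrix(P, codes[i:i + n]) for i in range(0, len(codes), n)]

def decrypt(blocks, n=8):
    P_inv = pascal_lower_inv(n)
    codes = [v for block in blocks for v in apply_matrix(P_inv, block)]
    return "".join(chr(v) for v in codes).rstrip("\x00")

cipher = encrypt("HELLO WORLD")
print(decrypt(cipher))                                    # -> HELLO WORLD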

Publication Date
Sat Dec 30 2023
Journal Name
Journal Of Economics And Administrative Sciences
The Cluster Analysis by Using Nonparametric Cubic B-Spline Modeling for Longitudinal Data

Longitudinal data are becoming increasingly common, especially in the medical and economic fields, and various methods have been developed to analyze this type of data.

In this research, the focus was on compiling and analyzing such data. Cluster analysis plays an important role in identifying and grouping co-expressed sub-profiles over time, and these are fitted with the nonparametric smoothing cubic B-spline model, which provides continuous first and second derivatives and thus produces a smoother curve with fewer abrupt changes in slope. The model is also flexible and can capture more complex patterns and fluctuations in the data.

The longitudinal balanced data profile was compiled into subgroup
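
A minimal sketch of this smoothing-then-clustering workflow (illustrative only, using simulated profiles rather than the paper's data):

import numpy as np
from scipy.interpolate import UnivariateSpline
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
times = np.linspace(0, 1, 12)                       # common observation times
profiles = np.array([np.sin(2 * np.pi * times) * rng.choice([1, -1])
                     + rng.normal(0, 0.2, times.size) for _ in range(40)])

grid = np.linspace(0, 1, 50)
smoothed = np.array([UnivariateSpline(times, y, k=3, s=0.5)(grid)
                     for y in profiles])            # cubic smoothing spline per profile

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(smoothed)
print(np.bincount(labels))                          # cluster sizes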

Publication Date
Wed Jun 28 2023
Journal Name
Internet Technology Letters
The Blockchain for Healthcare 4.0 Apply in Standard Secure Medical Data Processing Architecture

Cloud-based Electronic Health Records (EHRs) have seen a substantial increase in usage in recent years, especially for remote patient monitoring. Researchers are interested in investigating the use of Healthcare 4.0 in smart cities. This involves using Internet of Things (IoT) devices and cloud computing to remotely access medical processes. Healthcare 4.0 focuses on the systematic gathering, merging, transmission, sharing, and retention of medical information at regular intervals. Protecting the confidential and private information of patients presents several challenges in terms of thwarting illegal intrusion by hackers. Therefore, it is essential to prioritize the protection of patient medical data that is stored, accessed, and shared on
