A fault is an error that affects system behaviour. A software metric is a value that indicates how well software processes work and where faults are more likely to occur. In this research, we study the effect of removing redundancy and applying a log transformation, based on threshold values, for identifying fault-prone classes of software. The study also compares the metric values of the original dataset with those obtained after removing redundancy and applying the log transformation. An e-learning dataset and a system dataset were taken as case studies. The fault ratio ranged from 1%-31% and 0%-10% for the original datasets, and from 1%-10% and 0%-4% after removing redundancy and applying the log transformation, respectively. These results directly affected the number of detected classes, which ranged between 1-20 and 1-7 for the original datasets, and between 1-7 and 0-3 after removing redundancy and applying the log transformation. The skewness of the dataset decreased after applying the proposed model. The classes identified as fault-prone need more attention in subsequent versions, either to reduce the fault ratio or to refactor them, in order to increase the quality and performance of the current version of the software.
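A minimal sketch of the preprocessing described above (duplicate removal followed by a log transformation of class-level metrics, with a simple threshold rule for flagging fault-prone classes); the column names and the median threshold are illustrative assumptions, not taken from the study.

```python
import numpy as np
import pandas as pd

# Illustrative class-level metric data; column names are hypothetical.
df = pd.DataFrame({
    "wmc":  [5, 5, 12, 30, 7],        # weighted methods per class
    "cbo":  [2, 2, 9, 15, 4],         # coupling between objects
    "loc":  [40, 40, 300, 1200, 80],  # lines of code
    "faulty": [0, 0, 1, 1, 0],
})

# Step 1: remove redundant (duplicate) rows.
df = df.drop_duplicates()

# Step 2: log-transform the metric values to reduce skewness.
metric_cols = ["wmc", "cbo", "loc"]
df[metric_cols] = np.log1p(df[metric_cols])

# Step 3: flag classes whose metrics exceed a chosen threshold
# (here the per-metric median, purely for illustration).
thresholds = df[metric_cols].median()
df["predicted_fault_prone"] = (df[metric_cols] > thresholds).any(axis=1)

print(df)
```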
Socio-scientific issues provide a great platform to both engage students in scientific topics and assess their understanding of scientific concepts. Nancy R. Singer, Amy Lannin, Maha Kareem, William Romine, and Katie Kline report on the STEM Literacy Project, a three-year National Science Foundation grant that aimed to improve STEM teachers’ knowledge and integration of literacy in their classrooms. They describe teachers’ professional learning, scenario-based assessments and other strategies they incorporated in their STEM classrooms, and how writing enables students to understand real-world issues.
Recommendation systems are now being used to address the problem of information overload in several sectors such as entertainment, social networking, and e-commerce. Although conventional approaches to recommendation have achieved significant success in providing item suggestions, they still face many challenges, including the cold-start problem and data sparsity. Numerous recommendation models have been created to address these difficulties. Nevertheless, incorporating user- or item-specific information has the potential to enhance the performance of recommendations. The ConvFM model is a novel convolutional neural network architecture that combines the capabilities of deep learning for feature extraction with the effectiveness of factorization machines ...
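A rough PyTorch sketch of the general idea behind a ConvFM-style model as described: a small convolution extracts features from item side-information, and a factorization-machine-style second-order term models interactions between the user, item, and extracted features. All layer sizes and names are hypothetical assumptions, not taken from the paper.

```python
import torch
import torch.nn as nn

class ConvFMSketch(nn.Module):
    """Illustrative only: CNN feature extraction + FM-style pairwise interactions."""
    def __init__(self, n_users, n_items, emb_dim=16):
        super().__init__()
        self.user_emb = nn.Embedding(n_users, emb_dim)
        self.item_emb = nn.Embedding(n_items, emb_dim)
        # 1-D convolution over item side-information (e.g., a metadata sequence).
        self.conv = nn.Sequential(
            nn.Conv1d(1, 8, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.meta_proj = nn.Linear(8, emb_dim)
        self.bias = nn.Parameter(torch.zeros(1))

    def forward(self, user_ids, item_ids, item_meta):
        u = self.user_emb(user_ids)                        # (B, emb_dim)
        i = self.item_emb(item_ids)                        # (B, emb_dim)
        m = self.conv(item_meta.unsqueeze(1)).squeeze(-1)  # (B, 8)
        m = self.meta_proj(m)                              # (B, emb_dim)
        feats = torch.stack([u, i, m], dim=1)              # (B, 3, emb_dim)
        # FM second-order interaction: 0.5 * ((sum v)^2 - sum v^2)
        sum_sq = feats.sum(dim=1).pow(2)
        sq_sum = feats.pow(2).sum(dim=1)
        interaction = 0.5 * (sum_sq - sq_sum).sum(dim=1)
        return self.bias + interaction                     # predicted score

# Toy usage with random metadata of length 32.
model = ConvFMSketch(n_users=100, n_items=500)
scores = model(torch.tensor([1, 2]), torch.tensor([10, 20]), torch.randn(2, 32))
```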
The research aims to identify the effect of a training program based on integrating futuristic thinking skills with classroom interaction patterns on mathematics teachers, in order to provide their students with creative solution skills. The research sample consisted of 31 teachers (15 teachers for the experimental group and 16 for the control group). The researcher developed a measure of academic self-efficacy consisting of 39 items; its validity, reliability, coefficient of difficulty, and discriminatory power were estimated. To analyse the findings, the researcher adopted the Mann-Whitney (U) test and the effect size. The findings were as follows: there is a statistically significant difference at the significance level ...
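For reference, a hedged example of the analysis tools mentioned above (the Mann-Whitney U test plus a common effect-size estimate, the rank-biserial correlation) using SciPy; the scores below are made up and do not reproduce the study's data.

```python
from scipy.stats import mannwhitneyu

# Hypothetical self-efficacy scores for two independent groups (not the study's data).
experimental = [112, 118, 121, 109, 125, 117, 130, 122, 115, 119, 127, 124, 120, 116, 123]
control      = [ 98, 105, 101,  99, 110, 103, 107,  96, 104, 100, 108, 102,  97, 106,  99, 101]

u_stat, p_value = mannwhitneyu(experimental, control, alternative="two-sided")

# Rank-biserial correlation as one simple effect-size estimate for U.
n1, n2 = len(experimental), len(control)
effect_size = 1 - (2 * u_stat) / (n1 * n2)
print(u_stat, p_value, effect_size)
```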
The earth's surface comprises different kinds of land cover, water resources, and soil, which create the environmental conditions for varied animals, plants, and humans. Knowing the significant effects of land cover is crucial for long-term development, climate-change modeling, and preserving ecosystems. In this research, the Google Earth Engine platform and freely available Landsat imagery were used to investigate the impact of the expansion and degradation of urbanized areas, watersheds, and vegetative cover on the land surface temperature in Baghdad from 2004 to 2021. Land cover indices such as the Normalized Difference Vegetation Index, Normalized Difference Water Index, and Normalized Difference Built-up Index (NDVI, NDWI, and NDBI) ...
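A small illustrative computation of the three indices named above from surface-reflectance bands, using their standard normalized-difference formulas; the band arrays are random placeholders rather than actual Landsat data.

```python
import numpy as np

def normalized_difference(a, b):
    """Generic normalized-difference index (a - b) / (a + b), safe against zero sums."""
    a = a.astype("float64")
    b = b.astype("float64")
    return np.where((a + b) == 0, 0.0, (a - b) / (a + b))

# Placeholder reflectance arrays; in practice these come from Landsat bands
# (e.g., Landsat 8: green = B3, red = B4, NIR = B5, SWIR1 = B6).
green = np.random.rand(100, 100)
red   = np.random.rand(100, 100)
nir   = np.random.rand(100, 100)
swir1 = np.random.rand(100, 100)

ndvi = normalized_difference(nir, red)     # vegetation
ndwi = normalized_difference(green, nir)   # open water (McFeeters formulation)
ndbi = normalized_difference(swir1, nir)   # built-up areas
```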
The educational process depends on the means used to convey information from the teacher to the learner; the more appropriate these means are, the better, faster, and with less effort learning takes place. The research problem lies in the players' lack of learning of offensive basketball skills under a metacognitive-knowledge curriculum. The aim of the research is to identify the effect of educational exercises based on metacognitive skills in teaching some offensive basketball skills. The research hypothesis is that there are statistically significant differences between the results of the pre- and post-tests for the experimental group, in favor of the post-tests. The experimental approach was used ...
Introduction: Carrier-based gutta-percha is an effective method of root canal obturation, creating a three-dimensional filling; however, retrieval of the plastic carrier is relatively difficult, particularly with smaller sizes. The purpose of this study was to develop composite carriers consisting of polyethylene (PE), hydroxyapatite (HA), and strontium oxide (SrO) for carrier-based root canal obturation. Methods: Composite fibers of HA, PE, and SrO were fabricated in the shape of a carrier for delivering gutta-percha (GP) using a melt-extrusion process. The fibers were characterized using infrared spectroscopy, and their thermal properties were determined using differential scanning calorimetry. The elastic modulus and tensile strength were determined ...
In this study, an efficient compression system is introduced. It is based on using the wavelet transform and two types of 3-dimensional (3D) surface representations: Cubic Bezier Interpolation (CBI) and 1st-order polynomial approximation. Each is applied on a different scale of the image: CBI is applied on the wide area of the image in order to prune the image components that show large-scale variation, while the 1st-order polynomial is applied on the small area of the residue component (i.e., after subtracting the cubic Bezier surface from the image) in order to prune the local smoothing components and obtain better compression gain. The produced cubic Bezier surface is then subtracted from the image signal to get the residue component. ...
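A brief sketch of the two surface representations mentioned above: evaluating a bicubic Bezier patch from a 4x4 control grid, and fitting a 1st-order polynomial (plane) to a small residue block by least squares. The block sizes and data are toy assumptions, not the paper's actual coding pipeline.

```python
import numpy as np

def bernstein3(t):
    """Cubic Bernstein basis evaluated at parameter t in [0, 1]."""
    return np.array([(1 - t) ** 3,
                     3 * t * (1 - t) ** 2,
                     3 * t ** 2 * (1 - t),
                     t ** 3])

def bezier_surface(ctrl, u, v):
    """Evaluate a bicubic Bezier patch at (u, v); ctrl is a 4x4 control grid."""
    return bernstein3(u) @ ctrl @ bernstein3(v)

def fit_plane(block):
    """Least-squares 1st-order polynomial (plane) fit to a small residue block."""
    h, w = block.shape
    ys, xs = np.mgrid[0:h, 0:w]
    A = np.column_stack([np.ones(h * w), xs.ravel(), ys.ravel()])
    coeffs, *_ = np.linalg.lstsq(A, block.ravel(), rcond=None)
    return (A @ coeffs).reshape(h, w)

# Toy usage: approximate a smooth region with a Bezier patch, then model a
# small residue block with a plane.
ctrl = np.random.rand(4, 4) * 255           # hypothetical control points
approx = np.array([[bezier_surface(ctrl, u, v)
                    for v in np.linspace(0, 1, 8)]
                   for u in np.linspace(0, 1, 8)])
residue_block = np.random.rand(8, 8) * 4    # stand-in for image minus approximation
plane = fit_plane(residue_block)
```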
A three-stage learning algorithm for a deep multilayer perceptron (DMLP) with effective weight initialisation based on a sparse auto-encoder is proposed in this paper, which aims to overcome the difficulties of training deep neural networks with limited training data in a high-dimensional feature space. At the first stage, unsupervised learning is adopted using a sparse auto-encoder to obtain the initial weights of the feature extraction layers of the DMLP. At the second stage, error back-propagation is used to train the DMLP while fixing the weights obtained at the first stage for its feature extraction layers. At the third stage, all the weights of the DMLP obtained at the second stage are refined by error back-propagation. Network structures and ...
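A compact sketch of the three-stage procedure as described, assuming a single feature-extraction layer and using an L1 activation penalty as a stand-in for the sparsity constraint; the dimensions, optimizers, and iteration counts are placeholders, not the paper's settings.

```python
import torch
import torch.nn as nn

x = torch.randn(256, 100)        # toy inputs
y = torch.randint(0, 3, (256,))  # toy class labels

encoder = nn.Linear(100, 32)     # feature-extraction layer
decoder = nn.Linear(32, 100)
head = nn.Linear(32, 3)          # classification layer
mse, ce = nn.MSELoss(), nn.CrossEntropyLoss()

# Stage 1: unsupervised pre-training with a sparse auto-encoder.
opt = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3)
for _ in range(50):
    h = torch.relu(encoder(x))
    loss = mse(decoder(h), x) + 1e-3 * h.abs().mean()  # reconstruction + sparsity
    opt.zero_grad(); loss.backward(); opt.step()

# Stage 2: train the output layer with the encoder weights fixed.
opt = torch.optim.Adam(head.parameters(), lr=1e-3)
for _ in range(50):
    with torch.no_grad():
        h = torch.relu(encoder(x))
    loss = ce(head(h), y)
    opt.zero_grad(); loss.backward(); opt.step()

# Stage 3: refine all weights end-to-end with back-propagation.
opt = torch.optim.Adam(list(encoder.parameters()) + list(head.parameters()), lr=1e-4)
for _ in range(50):
    loss = ce(head(torch.relu(encoder(x))), y)
    opt.zero_grad(); loss.backward(); opt.step()
```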
The study presents a modification of the Broyden-Fletcher-Goldfarb-Shanno (BFGS) update (H-version) based on the determinant property of the inverse of the Hessian matrix (the second derivative of the objective function), via updating the vector s (the difference between the next solution and the current solution) such that the determinant of the next inverse Hessian matrix equals the determinant of the current inverse Hessian matrix at every iteration. Consequently, the sequence of inverse Hessian matrices generated by the method never approaches a near-singular matrix, so the program never breaks down before the minimum value of the objective function is obtained. Moreover, the new modification of the BFGS update (H-version) ...
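For context, a minimal NumPy sketch of the standard H-version (inverse-Hessian) BFGS update that the study modifies; it does not implement the paper's determinant-preserving rescaling of s, and the step and gradient-difference vectors below are hypothetical.

```python
import numpy as np

def bfgs_inverse_update(H, s, y):
    """Standard BFGS update of the inverse Hessian approximation H (the 'H-version');
    s = x_{k+1} - x_k, y = grad_{k+1} - grad_k."""
    rho = 1.0 / (y @ s)
    I = np.eye(len(s))
    return (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
           + rho * np.outer(s, s)

# Toy usage on a 2-D step: without the paper's modification of s,
# the determinant of H generally changes between iterations.
H = np.eye(2)
s = np.array([0.5, -0.2])   # hypothetical step
y = np.array([1.0, -0.3])   # hypothetical gradient difference
H_next = bfgs_inverse_update(H, s, y)
print(np.linalg.det(H), np.linalg.det(H_next))
```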