A fault is an error that affects system behaviour. A software metric is a value that indicates how well software processes work and where faults are more likely to occur. In this research, we study the effects of removing redundancy and applying a log transformation, based on threshold values, for identifying fault-prone classes of software. The study also compares the metric values of the original dataset with those obtained after removing redundancy and applying the log transformation. An e-learning dataset and a system dataset were taken as case studies. The fault ratio ranged from 1%-31% and 0%-10% for the original datasets, and from 1%-10% and 0%-4% after removing redundancy and log transformation, respectively. These results directly affected the number of classes detected, which ranged between 1-20 and 1-7 for the original datasets and between 1-7 and 0-3 after removing redundancy and log transformation. The skewness of the dataset decreased after applying the proposed model. The classes identified as fault-prone need more attention in the next versions, either to reduce the fault ratio or through refactoring, in order to increase the quality and performance of the current version of the software.
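As an illustration of this kind of preprocessing, the minimal Python sketch below (assuming a pandas table of class-level metrics with hypothetical column names such as wmc, cbo, and loc) removes duplicate rows, applies a log(1 + x) transformation, compares skewness before and after, and flags classes whose transformed metrics exceed illustrative threshold values; it is not the paper's exact pipeline.

```python
# Minimal sketch (not the paper's exact pipeline): remove redundant rows and
# log-transform class-level metrics, then compare skewness before and after.
# Column names ("wmc", "cbo", "loc") and the CSV file are hypothetical placeholders.
import numpy as np
import pandas as pd

def preprocess(df: pd.DataFrame, metric_cols: list[str]) -> pd.DataFrame:
    deduped = df.drop_duplicates()                                   # remove redundancy
    transformed = deduped.copy()
    transformed[metric_cols] = np.log1p(transformed[metric_cols])    # log(1 + x)
    return transformed

df = pd.read_csv("class_metrics.csv")        # hypothetical dataset file
metrics = ["wmc", "cbo", "loc"]
clean = preprocess(df, metrics)

print("skewness before:", df[metrics].skew().round(2).to_dict())
print("skewness after: ", clean[metrics].skew().round(2).to_dict())

# Threshold-based flagging of fault-prone classes (illustrative thresholds only).
thresholds = {"wmc": np.log1p(20), "cbo": np.log1p(9), "loc": np.log1p(100)}
fault_prone = clean[(clean[metrics] > pd.Series(thresholds)).any(axis=1)]
print("flagged classes:", len(fault_prone))
```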
Maintaining and breeding fish in a pond is a crucial task for a large fish breeder. The main issues for fish breeders are pond-management tasks such as producing food for the fish and maintaining the pond water quality. Dynamic, technology-based systems for breeders have been invented and have become important for aquaponic breeders to obtain the maximum profit return while maintaining their fish. This research presents a developed prototype of a dynamic fish feeder based on fish existence. The dynamic fish feeder is programmed to dispense feed where the sensors detect the fish's existence. A microcontroller board NodeMCU ESP8266 is programmed for the developed h
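A minimal MicroPython sketch of such a feeding loop is shown below; it is illustrative only, not the prototype's actual firmware. The pin assignments, the digital presence sensor, and the servo-driven feeder are assumptions.

```python
# Minimal MicroPython sketch for a NodeMCU ESP8266 (illustrative only, not the
# prototype's firmware). Assumptions: a digital fish-presence sensor on GPIO5 (D1)
# and a feeder servo on GPIO4 (D2).
from machine import Pin, PWM
import time

sensor = Pin(5, Pin.IN)          # hypothetical fish-presence sensor
servo = PWM(Pin(4), freq=50)     # hobby servo driven at 50 Hz

def dispense_feed():
    servo.duty(115)              # rotate to the "open" position
    time.sleep(1)
    servo.duty(40)               # return to the "closed" position

while True:
    if sensor.value() == 1:      # fish detected near the feeding zone
        dispense_feed()
        time.sleep(10)           # simple cooldown to avoid overfeeding
    time.sleep(0.2)
```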
Recommender systems are tools for making sense of the huge amount of data available in the internet world. Collaborative filtering (CF) is one of the knowledge discovery methods used most successfully in recommendation systems. Memory-based collaborative filtering emphasizes using facts about existing users to predict new items for the target user. Similarity measures are the core operations in collaborative filtering, and prediction accuracy mostly depends on the similarity calculations. In this study, a combination of weighted parameters and traditional similarity measures is used to calculate the relationships among users over the MovieLens dataset rating matrix. The advantages and disadvantages of each measure are identified. From the study, a n
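A minimal sketch of a memory-based similarity calculation is given below. It uses Pearson correlation over co-rated items combined with a significance-weighting term as one example of a weighted parameter; the exact weighting scheme evaluated in the study may differ.

```python
# Minimal sketch of memory-based CF user similarity (not necessarily the paper's
# weighting scheme): Pearson correlation over co-rated items, damped by the
# number of common ratings so that users with few overlaps are trusted less.
import numpy as np

def weighted_pearson(ratings: np.ndarray, u: int, v: int, shrink: int = 25) -> float:
    """ratings: users x items matrix, 0 = unrated."""
    mask = (ratings[u] > 0) & (ratings[v] > 0)       # co-rated items
    n = mask.sum()
    if n < 2:
        return 0.0
    ru, rv = ratings[u, mask], ratings[v, mask]
    ru = ru - ru.mean()
    rv = rv - rv.mean()
    denom = np.sqrt((ru**2).sum() * (rv**2).sum())
    if denom == 0:
        return 0.0
    sim = float((ru * rv).sum() / denom)
    return sim * (n / (n + shrink))                  # significance weighting

# Toy example in the spirit of a MovieLens rating matrix (rows = users).
R = np.array([[5, 3, 0, 1],
              [4, 0, 0, 1],
              [1, 1, 0, 5]], dtype=float)
print(weighted_pearson(R, 0, 1), weighted_pearson(R, 0, 2))
```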
The rapid development of telemedicine services and the requirements for exchanging medical information between physicians, consultants, and health institutions have made the protection of patients' information an important priority for any future e-health system. The protection of medical information, including the cover (i.e. the medical image), has requirements that differ slightly from those for protecting other information. The cover must be carefully preserved because of its importance on the receiving side, where medical staff use this information to provide a diagnosis and save a patient's life. If the cover is tampered with, the goal of telemedicine cannot be achieved. Therefore, this work provides an in
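Since the abstract is truncated before the proposed technique is described, the sketch below is only a generic illustration of the classic least-significant-bit (LSB) embedding baseline for hiding patient data in a medical cover image; it is not the paper's method.

```python
# Generic illustration only (NOT the paper's technique): classic LSB embedding,
# often used as a baseline for hiding patient data in a medical cover image
# while keeping the cover visually intact.
import numpy as np

def embed_lsb(cover: np.ndarray, message: bytes) -> np.ndarray:
    bits = np.unpackbits(np.frombuffer(message, dtype=np.uint8))
    flat = cover.flatten().copy()
    if bits.size > flat.size:
        raise ValueError("message too large for this cover image")
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits   # replace lowest bit
    return flat.reshape(cover.shape)

def extract_lsb(stego: np.ndarray, n_bytes: int) -> bytes:
    bits = stego.flatten()[: n_bytes * 8] & 1
    return np.packbits(bits).tobytes()

cover = np.random.randint(0, 256, (64, 64), dtype=np.uint8)   # stand-in image
secret = b"patient-id:12345"
stego = embed_lsb(cover, secret)
assert extract_lsb(stego, len(secret)) == secret
```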
Computer Aided Designing Tools of Electron Lenses (CADTEL) is a software package for simultaneously designing, computing, and plotting the objective and projector properties of magnetic electron lenses. Developments in the CADTEL software have extended it to cover a wider range of fields and methods, adding to the version previously published in 2013. The current improvement inserts some important parameters, namely the resolution and focusing parameters. These parameters are the angular semi-angle (α), focusing power (β), resolution limit (δ), image rotation (θ), spherical aberration (Cs), defocus (ΔZ), wave aberration (Χwab), depth of field (Dfld), and depth of focus (Dfoc) a
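For orientation, the sketch below evaluates a few of these quantities using standard textbook electron-optics relations (the relativistic electron wavelength, the Rayleigh diffraction limit 0.61λ/α, and the spherical-aberration disc Cs·α³); these are common formulas, not necessarily the expressions implemented inside CADTEL.

```python
# Textbook electron-optics relations (a sketch of the kind of quantities CADTEL
# reports; standard formulas, not necessarily the package's internal expressions).
# Symbols: V accelerating voltage, alpha aperture semi-angle, Cs spherical
# aberration coefficient.
import math

def electron_wavelength_nm(V: float) -> float:
    """Relativistically corrected electron wavelength, V in volts."""
    return 1.22639 / math.sqrt(V * (1.0 + 0.97845e-6 * V))

def resolution_limit_nm(V: float, alpha: float) -> float:
    """Rayleigh diffraction limit: delta = 0.61 * lambda / alpha."""
    return 0.61 * electron_wavelength_nm(V) / alpha

def spherical_disc_nm(Cs_mm: float, alpha: float) -> float:
    """Spherical-aberration disc radius, Cs * alpha^3 (Cs given in mm)."""
    return Cs_mm * 1e6 * alpha**3            # mm -> nm

V, alpha, Cs_mm = 100e3, 5e-3, 2.0           # illustrative operating point
print("lambda  (nm):", round(electron_wavelength_nm(V), 5))
print("delta_d (nm):", round(resolution_limit_nm(V, alpha), 3))
print("delta_s (nm):", round(spherical_disc_nm(Cs_mm, alpha), 3))
```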
Human pose estimation is a crucial topic in the computer vision field and has become a research hotspot in much work related to human behaviour. Human pose estimation can be understood as the problem of recognising human key points and connecting them. The paper presents an optimized symmetric spatial transformation network, designed to connect to a single-person pose estimation network, that proposes high-quality human target frames from inaccurate human bounding boxes. It also introduces parametric pose non-maximum suppression to eliminate redundant pose estimates and applies an elimination rule to discard similar poses, obtaining unique human pose estimation results. The experimental outcomes demonstrate that the proposed technique can pre
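A simplified sketch of pose non-maximum suppression is shown below; it replaces the full parametric criterion with a plain keypoint-distance similarity, so it illustrates the elimination rule only in spirit.

```python
# Simplified sketch of pose non-maximum suppression (the parametric criterion in
# the paper combines confidence and spatial terms; here a plain keypoint-distance
# similarity stands in for it).
import numpy as np

def pose_similarity(p1: np.ndarray, p2: np.ndarray, sigma: float = 10.0) -> float:
    """p1, p2: (K, 2) keypoint coordinates; soft-matching score in [0, 1]."""
    d = np.linalg.norm(p1 - p2, axis=1)
    return float(np.mean(np.exp(-(d**2) / (2 * sigma**2))))

def pose_nms(poses, scores, thresh: float = 0.7):
    order = np.argsort(scores)[::-1]             # highest-confidence pose first
    keep = []
    for i in order:
        if all(pose_similarity(poses[i], poses[j]) < thresh for j in keep):
            keep.append(i)                       # not redundant with a kept pose
    return keep

# Two nearly identical 17-keypoint detections plus one distinct pose.
rng = np.random.default_rng(0)
base = rng.uniform(0, 100, (17, 2))
poses = [base, base + 1.0, base + 60.0]
print(pose_nms(poses, scores=[0.9, 0.8, 0.85]))  # the near-duplicate is removed
```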
The aim of this paper is to present a new methodology for finding the RSA private key. A new initial value, generated from a new equation, is selected to speed up the process. After this value is found, a brute-force attack is used to discover the private key. In addition, in the proposed equation, the multiplier of the Euler totient function used to find both the public key and the private key is set to 1. This implies that the equation estimating the new initial value is suitable for the small multiplier. The experimental results show that if all prime factors of the modulus are larger than 3 and the multiplier is 1, the distance between the initial value and the private key
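The underlying relation can be illustrated with a small sketch: the RSA exponents satisfy e·d = 1 + k·φ(n), so with the multiplier k set to 1 the private exponent is d = (1 + φ(n))/e, and a brute-force search can proceed from an estimated initial value. The paper's own equation for that initial value is not reproduced in the truncated abstract, so the estimate below is only illustrative.

```python
# Illustrative sketch only: the paper's equation for the initial value is not
# given here. It uses the standard relation e*d = 1 + k*phi(n); with multiplier
# k = 1 the private exponent is d = (1 + phi(n)) / e, and a brute-force search
# starts from an estimate of d.
def brute_force_private_key(e: int, phi: int, start: int) -> int:
    d = start
    while (e * d) % phi != 1:        # stop when e*d ≡ 1 (mod phi(n))
        d += 1
    return d

# Small toy example: p = 11, q = 13, n = 143, phi = 120, e = 11.
p, q, e = 11, 13, 11
phi = (p - 1) * (q - 1)
initial = (1 + phi) // e             # estimate assuming the multiplier k is 1
d = brute_force_private_key(e, phi, initial)
print(initial, d, (e * d) % phi)     # d = 11 here, since 11 * 11 = 121 = 1 + 120
```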
In this paper, wind data measured over 12 months (January to December 2011) at the Al-Hay district of Wasit province, southern Iraq, have been analyzed statistically. The wind speed at a height of 10 m above ground level was measured at 10-minute intervals. The statistical analysis of the wind data was performed using the WAsP software, which is based on Weibull distributions. The Weibull shape and scale parameters were obtained and used in the statistics of this paper. The achieved results demonstrate that the study area has an Annual Mean Energy Production (AMEP) of about 219.002 MWh. The computations were performed for a turbine hub height of 70 m and Earth surface roughness lengths of 0.0, 0.03, 0.1, 0.4, and 1.5 m, respectively.
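A simple sketch of the Weibull fitting step is shown below; WAsP applies its own fitting procedure, whereas this example uses a plain maximum-likelihood fit (with the location parameter fixed at zero) on synthetic stand-in data.

```python
# Simple sketch of fitting Weibull shape (k) and scale (c) parameters to
# 10-minute wind-speed records. WAsP uses its own procedure; this is a plain
# maximum-likelihood fit with the location fixed at zero, on synthetic data.
import numpy as np
from math import gamma
from scipy import stats

rng = np.random.default_rng(1)
# Stand-in for one year of 10-minute measurements at 10 m height.
wind_speed = stats.weibull_min.rvs(c=2.0, scale=6.0, size=52_560, random_state=rng)

k, loc, c = stats.weibull_min.fit(wind_speed, floc=0)    # k = shape, c = scale
print(f"shape k = {k:.2f}, scale c = {c:.2f} m/s")

# Mean wind speed implied by the fit: v_mean = c * Gamma(1 + 1/k)
print(f"mean speed = {c * gamma(1 + 1/k):.2f} m/s")
```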
The main role of infill drilling is either adding incremental reserves to those already existing by intersecting newly undrained (virgin) regions, or accelerating production from currently depleted areas. Accelerating reserves by increasing drainage in tight formations can be beneficial considering the time value of money and the cost of additional wells. However, the maximum benefit is realized when infill wells produce mostly incremental recoveries (recoveries from virgin formations). Therefore, the prediction of incremental and accelerated recovery is crucial in field development planning, as it helps in the optimization of infill wells with the assurance of long-term economic sustainabi
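A back-of-the-envelope sketch (with purely illustrative numbers) of this distinction is given below: discounting shows that accelerated recovery of the same total volume adds some present value, while incremental recovery adds both volume and value.

```python
# Back-of-the-envelope sketch (illustrative numbers only) of why accelerated
# recovery still carries value: discounting the same total volume produced
# earlier yields a higher present value, while truly incremental recovery adds
# both volume and value.
def npv(yearly_volumes, price=60.0, rate=0.10):
    """Present value of a production stream (volumes in k bbl/year, price $/bbl)."""
    return sum(v * price / (1 + rate) ** t for t, v in enumerate(yearly_volumes, 1))

base        = [100, 80, 60, 40, 20]    # existing wells only (k bbl/yr)
accelerated = [140, 100, 40, 15, 5]    # same ~300 k bbl, produced sooner
incremental = [140, 110, 80, 50, 25]   # infill adds ~105 k bbl of new reserves

for name, profile in [("base", base), ("accelerated", accelerated), ("incremental", incremental)]:
    print(f"{name:12s} volume = {sum(profile):4d} k bbl, NPV = {npv(profile):9.1f} k$")
```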