Regression testing is expensive and therefore calls for optimization. Typically, test case optimization means selecting a reduced subset of test cases or prioritizing them so that potential faults are detected at an earlier phase. Many earlier studies relied on heuristic-based mechanisms to attain optimality while reducing or prioritizing test cases; however, those studies lacked systematic procedures for handling tied test cases. Moreover, evolutionary algorithms such as the genetic algorithm often help reduce the number of test cases while simultaneously decreasing computational runtime, but they fall short when fault detection capacity must be examined alongside other parameters. Motivated by this, the current research proposes a multifactor algorithm that incorporates genetic operators and powerful features. A factor-based prioritizer is introduced to properly handle the tied test cases that emerge during re-ordering. In addition, a Cost-based Fine Tuner (CFT) is embedded in the study to reveal stable test cases for processing. The effectiveness of the proposed minimization approach is analyzed and compared with a specific rule-based heuristic method and a standard genetic methodology. Intra-validation of the result achieved from the reduction procedure is performed graphically. For the proposed prioritization scheme, this study contrasts randomly generated sequences with the procured re-ordered test sequences over 10 benchmark codes. Experimental analysis revealed that the proposed system achieved a 35-40% reduction in testing effort by identifying and executing stable, coverage-effective test cases at an earlier phase.
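As a rough illustration of the kind of genetic-algorithm-driven test case minimization the abstract above refers to, the sketch below evolves a binary selection of test cases against a toy coverage matrix. The fitness weighting, operators, and data are illustrative assumptions, not the paper's multifactor prioritizer or Cost-based Fine Tuner.

```python
# A minimal sketch of GA-based test case minimization, assuming a per-test
# coverage set and execution cost; weights and operators are illustrative.
import random

def fitness(chromosome, coverage, cost):
    """Reward statement coverage, penalize total execution cost."""
    covered = set()
    total_cost = 0.0
    for selected, stmts, c in zip(chromosome, coverage, cost):
        if selected:
            covered.update(stmts)
            total_cost += c
    return len(covered) - 0.1 * total_cost

def evolve(coverage, cost, pop_size=30, generations=50, p_mut=0.05):
    n = len(coverage)
    pop = [[random.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ch: fitness(ch, coverage, cost), reverse=True)
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n)                 # one-point crossover
            child = a[:cut] + b[cut:]
            child = [1 - g if random.random() < p_mut else g for g in child]  # bit-flip mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=lambda ch: fitness(ch, coverage, cost))

# Toy example: 4 test cases, the statements each covers, and execution costs
coverage = [{1, 2, 3}, {3, 4}, {2, 5}, {1, 4, 5}]
cost = [1.0, 0.5, 0.8, 1.2]
print(evolve(coverage, cost))   # selection mask of the best-found subset
```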
With the development of cloud computing in recent years, data center networks have become a major topic in both industrial and academic communities. Nevertheless, traditional approaches based on manual configuration and hardware devices are burdensome, expensive, and cannot fully utilize the capability of the physical network infrastructure. Thus, Software-Defined Networking (SDN) has been promoted as one of the most promising solutions for the future Internet. SDN is notable for two features: the separation of the control plane from the data plane, and enabling network development through programmable capabilities instead of hardware solutions. The current paper introduces an SDN-based optimized Resch
The gas-lift technique plays an important role in sustaining oil production, especially from a mature field whose reservoirs' natural energy has become insufficient. However, optimally allocating the gas injection rate across a large field's gas-lift network system to maximize the oil production rate is a challenging task. Conventional gas-lift optimization approaches may become inefficient and incapable of modelling gas-lift optimization in a large network system with multi-objective, multi-constrained problems and a limited gas injection rate. The key objective of this study is to assess the feasibility of utilizing the Genetic Algorithm (GA) technique to optimize t
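As a rough illustration of GA-based gas-lift allocation of the sort described above, the sketch below distributes a limited total gas injection volume across a few wells with assumed quadratic lift-performance curves. The curve coefficients, penalty term, and operators are illustrative assumptions, not the study's network model.

```python
# A minimal sketch of GA-based gas-lift allocation under a single total-gas
# constraint; per-well oil response is a toy quadratic in the injection rate.
import random

COEFFS = [(0.0, 0.8, -0.002), (5.0, 0.6, -0.0015), (2.0, 0.7, -0.0025)]  # a + b*q + c*q^2
GAS_LIMIT = 600.0  # total available injection gas (illustrative units)

def oil_rate(rates):
    return sum(a + b * q + c * q * q for (a, b, c), q in zip(COEFFS, rates))

def fitness(rates):
    penalty = max(0.0, sum(rates) - GAS_LIMIT) * 10.0   # penalize infeasible allocations
    return oil_rate(rates) - penalty

def ga(pop_size=40, generations=200):
    n = len(COEFFS)
    pop = [[random.uniform(0, GAS_LIMIT / n) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]                  # arithmetic crossover
            child = [max(0.0, g + random.gauss(0, 5)) for g in child]    # Gaussian mutation
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = ga()
print([round(q, 1) for q in best], round(oil_rate(best), 1))
```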
In recent years, the vehicular sector has expanded considerably, as has the number of vehicles on the roads in all parts of the country. Arabic vehicle number plate identification based on image processing is an active area of this work; the technique is used for security purposes such as tracking stolen cars and controlling access to restricted areas. The License Plate Recognition System (LPRS) uses a digital camera to capture vehicle plate numbers, which serve as input to the proposed recognition system. The proposed system consists of three phases: vehicle license plate localization, character segmentation, and character recognition, the
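A minimal sketch of such a three-phase pipeline (plate localization, character segmentation, character recognition) is shown below, using OpenCV and Tesseract as stand-ins. The thresholds, aspect-ratio filter, and file names are assumptions rather than the paper's implementation.

```python
# A minimal plate-recognition pipeline sketch: localize a plate-like region,
# segment character candidates, and OCR the plate region.
import cv2
import pytesseract

def locate_plate(image):
    """Localize a rectangular, plate-like region via edges and contour shape."""
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 100, 200)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for cnt in sorted(contours, key=cv2.contourArea, reverse=True):
        x, y, w, h = cv2.boundingRect(cnt)
        if 2.0 < w / float(h) < 6.0:        # typical plate aspect ratio (assumed range)
            return gray[y:y + h, x:x + w]
    return gray

def segment_characters(plate):
    """Segment candidate character boxes from the binarized plate, left to right."""
    _, binary = cv2.threshold(plate, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    boxes = sorted(cv2.boundingRect(c) for c in contours)
    return [binary[y:y + h, x:x + w] for x, y, w, h in boxes if h > 10]

def recognize(plate):
    """OCR the plate region (Arabic + English Tesseract models assumed installed)."""
    return pytesseract.image_to_string(plate, lang="ara+eng", config="--psm 7").strip()

image = cv2.imread("car.jpg")                      # hypothetical input frame
plate = locate_plate(image)
print(len(segment_characters(plate)), "character candidates")
print(recognize(plate))
```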
This research deals with leverage and its impact on the profitability of Islamic banks in Iraq for the years 2015-2018. The research variables were analyzed and measured, with leverage as an independent variable and profitability as a dependent variable. The research is based on a main hypothesis: there is a statistically significant relationship between leverage and profitability at the Islamic Cooperation Bank for the period 2015-2018. The results showed significant direct relationships between the leverage ratio and the profitability indicators: the higher the leverage ratio, the higher the profitability indicators. In addition, the Islamic Cooperation Bank has adopted a conservative poli
The purpose of this research is to demonstrate the impact of deposit insurance in reducing banking risks. Banks in various countries of the world face a variety of risks that have led to banking and financial crises and to the failure and bankruptcy of many banks, which has pushed banks to find quick and appropriate solutions to overcome these difficulties. These solutions include the use of a bank deposit protection system against the many risks and successive crises that have accompanied Iraqi banking work, including theft, forgery, embezzlement, and changing, unstable circumstances. The importance of studying the research subject emerges through the theoretical framework of banking risks as well as the framework of consideration. In order to
In recent years, the need for Machine Translation (MT) has grown, especially for translating legal contracts between languages like Arabic and English. This study primarily investigates whether Google Translate can adequately replace human translation for legal documents. Utilizing this widely popular free web-based tool, the research method involved translating six segments from various legal contracts into Arabic and assessing the translations for lexical and syntactic accuracy. The findings show that although Google Translate can quickly produce English-Arabic translations, it falls short compared to professional translators, especially with complex legal terms and syntax. Errors can be categorized into: polysemy,
JPEG is the most popular image compression and encoding technique and is widely used in many applications (images, videos, and 3D animations). Meanwhile, researchers are keen to develop this widely used technique further, to compress images at higher compression ratios while preserving image quality as much as possible. For this reason, this paper introduces an improved JPEG based on a fast DCT that removes most of the zeros while keeping their positions in the transformed block. Additionally, arithmetic coding is applied rather than Huffman coding. The results showed that the proposed JPEG algorithm yields better image quality than the traditional JPEG technique.
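The zero-removal idea can be sketched as follows: apply a block DCT, quantize, and store only the non-zero coefficients together with their positions. The uniform quantization step and the omission of the arithmetic-coding stage are simplifications; this is not the paper's codec.

```python
# A minimal sketch of 8x8 DCT coding with zero removal: keep only the
# (position, value) pairs of non-zero quantized coefficients per block.
import numpy as np
from scipy.fft import dctn, idctn

Q = 16  # illustrative uniform quantization step

def encode_block(block):
    coeffs = np.round(dctn(block.astype(float), norm="ortho") / Q).astype(int)
    positions = np.argwhere(coeffs != 0)                # positions of non-zeros
    values = coeffs[coeffs != 0]                        # matching values, row-major order
    return list(zip(map(tuple, positions), values))     # sparse (position, value) pairs

def decode_block(pairs, size=8):
    coeffs = np.zeros((size, size))
    for (r, c), v in pairs:
        coeffs[r, c] = v * Q                            # dequantize
    return idctn(coeffs, norm="ortho")

block = np.random.randint(0, 256, (8, 8))
pairs = encode_block(block)
print(len(pairs), "non-zero coefficients kept out of 64")
recon = decode_block(pairs)
print("max reconstruction error:", float(np.abs(recon - block).max()))
```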
Determining from visual data, such as video and still images, whether a face is wearing a mask has become a fascinating research topic in recent years due to the spread of the Corona pandemic, which has changed the features of the entire world and forced people to wear masks as a way of containing the pandemic that has gripped the entire world. Intelligent development based on artificial intelligence and computers plays a very important role in safety from the pandemic, and face recognition together with identifying whether people are wearing masks, with deep learning at the forefront, has been the most prominent approach to this topic. Using deep learning techniques and the YOLO (”You on
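A minimal sketch of YOLO-based mask detection is given below, assuming the Ultralytics YOLO package and a hypothetical fine-tuned checkpoint (mask_yolo.pt) with "mask" and "no_mask" classes; the paper's exact model and training setup are not reproduced.

```python
# A minimal sketch of running a (hypothetical) mask-detection YOLO model on an image.
from ultralytics import YOLO

model = YOLO("mask_yolo.pt")            # hypothetical fine-tuned checkpoint
results = model("crowd.jpg")            # run inference on a single image

for result in results:
    for box in result.boxes:
        label = model.names[int(box.cls[0])]    # e.g. "mask" or "no_mask"
        confidence = float(box.conf[0])
        x1, y1, x2, y2 = box.xyxy[0].tolist()   # face bounding box
        print(f"{label} ({confidence:.2f}) at [{x1:.0f}, {y1:.0f}, {x2:.0f}, {y2:.0f}]")
```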
Association rule mining (ARM) is a fundamental and widely used data mining technique for extracting useful information from data. Traditional ARM algorithms suffer from degraded computational efficiency because they mine too many association rules, many of which are not appropriate for a given user. Recent research in ARM investigates the use of metaheuristic algorithms that look for only a subset of high-quality rules. In this paper, a modified discrete cuckoo search algorithm for association rule mining (DCS-ARM) is proposed for this purpose. The effectiveness of the algorithm is tested against a set of well-known transactional databases. Results indicate that the proposed algorithm outperforms existing metaheuristic methods.
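To make the idea concrete, the sketch below runs a much-simplified discrete cuckoo-search-style rule miner over a toy transaction set: candidate rules are item-set pairs, "flights" are random item flips, and fitness blends support and confidence. The operators and parameters are illustrative, not the DCS-ARM algorithm itself.

```python
# A simplified discrete cuckoo-search-style search for association rules.
import random

TRANSACTIONS = [{"a", "b", "c"}, {"a", "c"}, {"a", "d"}, {"b", "c", "e"}, {"a", "b", "c"}]
ITEMS = sorted(set().union(*TRANSACTIONS))

def support(itemset):
    return sum(itemset <= t for t in TRANSACTIONS) / len(TRANSACTIONS)

def fitness(rule):
    antecedent, consequent = rule
    if not antecedent or not consequent:
        return 0.0
    sup = support(antecedent | consequent)
    conf = sup / support(antecedent) if support(antecedent) else 0.0
    return 0.5 * sup + 0.5 * conf          # blend of support and confidence

def random_rule():
    picked = random.sample(ITEMS, k=random.randint(2, min(3, len(ITEMS))))
    return frozenset(picked[:-1]), frozenset(picked[-1:])

def perturb(rule):
    """Discrete 'flight': flip one random item into or out of a rule side."""
    antecedent, consequent = set(rule[0]), set(rule[1])
    item = random.choice(ITEMS)
    (antecedent if random.random() < 0.5 else consequent).symmetric_difference_update({item})
    return frozenset(antecedent), frozenset(consequent - antecedent)

nests = [random_rule() for _ in range(20)]
for _ in range(100):
    i = random.randrange(len(nests))
    candidate = perturb(nests[i])
    if fitness(candidate) > fitness(nests[i]):     # replace a worse nest's egg
        nests[i] = candidate
best = max(nests, key=fitness)
print(set(best[0]), "=>", set(best[1]), round(fitness(best), 2))
```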
The Internet provides vital communications between millions of individuals. It is also increasingly used as a commerce tool; thus, security is of high importance for securing communications and protecting vital information. Cryptographic algorithms are essential in the field of security. Brute-force attacks are the major attacks on the Data Encryption Standard, which is the main reason an improved structure of the Data Encryption Standard algorithm is needed. This paper proposes a new, improved structure for the Data Encryption Standard to make it secure and immune to attacks. The improved structure was accomplished using the standard Data Encryption Standard with a new way of two key gene