We propose a novel strategy for optimizing the test suite required to test both hardware and software in a production line. The strategy is based on two processes: a Quality Signing Process and a Quality Verification Process. Unlike earlier work, the proposed strategy integrates black-box and white-box techniques to derive an optimum test suite during the Quality Signing Process. The generated optimal test suite then significantly improves the Quality Verification Process. Considering both processes, the novelty of the proposed strategy lies in the fact that test suite optimization and reduction are performed by selecting only mutant-killing test cases from a cumulative pool of t-way test cases. As such, the proposed strategy can potentially enhance product quality at minimal cost in terms of overall resource usage and execution time. As a case study, this paper describes the step-by-step application of the strategy to testing a 4-bit magnitude comparator integrated circuit in a production line. Our results demonstrate that, with the same number of test cases, the proposed strategy outperforms the traditional block partitioning strategy, achieving a mutation score of 100% versus 90%.
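The mutant-killing selection step described above can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: a 2-bit comparator stands in for the 4-bit circuit, the mutants are hand-made faults, and the small exhaustive input pool stands in for a cumulative t-way pool.

```python
from itertools import product

# Reference 2-bit magnitude comparator (illustrative stand-in for the
# paper's 4-bit circuit): returns 1 if a > b, -1 if a < b, 0 if equal.
def ref(a, b):
    return (a > b) - (a < b)

# A few hand-made mutants (hypothetical faults, not from the paper).
mutants = [
    lambda a, b: (a >= b) - (a < b),   # equality fault in the '>' branch
    lambda a, b: (a > b) - (a <= b),   # equality fault in the '<' branch
    lambda a, b: (b > a) - (b < a),    # swapped operands
]

# Cumulative test pool over all 2-bit input pairs (for larger parameter
# spaces a t-way generator would shrink this pool before selection).
pool = list(product(range(4), repeat=2))

def kills(tc, m):
    return ref(*tc) != m(*tc)

# Greedy reduction: keep only test cases that kill at least one live mutant.
suite, live = [], set(range(len(mutants)))
for tc in pool:
    killed = {i for i in live if kills(tc, mutants[i])}
    if killed:
        suite.append(tc)
        live -= killed

# 'suite' now kills every mutant: mutation score 100% with few test cases.
```

The greedy pass discards every test case that adds no mutation-killing power, which is the essence of the reduction the abstract describes.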
Testing is a vital phase in software development, and having the right amount of test data is important for speeding up the process. Because test-case generation is a combinatorial optimization challenge, exhaustive testing may not always be practicable; constraints on resources, cost, and schedule also impede the testing process. Combinatorial testing (CT) can be described as a basic strategy for creating new test cases, and several scholars have discussed CT while establishing alternative strategies based on the interactions between parameters. An investigation into current CT methods was therefore undertaken to better understand their capabilities and limitations. In this study, 97 publications were evaluated…
Lasers, with their unique characteristics, especially excellent beam quality, directionality, and coherence, are a key solution for many processes that require high precision. Lasers integrate readily with automated systems, which provides high flexibility in reaching difficult zones. In addition, as a processing tool, a laser can be considered a contact-free tool with a precise tip, which makes it attractive for high-precision machining at the micro- and nanoscales across different materials. All of the above advantages may not be enough unless the laser technician or engineer has sufficient knowledge of the mechanism of interaction between the laser light and the processed material. Several sequential phenomena…
Ultimate oil recovery and pore-scale displacement efficiency are controlled by rock wettability; there is therefore growing interest in the wetting behaviour of reservoir rocks, as production from fractured oil-wet or mixed-wet limestone formations has remained a key challenge. Conventional waterflooding methods are inefficient in such formations due to poor spontaneous imbibition of water into the oil-wet rock capillaries. However, altering the wettability to water-wet could yield significant amounts of additional oil, so this study investigates the influence of nanoparticles on wettability alteration. The efficiency of various formulated zirconium-oxide (ZrO2) based nanofluids at different nanoparticle concentrations (0…
Reliable data transfer and energy efficiency are essential considerations for network performance in resource-constrained underwater environments. One efficient approach for data routing in underwater wireless sensor networks (UWSNs) is clustering, in which data packets are transferred from sensor nodes to a cluster head (CH). Data packets are then forwarded to a sink node in a single-hop or multi-hop manner, which can increase the energy depletion of the CH compared to other nodes. While several mechanisms have been proposed for cluster formation and CH selection to ensure efficient delivery of data packets, less attention has been given to massive data co…
Today, large amounts of geospatial data are available on the web, such as Google Map (GM), OpenStreetMap (OSM), the Flickr service, Wikimapia, and others. All of these services are referred to as open-source geospatial data. Geospatial data from different sources often has variable accuracy due to differing data collection methods; therefore, data accuracy may not meet user requirements across organizations. This paper aims to develop a tool to assess the quality of GM data by comparing it with formal data, such as spatial data from the Mayoralty of Baghdad (MB). The tool was developed in the Visual Basic language and validated on two different study areas in Baghdad, Iraq (Al-Karada and Al-Kadhumiyah). The positional accuracy was assessed…
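Positional accuracy of this kind is commonly summarized by the horizontal root-mean-square error (RMSE) between matched point pairs from the two datasets. A minimal sketch of that computation, using illustrative coordinates that are not from the study:

```python
import math

# Hypothetical matched point pairs: (easting, northing) in metres for the
# same features in GM data and in the reference (formal) data.
gm_pts  = [(100.0, 200.0), (150.0, 250.0), (300.0, 120.0)]
ref_pts = [(101.0, 199.0), (149.0, 251.5), (302.0, 121.0)]

def horizontal_rmse(pts_a, pts_b):
    """Root-mean-square of the 2D distances between matched point pairs."""
    sq = [(xa - xb) ** 2 + (ya - yb) ** 2
          for (xa, ya), (xb, yb) in zip(pts_a, pts_b)]
    return math.sqrt(sum(sq) / len(sq))

error_m = horizontal_rmse(gm_pts, ref_pts)  # metres
```

A lower RMSE indicates the open-source dataset agrees more closely with the formal reference data.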
Low-grade crude palm oil (LGCPO) is an attractive feedstock for biodiesel production due to its low cost and non-competition with food resources. Typically, LGCPO contains a high content of free fatty acids (FFA), making direct transesterification impossible due to the saponification reaction. Esterification is the typical pre-treatment process to reduce the FFA content and to produce fatty acid methyl ester (FAME). The pre-treatment of LGCPO using two different acid catalysts, titanium oxysulphate sulphuric acid complex hydrate (TiOSH) and 5-sulfosalicylic acid dihydrate (5-SOCAH), was investigated for the first time in this study. The optimum conditions for the homogeneous catalyst (5-SOCAH) were…
The calibration of a low-speed wind tunnel (LSWT) test section was carried out in the present work. The tunnel was designed and constructed at the Aerodynamics Lab in the Mechanical Engineering Department, University of Baghdad. The test section design speed is 70 m/s. Frictional losses and flow uniformity inside the test section were tested and calibrated based on the British standards for flow inside ducts and conduits. A Pitot-static tube and a boundary-layer Pitot tube were the main instruments used to measure the flow characteristics, with emphasis on velocity uniformity and boundary-layer growth along the walls of the test section. It is found that the maximum calibrated velocity for the empty test section…
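A Pitot-static tube infers velocity from the dynamic pressure via the standard incompressible relation q = ½ρv², so v = √(2q/ρ). A minimal sketch of that conversion (this is the textbook relation such calibrations rely on, with an assumed sea-level air density, not values taken from the paper):

```python
import math

RHO_AIR = 1.225  # kg/m^3, standard sea-level air at 15 deg C (assumption)

def velocity_from_dynamic_pressure(q_pa, rho=RHO_AIR):
    """Flow speed from Pitot-static dynamic pressure: v = sqrt(2*q/rho)."""
    return math.sqrt(2.0 * q_pa / rho)

# Dynamic pressure expected at the test section's 70 m/s design speed.
q_design = 0.5 * RHO_AIR * 70.0 ** 2   # pascals
```

At 70 m/s this gives roughly 3 kPa of dynamic pressure, which sets the range the pressure instrumentation must resolve.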
This research covers automated lineament extraction using the PCI Geomatica program, based on a satellite image, and lineament analysis using a GIS program. The analysis included density analysis, length-density analysis, and intersection-density analysis. When the slope map for the study area was calculated, a relationship was found between slope and lineament density.
Lineament density increases in regions with high slope values, showing that lineaments play an important role in the classification process, as they clearly isolate one class from the others; lineaments were also observed in Iranian territory. The analysis further shows that one lineament crosses the shoulders of the Galal Badra dam and the surrounding area, which should be taken into consideration.
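The length-density measure behind the maps described above is simply total lineament length per unit area of a grid cell. A minimal sketch with illustrative numbers (not values from the study):

```python
# Lineament length density: total lineament length within a grid cell
# divided by the cell area, in km per km^2. Inputs are hypothetical.
lineament_lengths_km = [2.4, 1.1, 3.7, 0.8]  # segments inside one cell
cell_area_km2 = 4.0                          # e.g. a 2 km x 2 km cell

density = sum(lineament_lengths_km) / cell_area_km2  # km / km^2
```

Repeating this per cell over a grid yields the density surface that is then compared against the slope map.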