Test case prioritization is a key part of system testing, intended to surface and resolve issues early in the development stage. Traditional prioritization techniques frequently fail to account for the complexities of large-scale test suites, evolving systems, and time constraints, and therefore cannot fully solve the problem. The study proposed here develops a hybrid meta-heuristic method aimed at these modern challenges. The strategy combines a genetic algorithm with the black hole algorithm to strike a smooth trade-off between exploring numerous candidate orderings and exploiting the best ones found. The proposed hybrid genetic black hole algorithm (HGBH) uses criteria such as code coverage, fault detection rate, and execution time to iteratively refine test case orderings. The approach was evaluated in experiments on a large-scale industrial software project. The hybrid meta-heuristic technique outperformed conventional techniques: it achieves higher code coverage, which in turn enables crucial defects to be detected at an early stage and testing resources to be allocated more effectively. In particular, the best APFD (Average Percentage of Faults Detected) value of 0.9321 was achieved in 6 generations with a computation time of 4.879 seconds. Overall, the approach yielded mean APFD values between 0.9247 and 0.9302, with run times ranging from 10.509 to 30.372 seconds. The experiments demonstrate the feasibility of the approach for complex systems and its ability to consistently track changes, enabling it to adapt to rapidly changing systems. In conclusion, this research provides a new hybrid meta-heuristic approach for test case prioritization and optimization, which helps to tackle the obstacles posed by large-scale test suites and constantly changing systems.
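As a concrete illustration of the fault detection metric quoted above, here is a minimal Python sketch of the standard APFD computation; the `ordering` and `faults` structures are hypothetical, and the HGBH search itself is not reproduced here.

```python
def apfd(ordering, faults):
    """Average Percentage of Faults Detected for a test ordering.

    ordering: list of test case ids, in prioritized execution order.
    faults:   dict fault_id -> set of test ids that detect the fault.
    """
    n = len(ordering)   # number of test cases
    m = len(faults)     # number of faults
    # 1-based position of the first test that reveals each fault
    first_pos = []
    for detecting_tests in faults.values():
        pos = next(i + 1 for i, t in enumerate(ordering)
                   if t in detecting_tests)
        first_pos.append(pos)
    return 1 - sum(first_pos) / (n * m) + 1 / (2 * n)

# Example: 5 tests, 3 faults -> APFD = 1 - 6/15 + 1/10 = 0.7
order = ["t3", "t1", "t5", "t2", "t4"]
fault_map = {"f1": {"t3"}, "f2": {"t1", "t4"}, "f3": {"t5"}}
print(round(apfd(order, fault_map), 4))
```

A prioritization search such as HGBH would use this value (possibly blended with coverage and execution-time terms) as its fitness function, rewarding orderings that reveal faults earlier.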
Semantic segmentation is effective in numerous object classification tasks, such as autonomous driving and scene understanding. With advances in deep learning, considerable effort has gone into applying deep learning algorithms to semantic segmentation. Most of these algorithms reach the required accuracy at the cost of their storage and computational requirements. This work implements a Convolutional Neural Network (CNN) using the Discrete Cosine Transform (DCT), which exhibits exceptional energy compaction properties. The proposed Adaptive Weight Wiener Filter (AWWF) rearranges the DCT coefficients by truncating the high-frequency coefficients. The AWWF-DCT model reinstates the convolutional layer …
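A minimal sketch of the coefficient-truncation step, assuming SciPy's DCT routines: keeping only a low-frequency block illustrates the energy compaction the abstract refers to, while the adaptive Wiener weighting of the actual AWWF is not reproduced.

```python
import numpy as np
from scipy.fftpack import dct, idct

def dct2(x):
    # separable 2-D type-II DCT with orthonormal scaling
    return dct(dct(x, axis=0, norm="ortho"), axis=1, norm="ortho")

def idct2(c):
    return idct(idct(c, axis=0, norm="ortho"), axis=1, norm="ortho")

def truncate_high_freq(coeffs, keep):
    """Zero all but the top-left keep x keep low-frequency coefficients."""
    out = np.zeros_like(coeffs)
    out[:keep, :keep] = coeffs[:keep, :keep]
    return out

patch = np.random.rand(8, 8)
approx = idct2(truncate_high_freq(dct2(patch), keep=4))
print(np.abs(patch - approx).mean())  # small reconstruction error
```

Dropping three quarters of the coefficients while keeping a small reconstruction error is exactly what makes DCT-domain layers attractive for cutting storage and compute.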
As reservoir conditions change continuously over the life of a reservoir, well production rate and performance change as well, and the well needs to be re-modeled according to the current conditions to keep the production rate as high as possible. Well productivity is affected by changes in reservoir pressure, water cut, tubing size, and wellhead pressure. For a well with an electrical submersible pump (ESP), it is also affected by the number of pump stages and the operating frequency. In general, the production rate increases when reservoir pressure increases and/or water cut decreases. Flow rate also increases when tubing size increases and/or wellhead pressure decreases. For an ESP well, production rate increases when the number of stages is increased and/or the pump frequency is increased …
Building a numerical reservoir simulation model that represents an actual case requires an enormous amount of data and information. Such modeling and simulation processes normally require lengthy time and different sets of field data and experimental tests that are usually very expensive. In addition, the availability, quality, and accessibility of all necessary data are very limited, especially for a green field. The complexity of such modeling increases significantly in the case of the heterogeneous nature typically inherent in unconventional reservoirs. From this perspective, this study focuses on exploring the possibility of simplifying the numerical simulation pr…
Compression is the reduction in size of data in order to save storage space or transmission time. For data transmission, compression can be performed on just the data content or on the entire transmission unit (including header data), depending on a number of factors. In this study, we consider an audio compression method based on text coding, in which an audio file is converted to a text file to reduce the time needed to transfer data over a communication channel. Approach: we propose two coding methods and optimize the solution using a context-free grammar (CFG). Results: testing the application with a 4-bit coding algorithm gave unsatisfactory results, so we propose a new approach to compress audio fil…
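A minimal sketch of the audio-to-text idea, assuming a nibble-to-character (4-bit) mapping; `zlib` stands in here for the paper's CFG-based text coder, which is not reproduced.

```python
import zlib

def audio_to_text(raw: bytes) -> str:
    # map each 4-bit nibble to one hex character, turning the
    # audio byte stream into a plain-text representation
    return raw.hex()

def compress_audio_as_text(raw: bytes) -> bytes:
    text = audio_to_text(raw)
    # text-coder stand-in: any text compressor could be plugged in here
    return zlib.compress(text.encode("ascii"))

samples = bytes(range(256)) * 4   # stand-in for audio file contents
packed = compress_audio_as_text(samples)
print(len(samples), "->", len(packed), "bytes")
```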
The growth of cloud computing services and the large-scale construction of data centers have led to excessive power consumption. Data centers contain a large number of servers, where most of the power consumption takes place. An efficient virtual machine placement algorithm is essential to minimize energy consumption and improve resource utilization by reducing the number of operating servers. In this paper, an enhanced discrete particle swarm optimization (EDPSO) algorithm is proposed. The enhancement of the discrete PSO algorithm is achieved by modifying the velocity update equation to bound the resulting particles and ensure feasibility. Furthermore, EDPSO is assisted by two heuristic algorithms: random first fit (RFF) and …
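The exact modified velocity equation is not given in this excerpt; the sketch below shows one plausible reading, assuming a classic PSO velocity update whose resulting positions are rounded and bounded to valid server indices so that each particle remains a feasible placement.

```python
import random

def edpso_step(position, velocity, pbest, gbest, n_servers,
               w=0.7, c1=1.5, c2=1.5):
    """One bounded update of a VM-to-server assignment particle.

    position[i] is the server index hosting VM i; pbest and gbest
    are the particle's and swarm's best assignments found so far.
    """
    new_pos, new_vel = [], []
    for i in range(len(position)):
        r1, r2 = random.random(), random.random()
        v = (w * velocity[i]
             + c1 * r1 * (pbest[i] - position[i])
             + c2 * r2 * (gbest[i] - position[i]))
        x = int(round(position[i] + v))
        x = max(0, min(n_servers - 1, x))   # bound: keep a valid server id
        new_pos.append(x)
        new_vel.append(v)
    return new_pos, new_vel
```

Bounding the rounded position is what keeps every candidate a legal server assignment; a heuristic such as RFF can then repair any placement that violates server capacity.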
The present research aimed to build a test of imagination for children. The sample consisted of 400 children, selected randomly from four directorates (first Resafe, second Resafe, first Alkarkh, second Alkarkh). To achieve the objective of the research, the two researchers built a test of imagination and established its face validity, item discrimination, and item difficulty coefficients; the final test consists of 32 items. The statistical methods used were the Pearson correlation coefficient, the item difficulty coefficient, item discrimination, the correlation equation, and the standard error equation. The two researchers concluded with a number of recommendations and proposals.
Every organization strives to exploit each possible opportunity for gaining success and sustaining its operations. In this context, organizational success can be defined as fulfilling end-user requirements while optimizing the use of available resources within a specified time and at an acceptable quality level, so as to gain maximum profit. The project ranking process is governed by a multi-criteria environment, which is more difficult for governmental organizations: whereas other organizations mainly target maximizing profit subject to available resources, a governmental organization must also consider human, social, economic, and many other factors. This paper focuses on building a multi-criteria optimizing proje…
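As one simple way to rank projects under several criteria, here is a hedged sketch using weighted-sum scoring over min-max normalized criteria; the criterion names and weights are illustrative, not the paper's actual model.

```python
def rank_projects(projects, weights):
    """Rank projects by a weighted sum of min-max normalized scores.

    projects: dict name -> dict criterion -> raw score (higher is better)
    weights:  dict criterion -> relative importance (sums to 1)
    """
    lo = {c: min(p[c] for p in projects.values()) for c in weights}
    hi = {c: max(p[c] for p in projects.values()) for c in weights}

    def score(p):
        # guard against a criterion where all projects tie (range 0)
        return sum(w * (p[c] - lo[c]) / ((hi[c] - lo[c]) or 1)
                   for c, w in weights.items())

    return sorted(projects, key=lambda name: score(projects[name]),
                  reverse=True)

projects = {
    "road":   {"social": 8, "economic": 5, "human": 7},
    "school": {"social": 9, "economic": 3, "human": 9},
    "plant":  {"social": 4, "economic": 9, "human": 5},
}
weights = {"social": 0.4, "economic": 0.3, "human": 0.3}
print(rank_projects(projects, weights))
```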
Background: Obesity typically results from a variety of contributing causes and factors, including genetics and lifestyle choices. It is described as an excessive accumulation of body fat and is a chronic disorder combining pathogenic environmental and genetic factors. The objective of the current study was therefore to investigate the association between the FTO gene rs9939609 polymorphism and obesity risk, explaining the relationship between the fat mass and obesity-associated gene (FTO) rs9939609 polymorphism and obesity in adults. Methods: We identified studies exploring the association between obesity risk and the FTO gene rs9939609 polymorphism, and combined the odds ratios (OR) for the total group and subgro…
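A minimal sketch of combining study-level odds ratios under a fixed-effect inverse-variance model; the 2x2 counts are hypothetical, and the paper's exact pooling model may differ.

```python
import math

def pooled_or(studies):
    """Fixed-effect (inverse-variance) pooled odds ratio.

    studies: list of (a, b, c, d) 2x2 counts, where
      a/b = cases/controls carrying the risk allele,
      c/d = cases/controls without it.
    """
    num = den = 0.0
    for a, b, c, d in studies:
        log_or = math.log((a * d) / (b * c))
        var = 1 / a + 1 / b + 1 / c + 1 / d   # variance of log(OR)
        num += log_or / var                   # inverse-variance weighting
        den += 1 / var
    return math.exp(num / den)

# hypothetical counts for three studies
print(round(pooled_or([(120, 90, 80, 110),
                       (60, 45, 50, 70),
                       (200, 150, 180, 210)]), 3))
```

Running the same pooling separately on subsets of studies (e.g., by ethnicity or genotype model) gives the subgroup analyses the abstract mentions.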