Test case prioritization is a key part of system testing, intended to surface and resolve issues early in the development stage. Traditional prioritization techniques frequently fail to account for the complexities of large-scale test suites, evolving systems, and time constraints, and therefore cannot fully solve this problem. The study proposed here deals with a hybrid meta-heuristic method that addresses these modern challenges. The strategy combines a genetic algorithm with the black hole algorithm to strike a balance between exploring numerous candidate orderings and exploiting the best ones found so far. The proposed hybrid genetic black hole algorithm (HGBH) uses search criteria such as code coverage, fault detection rate, and execution time to refine test case orderings iteratively. The approach was evaluated through experiments on a large-scale industrial software project. The hybrid meta-heuristic technique outperformed conventional techniques: it achieved higher code coverage, which in turn enabled crucial defects to be detected at an early stage and testing resources to be allocated more effectively. In particular, the best APFD value was 0.9321, achieved in 6 generations and 4.879 seconds of computation time. Beyond this, the approach yielded mean APFD values of 0.9247 to 0.9302, with execution times ranging from 10.509 to 30.372 seconds. The experiments demonstrate the feasibility of the approach for complex systems and its ability to consistently accommodate changes, allowing it to adapt to rapidly evolving systems. In conclusion, this research provides a new hybrid meta-heuristic approach to test case prioritization and optimization that helps tackle the obstacles posed by large-scale test suites and constantly changing systems.
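The APFD (Average Percentage of Faults Detected) metric reported above has a standard definition: APFD = 1 - (sum of TF_i)/(n*m) + 1/(2n), where n is the number of test cases, m the number of faults, and TF_i the position of the first test that exposes fault i. A minimal Python sketch of that computation follows; the function name and example data are illustrative, not taken from the paper.

    # APFD for one test ordering; names and example data are illustrative.
    # ordering: list of test IDs in prioritized order.
    # fault_matrix: dict fault_id -> set of test IDs that detect that fault.
    def apfd(ordering, fault_matrix):
        n, m = len(ordering), len(fault_matrix)
        position = {test: i + 1 for i, test in enumerate(ordering)}
        # TF_i: 1-based position of the first test exposing fault i
        tf_sum = sum(min(position[t] for t in detectors)
                     for detectors in fault_matrix.values())
        return 1 - tf_sum / (n * m) + 1 / (2 * n)

    # Example: 5 tests, 4 faults
    faults = {"f1": {"t3"}, "f2": {"t1", "t4"}, "f3": {"t5"}, "f4": {"t2"}}
    print(apfd(["t3", "t1", "t2", "t5", "t4"], faults))  # ~0.6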
A database is an arrangement of data organized and distributed in a way that allows the client to access the stored data easily and conveniently. However, in the era of big data, traditional data-analytics methods may not be able to manage and process such large volumes of data. To develop an efficient way of handling big data, this work studies the use of the MapReduce technique to handle big data distributed on the cloud. The approach was evaluated using a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear improvement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r…
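As background for the MapReduce pattern the abstract relies on, here is a minimal Hadoop-streaming-style sketch in Python. The workload shown (mean amplitude per EEG channel from comma-separated "channel,sample" lines) and all names are assumptions for illustration, not the paper's actual job.

    import sys

    def mapper(lines):
        # Emit (channel, sample) pairs; the "channel,sample" input format
        # is an assumption for illustration.
        for line in lines:
            channel, value = line.strip().split(",")
            print(f"{channel}\t{value}")

    def reducer(lines):
        # Hadoop streaming delivers reducer input sorted by key, so one
        # pass can aggregate each channel's mean amplitude.
        current, total, count = None, 0.0, 0
        for line in lines:
            key, value = line.strip().split("\t")
            if current is not None and key != current:
                print(f"{current}\t{total / count}")
                total, count = 0.0, 0
            current = key
            total += float(value)
            count += 1
        if current is not None:
            print(f"{current}\t{total / count}")

    if __name__ == "__main__":
        # Under Hadoop streaming each half runs as a separate script, e.g.:
        # hadoop jar hadoop-streaming.jar -input eeg/ -output means/ \
        #   -mapper mapper.py -reducer reducer.py
        mapper(sys.stdin) if sys.argv[1:] == ["map"] else reducer(sys.stdin)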
Lattakia city faces many problems related to the mismanagement of solid waste, as disposal is limited to the unregulated Al-Bassa landfill, with no treatment. Solid waste management therefore poses a particular challenge to decision-makers: choosing an appropriate tool that supports strategic decisions on municipal solid waste treatment methods and on evaluating their management systems. Since humans are primarily responsible for generating waste, this study aims to measure the degree of environmental awareness in the Lattakia Governorate from the point of view of the research sample members, and to discuss the effect of the studied variables (place of residence, educational level, gender, age, and professional status) on…
Purpose: We report a series of 29 pediatric patients who sustained head injuries caused by metallic ceiling fans. All were admitted to the Emergency Department of the Neurosurgery Teaching Hospital in Baghdad, Iraq, between January 2015 and January 2017. Results: Pediatric ceiling-fan head injuries are characterized by four traits that distinguish them from other types of head injury: 1- Most occurred between the ages of two and five, as a result of climbing on or jumping from furniture. 2- Most patients sustained a compound depressed skull fracture associated with intracranial lesions and pneumocephalus. 3- The most common indication for surgical intervention was a dirty wound contaminated with hair. 4- These variables were stati…
The study aims at expounding the correlation and effect between the human resource development strategy and Quality Municipality Service, within a theoretical framework and a practical framework conducted at the Directorate of Municipalities in holy Karbala. During a pilot study the researcher found that the Directorate of Municipalities does not pay enough attention to developing its human resources using one strategy or a number of strategies, nor to their effect on Quality Municipality Service. Thus a number of research questions were set concerning the existence of a clear perception in the Directorates of Municipalities regarding the strategies of developing both the human resource and Quality Municipality Service…
This paper deals with a Twin Rotor Aerodynamic System (TRAS), a Multi-Input Multi-Output (MIMO) system with high cross-coupling between its two channels. It proposes a hybrid design procedure that combines frequency-response and root-locus approaches. The proposed controller is designated a PID-Lead Compensator (PIDLC); the PID controller was designed in previous work using frequency-response design specifications, while the lead compensator proposed in this paper is designed using the root-locus method. A general explicit formula for angle computations in any of the four quadrants is also given. The lead compensator is designed by shifting the dominant closed-loop poles slightly to the left in the s-plane. This has the effect…
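The abstract does not reproduce its four-quadrant angle formula, but in root-locus design the angle contributed at a trial closed-loop pole by each open-loop zero and pole is conventionally computed with a quadrant-aware arctangent (atan2). A minimal sketch under that assumption; the plant poles and desired dominant pole below are made up for illustration.

    import math

    def angle_deg(point, root):
        # Angle (degrees) of the vector from an open-loop zero/pole to the
        # trial point, resolved into the correct quadrant by atan2.
        d = point - root
        return math.degrees(math.atan2(d.imag, d.real))

    def net_angle(point, zeros, poles):
        # Angle condition: on the root locus this equals an odd multiple
        # of 180 degrees.
        return (sum(angle_deg(point, z) for z in zeros)
                - sum(angle_deg(point, p) for p in poles))

    s_d = complex(-3.0, 3.0)           # desired dominant closed-loop pole
    plant_poles = [0 + 0j, -4 + 0j]    # assumed plant, for illustration
    net = net_angle(s_d, [], plant_poles)
    lead_angle = -180.0 - net          # deficiency the lead pair must supply
    print(f"net = {net:.2f} deg, lead must contribute {lead_angle:.2f} deg")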
In this paper, the goal of the proposed method is to protect data against different types of attacks by unauthorized parties. The basic idea is to generate a private key from specific features of a digital color image, namely its color channels (red, green, and blue). The key is generated by computing the frequency of each blue-channel value in the image, finding the maximum blue-color frequency, multiplying it by the value at which it occurs, and performing an addition to produce the generated key. Once the private key is generated, it must be converted into binary representation. The generated key is extracted from the blue channel of the keyed image, then selects a c…
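A sketch of the key-derivation step under one reading of the description (blue-channel histogram, peak frequency multiplied by the value at which it occurs, then an addition, rendered in binary). The libraries and the exact combination rule are assumptions, since the abstract leaves them underspecified.

    import numpy as np
    from PIL import Image

    def derive_key(image_path):
        # Blue-channel histogram of the keyed image.
        rgb = np.asarray(Image.open(image_path).convert("RGB"))
        freq = np.bincount(rgb[:, :, 2].ravel(), minlength=256)
        peak_value = int(freq.argmax())      # blue value with max frequency
        peak_count = int(freq[peak_value])   # that maximum frequency
        # Assumed combination rule: product of the peak frequency and its
        # value, plus both terms; the abstract leaves this underspecified.
        key = peak_count * peak_value + peak_count + peak_value
        return bin(key)[2:]                  # binary representation of the key

    print(derive_key("keyed_image.png"))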
Binary pseudonoise (PN) sequences with specific properties, namely long period, high complexity, randomness, and minimal cross- and auto-correlation, are essential for some communication systems. In this research a nonlinear PN generator is introduced. It consists of a combination of basic components such as a Linear Feedback Shift Register (LFSR) and a ?-element, which is a type of R×R crossbar switch. The period and complexity of the sequences produced by the proposed generator are computed, and the randomness properties of these sequences are measured by well-known randomness tests.
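As background for the LFSR building block named above, a minimal Fibonacci LFSR sketch follows; the 4-bit register and tap positions are a textbook maximal-length example, not the paper's configuration.

    def lfsr(seed, taps, nbits, length):
        # Fibonacci LFSR: output the LSB, XOR the tapped bits (1-indexed
        # from the LSB), shift the feedback bit into the MSB.
        state = seed
        out = []
        for _ in range(length):
            out.append(state & 1)
            fb = 0
            for t in taps:
                fb ^= (state >> (t - 1)) & 1
            state = (state >> 1) | (fb << (nbits - 1))
        return out

    # Taps (4, 1) on a 4-bit register give the maximal period 2**4 - 1 = 15;
    # a textbook example, not the paper's configuration.
    print(lfsr(seed=0b1001, taps=(4, 1), nbits=4, length=15))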
The main challenge is to protect the environment from future deterioration due to pollution and the depletion of natural resources. One of the most important things to address, and whose negative impact must be eliminated, is solid waste. Solid waste is a double-edged sword depending on how it is handled: neglecting it creates serious environmental risks through water, air, and soil pollution, while dealing with it correctly makes it an important resource for preserving the environment. Accordingly, the proper management of solid waste and its reuse or recycling is the most important factor, and attention has turned to using solid waste in different ways, the most common of which is to use it as an alternative…
Face recognition is a crucial biometric technology used in various security and identification applications. Ensuring accuracy and reliability in facial recognition systems requires robust feature extraction and secure processing methods. This study presents an accurate facial recognition model using a feature-extraction approach within a cloud environment. First, the facial images undergo preprocessing, including grayscale conversion, histogram equalization, Viola-Jones face detection, and resizing. Then, features are extracted using a hybrid approach that combines Linear Discriminant Analysis (LDA) and the Gray-Level Co-occurrence Matrix (GLCM). The extracted features are encrypted using the Data Encryption Standard (DES) for security…
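A compact sketch of the preprocessing and GLCM stages as described, with OpenCV and scikit-image as assumed implementation choices; the resize target and GLCM offsets are illustrative, and the LDA and DES stages are only noted in comments.

    # Preprocessing + GLCM texture features for one face image.
    # OpenCV / scikit-image are assumed choices; the paper's exact
    # parameters (resize target, GLCM offsets) are not given, so the
    # values below are illustrative.
    import cv2
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    def face_features(path, size=(128, 128)):
        gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
        gray = cv2.equalizeHist(gray)                  # histogram equalization
        cascade = cv2.CascadeClassifier(
            cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
        # Viola-Jones detection; assumes at least one face is found.
        x, y, w, h = cascade.detectMultiScale(gray, 1.1, 5)[0]
        face = cv2.resize(gray[y:y + h, x:x + w], size)
        glcm = graycomatrix(face, distances=[1],
                            angles=[0, np.pi / 2], levels=256, normed=True)
        props = ["contrast", "homogeneity", "energy", "correlation"]
        feats = np.hstack([graycoprops(glcm, p).ravel() for p in props])
        # In the described pipeline these features would be combined with
        # LDA projections, then DES-encrypted before storage/matching.
        return feats

    print(face_features("face.jpg"))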