Test case prioritization is a key part of system testing, intended to surface and resolve issues early in the development stage. Traditional prioritization techniques frequently fail to account for the complexities of large-scale test suites, evolving systems, and time constraints, and therefore cannot fully solve this problem. This study proposes a hybrid meta-heuristic method aimed at these modern challenges. The strategy combines a genetic algorithm with the black hole algorithm to strike a smooth trade-off between exploring many candidate orderings and exploiting the best ones found so far. The proposed hybrid genetic black hole algorithm (HGBH) uses search criteria such as code coverage, fault detection rate, and execution time to refine the test case ordering iteratively. The approach was evaluated in experiments on a large-scale industrial software project. The hybrid meta-heuristic technique outperformed conventional techniques: it achieved higher code coverage, which in turn enabled crucial defects to be detected at an early stage and testing resources to be allocated more effectively. In particular, the best APFD (Average Percentage of Faults Detected) value was 0.9321, achieved in 6 generations taking 4.879 seconds. The approach also achieved mean APFD values of 0.9247 and 0.9302, with run times ranging from 10.509 seconds to 30.372 seconds. The experiments demonstrate the feasibility of the approach on complex systems and its ability to consistently detect changes and adapt to rapidly changing systems.
In conclusion, this research contributes a new hybrid meta-heuristic approach to test case prioritization and optimization, which helps to tackle the obstacles posed by large-scale test suites and constantly changing systems.
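The APFD figures quoted above follow the standard formula APFD = 1 - (sum of first-detection positions)/(n*m) + 1/(2n), where n is the number of test cases and m the number of faults. A minimal sketch of computing it for a candidate ordering, using a made-up fault matrix rather than the study's data:

```python
def apfd(ordering, faults_detected):
    """Average Percentage of Faults Detected for one test ordering.

    ordering: list of test-case ids, in execution order.
    faults_detected: dict mapping test id -> set of fault ids it reveals.
    """
    n = len(ordering)
    all_faults = set().union(*faults_detected.values())
    m = len(all_faults)
    # TF_i: 1-based position of the first test that reveals fault i
    first_pos = {}
    for pos, test in enumerate(ordering, start=1):
        for fault in faults_detected.get(test, ()):
            first_pos.setdefault(fault, pos)
    return 1 - sum(first_pos[f] for f in all_faults) / (n * m) + 1 / (2 * n)

# Hypothetical 5-test, 4-fault example: a fault-revealing-first ordering
# scores higher than one that defers the revealing tests.
detects = {"t1": {"f1", "f2"}, "t2": {"f3"}, "t3": set(), "t4": {"f4"}, "t5": set()}
good = apfd(["t1", "t2", "t4", "t3", "t5"], detects)  # 0.75
bad = apfd(["t5", "t3", "t4", "t2", "t1"], detects)   # 0.25
```

A prioritizer such as HGBH searches over orderings to maximize exactly this score, alongside coverage and execution-time criteria.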
E-news is one of the most important journalistic forms in new media (the Internet). The way the journalist tells the story is an important aspect of the communicative process between Internet users and the reporter. Electronic news is characterized by combining text, still images, animations, video, and sound. All of these lend greater vitality to the communicative process and enlarge its semiotic dimensions; they also make the narrative process more distinctive and a fuller embodiment of the elements of the event. This research studies these aspects and attempts to show the distinction between the semiotics of narration and electronic news.
The problem of image captioning, which involves automatically generating text to describe an image's visual information, has become feasible with the developments in object recognition and image classification. Deep learning has received much interest from the scientific community and can be very useful in real-world applications. The proposed image captioning approach combines pre-trained Convolutional Neural Network (CNN) models with Long Short-Term Memory (LSTM) networks to generate image captions. The process includes two stages: the first stage trains the CNN-LSTM models using baseline hyper-parameters, and the second stage trains the CNN-LSTM models while optimizing and adjusting the hyper-parameters of
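The two-stage scheme can be sketched abstractly. Here `train_model` is a hypothetical stand-in for fitting a CNN-LSTM captioner and returning a validation score (a toy closed form so the sketch runs, not the paper's implementation), and the grid values are illustrative:

```python
import itertools

def train_model(learning_rate, embedding_dim, dropout):
    """Hypothetical stand-in: fit a CNN-LSTM captioner and return a
    validation score. A toy closed form is used so the sketch runs
    (embedding_dim is ignored by it); a real implementation would
    train on image-caption pairs and report e.g. BLEU."""
    return 1.0 - abs(learning_rate - 1e-3) * 100 - abs(dropout - 0.3)

# Stage 1: train once with baseline hyper-parameters.
baseline = {"learning_rate": 1e-2, "embedding_dim": 256, "dropout": 0.5}
baseline_score = train_model(**baseline)

# Stage 2: retrain while adjusting hyper-parameters, keeping the best model.
grid = {
    "learning_rate": [1e-2, 1e-3],
    "embedding_dim": [256, 512],
    "dropout": [0.3, 0.5],
}
best_score, best_cfg = baseline_score, baseline
for values in itertools.product(*grid.values()):
    cfg = dict(zip(grid.keys(), values))
    score = train_model(**cfg)
    if score > best_score:
        best_score, best_cfg = score, cfg
```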
The aim of a human lower-limb rehabilitation robot is to restore the ability of motion and to strengthen weak muscles. This paper proposes the design of a force-position control scheme for a four-Degree-Of-Freedom (4-DOF) lower-limb wearable rehabilitation robot. The robot consists of hip, knee, and ankle joints that enable the patient to move and to turn in both directions. The joints are actuated by Pneumatic Muscle Actuators (PMAs), which have very great potential in medical applications because of their similarity to biological muscles. The force-position control incorporates Takagi-Sugeno-Kang three-Proportional-Derivative-like Fuzzy Logic (TSK-3-PD) controllers for position control and three-Proportional (3-P) controllers for force control
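A full TSK fuzzy force-position controller is beyond the scope of an abstract, but the PD position loop at the core of each joint controller can be sketched for a single joint. The gains, time step, and first-order joint model below are illustrative assumptions, not the paper's parameters or PMA dynamics:

```python
def simulate_pd(target, kp=8.0, kd=0.5, dt=0.01, steps=500):
    """Drive a single joint angle toward `target` (radians) with a PD law.

    Toy plant: the joint's angular velocity follows the commanded
    torque directly (no inertia, friction, or pneumatic-muscle dynamics).
    """
    angle, prev_error = 0.0, target
    for _ in range(steps):
        error = target - angle
        torque = kp * error + kd * (error - prev_error) / dt
        angle += torque * dt  # simplistic plant integration
        prev_error = error
    return angle

final = simulate_pd(target=1.0)  # converges close to 1.0 rad
```

In the paper's scheme, the fixed gains above would instead be shaped by the TSK fuzzy rules, and a parallel proportional loop would regulate contact force.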
This paper discusses an optimal path-planning algorithm based on an Adaptive Multi-Objective Particle Swarm Optimization (AMOPSO) algorithm for two case studies. In the first case, a single robot must reach a goal in a static environment containing two obstacles and two danger sources. In the second, five robots must each reach their goal by the shortest path. For the first case, the proposed algorithm solves the optimization problem by finding the minimum-distance path from the initial position to the goal while ensuring that the generated path keeps a maximum distance from the danger zones. For the second case, it finds the shortest path for every robot, without any collision between them, in the shortest time.
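The single-robot objective, minimizing path length while keeping clear of danger sources, can be sketched as a weighted-sum cost optimized by a plain PSO over one intermediate waypoint. The environment, penalty weight, and swarm settings below are illustrative, not the paper's AMOPSO:

```python
import math
import random

random.seed(0)
START, GOAL = (0.0, 0.0), (10.0, 10.0)
DANGERS = [(4.0, 4.0), (6.0, 7.0)]  # illustrative danger sources

def cost(waypoint):
    """Path length via one waypoint plus a penalty for passing near danger."""
    length = math.dist(START, waypoint) + math.dist(waypoint, GOAL)
    clearance = min(math.dist(waypoint, d) for d in DANGERS)
    return length + 5.0 / (clearance + 0.1)

# Plain PSO over the waypoint's (x, y): inertia plus pulls toward each
# particle's personal best and the swarm's global best.
particles = [[random.uniform(0, 10), random.uniform(0, 10)] for _ in range(20)]
velocity = [[0.0, 0.0] for _ in particles]
pbest = [p[:] for p in particles]
gbest = min(pbest, key=cost)
for _ in range(100):
    for i, p in enumerate(particles):
        for d in range(2):
            velocity[i][d] = (0.7 * velocity[i][d]
                              + 1.5 * random.random() * (pbest[i][d] - p[d])
                              + 1.5 * random.random() * (gbest[d] - p[d]))
            p[d] += velocity[i][d]
        if cost(p) < cost(pbest[i]):
            pbest[i] = p[:]
    gbest = min(pbest, key=cost)
```

The swarm settles on a waypoint that trades a slightly longer route for clearance from the danger sources; the straight-line midpoint (5, 5), which passes close to a danger source, costs more.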
A spectrophotometric determination of azithromycin was optimized using the simplex method. The approach proved to be accurate and sensitive. The analyte was reacted with bromothymol blue (BTB) to form a colored ion pair, which was extracted into chloroform from a potassium phthalate buffer medium of pH 4. The extracted colored product was assayed at 415 nm and exhibited a linear quantification range of 1-20 µg/ml. The excipients did not exhibit any interference with the proposed approach for assaying azithromycin in pharmaceutical formulations.
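The linear quantification range implies a standard calibration line, absorbance at 415 nm versus concentration. A minimal least-squares sketch with made-up absorbance readings, not the study's measurements:

```python
def fit_line(x, y):
    """Ordinary least squares for y = slope * x + intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical standards across the 1-20 ug/ml range (absorbance at 415 nm).
conc = [1, 5, 10, 15, 20]
absorbance = [0.052, 0.248, 0.501, 0.748, 1.003]
slope, intercept = fit_line(conc, absorbance)

def concentration(a415):
    """Read an unknown sample's concentration off the calibration line."""
    return (a415 - intercept) / slope
```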
This study proposes a hybrid predictive maintenance framework that integrates the Kolmogorov-Arnold Network (KAN) with Short-Time Fourier Transform (STFT) for intelligent fault diagnosis in industrial rotating machinery. The method is designed to address challenges posed by non-linear and non-stationary vibration signals under varying operational conditions. Experimental validation using the FALEX multispecimen test bench demonstrated a high classification accuracy of 97.5%, outperforming traditional models such as SVM, Random Forest, and XGBoost. The approach maintained robust performance across dynamic load scenarios and noisy environments, with precision and recall exceeding 95%. Key contributions include a hardware-accelerated K
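The STFT front end that turns non-stationary vibration signals into time-frequency features for the classifier can be sketched in a few lines. The Hann window, frame sizes, and test tone are illustrative assumptions; the study's pipeline feeds such features from bench vibration sensors into the KAN:

```python
import cmath
import math

def stft_magnitudes(signal, frame=64, hop=32):
    """Magnitude spectra of Hann-windowed frames: the time-frequency
    features a downstream fault classifier would consume."""
    frames = []
    for start in range(0, len(signal) - frame + 1, hop):
        window = [signal[start + n]
                  * 0.5 * (1 - math.cos(2 * math.pi * n / (frame - 1)))
                  for n in range(frame)]
        # Direct DFT of the windowed frame (keep non-negative bins only).
        spectrum = [abs(sum(window[n] * cmath.exp(-2j * math.pi * k * n / frame)
                            for n in range(frame)))
                    for k in range(frame // 2)]
        frames.append(spectrum)
    return frames

# A 50 Hz tone sampled at 1 kHz: spectral energy should peak near
# bin 50 / (1000 / 64), i.e. around bin 3.
sig = [math.sin(2 * math.pi * 50 * t / 1000) for t in range(256)]
feats = stft_magnitudes(sig)
```

A production pipeline would use an FFT-based STFT; the direct DFT here just keeps the sketch dependency-free.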
Many consumers of electric power exceed the consumption limit permitted by the electrical power distribution stations. We therefore propose a validation approach that works intelligently, applying machine learning (ML) technology to teach electrical consumers how to consume properly without wasting energy. The validation approach belongs to a large family of intelligent processes related to energy consumption, called efficient energy consumption management (EECM) approaches, and it is connected with Internet of Things (IoT) technology and linked to the Google Firebase Cloud, where a utility center is used to check whether the consumption of the efficient energy is s
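The core cloud-side validation, checking whether a meter reading stays within the station's permitted limit, can be sketched as follows. The limit and readings are illustrative, and a fixed threshold stands in for the EECM system's learned model:

```python
PERMITTED_LIMIT_KWH = 30.0  # hypothetical daily limit set by the station

def validate_consumption(daily_readings_kwh):
    """Flag the days on which consumption exceeded the permitted limit,
    the kind of check the cloud-side utility center would run on data
    uploaded from IoT meters."""
    return [(day, kwh)
            for day, kwh in enumerate(daily_readings_kwh, start=1)
            if kwh > PERMITTED_LIMIT_KWH]

violations = validate_consumption([22.5, 31.2, 28.0, 35.7])
```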