In combinatorial testing, the construction of covering arrays is the key challenge because of the many factors that influence it. A wide range of combinatorial problems can be solved using metaheuristic and greedy techniques, and combining a greedy technique with a metaheuristic search technique such as hill climbing (HC) can produce feasible results for combinatorial test generation. Metaheuristic methods handle the tuples that a greedy strategy may leave uncovered after redundancy removal, so the final result is driven toward a near-optimal size. The use of both greedy and HC algorithms in a single test generation system is therefore a strong candidate if constructed correctly. This study presents a hybrid greedy hill climbing algorithm (HGHC) that ensures both effectiveness and near-optimal results when generating a small set of test data. To confirm that the proposed HGHC outperforms the most widely used techniques in terms of test size, it is compared against them. In contrast to recent practices for producing covering arrays (CAs) and mixed covering arrays (MCAs), this hybrid strategy is superior because it reduces the array size while limiting the loss of unique pairings during CA/MCA generation.
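As a rough illustration of the greedy-plus-hill-climbing idea (not the paper's HGHC itself), the sketch below builds a pairwise test suite greedily and then applies a simple hill-climbing pass that tries to drop rows and repair coverage. The parameter domains, candidate count, and iteration budget are illustrative assumptions.

```python
"""Minimal sketch of a hybrid greedy + hill-climbing pairwise generator.
This is an illustrative assumption of how such a hybrid can be wired up,
not the HGHC algorithm from the paper."""
import itertools
import random

def all_pairs(domains):
    """Every (param_i, value_i, param_j, value_j) combination that must be covered."""
    pairs = set()
    for (i, vi), (j, vj) in itertools.combinations(enumerate(domains), 2):
        for a in vi:
            for b in vj:
                pairs.add((i, a, j, b))
    return pairs

def pairs_of(row):
    """Pairs covered by one test row."""
    return {(i, row[i], j, row[j]) for i, j in itertools.combinations(range(len(row)), 2)}

def greedy_suite(domains, candidates=30):
    """Greedy phase: each new row is seeded with an uncovered pair, the best of a few candidates wins."""
    uncovered = all_pairs(domains)
    suite = []
    while uncovered:
        best, best_gain = None, -1
        for _ in range(candidates):
            i, a, j, b = random.choice(list(uncovered))
            row = [random.choice(v) for v in domains]
            row[i], row[j] = a, b                       # guarantees at least one new pair
            gain = len(pairs_of(row) & uncovered)
            if gain > best_gain:
                best, best_gain = tuple(row), gain
        suite.append(best)
        uncovered -= pairs_of(best)
    return suite

def hill_climb(suite, domains, iters=2000):
    """Local-search phase: try to drop a row and repair coverage by mutating the remaining rows."""
    required = all_pairs(domains)
    suite = [list(r) for r in suite]
    for _ in range(iters):
        if len(suite) <= 1:
            break
        victim = random.randrange(len(suite))
        trial = [r[:] for k, r in enumerate(suite) if k != victim]
        missing = required - set().union(*(pairs_of(r) for r in trial))
        for (i, a, j, b) in missing:                    # greedy repair of lost pairs
            r = random.choice(trial)
            r[i], r[j] = a, b
        if required <= set().union(*(pairs_of(r) for r in trial)):
            suite = trial                               # accept only still-valid, smaller suites
    return [tuple(r) for r in suite]

if __name__ == "__main__":
    domains = [[0, 1], [0, 1, 2], [0, 1], [0, 1, 2]]    # mixed-level example (assumed)
    suite = hill_climb(greedy_suite(domains), domains)
    print(len(suite), "tests covering all pairs")
```

The greedy phase guarantees full pair coverage quickly, while the local search only accepts a row removal when coverage can be restored, mirroring the division of labour described in the abstract.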
An Optimal Algorithm for HTML Page Building Process
Abstract
A hexapod robot is a flexible mechanical robot with six legs that is able to walk over terrain. Because the hexapod resembles an insect, it uses the same gaits: tripod, wave, and ripple. The hexapod robot must remain statically stable at all times during each gait, keeping three or more legs in continuous contact with the ground so that it does not fall; the safety margin of statically stable walking is called the stability margin. In this paper, the forward and inverse kinematics are derived for each of the hexapod's legs in order to simulate the robot model walking in MATLAB R2010a for all gaits, and the geometry is used to derive the equations of the sub-constraint workspaces for each …
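For readers unfamiliar with the underlying computation, the following is a minimal sketch of forward and inverse kinematics for a single three-joint leg (coxa yaw plus femur and tibia pitch). The link lengths, frames, and sign conventions are assumptions made for illustration and are not taken from the paper's derivation.

```python
"""Minimal sketch of forward/inverse kinematics for one 3-DOF hexapod leg.
Link lengths, frames and sign conventions are illustrative assumptions."""
import math

L1, L2, L3 = 0.05, 0.10, 0.15   # coxa, femur, tibia lengths in metres (assumed)

def forward(theta1, theta2, theta3):
    """Foot position in the leg's base frame."""
    r = L1 + L2 * math.cos(theta2) + L3 * math.cos(theta2 + theta3)
    x = r * math.cos(theta1)
    y = r * math.sin(theta1)
    z = L2 * math.sin(theta2) + L3 * math.sin(theta2 + theta3)
    return x, y, z

def inverse(x, y, z):
    """Joint angles reaching (x, y, z); knee-down branch of the planar 2-link solution."""
    theta1 = math.atan2(y, x)
    r = math.hypot(x, y) - L1                                   # planar reach after the coxa
    d = (r * r + z * z - L2 * L2 - L3 * L3) / (2 * L2 * L3)     # law of cosines
    theta3 = math.atan2(-math.sqrt(max(0.0, 1 - d * d)), d)
    theta2 = math.atan2(z, r) - math.atan2(L3 * math.sin(theta3), L2 + L3 * math.cos(theta3))
    return theta1, theta2, theta3

if __name__ == "__main__":
    foot = forward(0.3, -0.2, -0.9)
    print(foot, inverse(*foot))   # round trip reproduces the angles for the knee-down pose
```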
For several applications, it is very important to have an edge detection technique that matches human visual contour perception and is less sensitive to noise. The edge detection algorithm described in this paper is based on the results obtained by maximum a posteriori (MAP) and maximum entropy (ME) deblurring algorithms. The technique makes a trade-off between sharpening and smoothing the noisy image. One advantage of the described algorithm is that it is less sensitive to noise than the Marr and Geuen techniques, which are considered among the best edge detection algorithms in terms of matching human visual contour perception.
Krawtchouk polynomials (KPs) and their moments are promising techniques for applications in information theory, coding theory, and signal processing, owing to the special capabilities of KPs in feature extraction and classification. The main challenge in existing KP recurrence algorithms is numerical error, which occurs when computing the coefficients for large polynomial sizes, particularly when the KP parameter (p) deviates from 0.5 toward 0 or 1. To this end, this paper proposes a new recurrence relation for computing the coefficients of KPs at high orders. In particular, the paper develops a new algorithm and presents a new mathematical model for computing the …
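For context, the classical three-term recurrence that such algorithms start from can be written down directly. The sketch below implements that textbook relation for K_n(x; p, N) = 2F1(-n, -x; -N; 1/p), not the paper's new recurrence; at large N or extreme p it exhibits exactly the numerical blow-up the paper targets.

```python
"""Minimal sketch of the classical three-term recurrence for Krawtchouk
polynomials (the textbook relation, not the paper's proposed one)."""
import numpy as np

def krawtchouk(n_max, p, N):
    """Return a (n_max+1, N+1) table K with K[n, x] = K_n(x; p, N) for x = 0..N."""
    x = np.arange(N + 1, dtype=float)
    K = np.zeros((n_max + 1, N + 1))
    K[0] = 1.0
    if n_max >= 1:
        K[1] = 1.0 - x / (p * N)
    for n in range(1, n_max):
        # p(N-n) K_{n+1}(x) = [p(N-n) + n(1-p) - x] K_n(x) - n(1-p) K_{n-1}(x)
        K[n + 1] = ((p * (N - n) + n * (1 - p) - x) * K[n]
                    - n * (1 - p) * K[n - 1]) / (p * (N - n))
    return K

if __name__ == "__main__":
    K = krawtchouk(4, 0.5, 8)
    print(K[2])   # K_2(x; 0.5, 8) for x = 0..8
```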
Solid waste is a major issue in today's world and can contribute to pollution and the spread of vector-borne diseases. Because of its complicated nonlinear processes, this problem is difficult to model and optimize using traditional methods. In this study, a mathematical model was developed to optimize the cost of solid waste recycling and management. In the optimization phase, the salp swarm algorithm (SSA) is used to determine the levels of discarded and reclaimed solid waste. SSA is a recent optimization technique for finding the ideal solution of a mathematical relationship based on leaders and followers. It takes many random solutions, as well as their outward or inward fluctuations, t…
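As a hedged illustration of the leader/follower mechanics mentioned above, the sketch below implements the standard SSA update on a placeholder objective. The waste-management cost model, bounds, population size, and iteration count are assumptions, not the study's.

```python
"""Minimal sketch of the salp swarm algorithm (SSA) on a generic cost function;
the study's waste-management cost model is not reproduced here."""
import numpy as np

def ssa(cost, lb, ub, pop=30, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    dim = lb.size
    salps = rng.uniform(lb, ub, size=(pop, dim))
    food = min(salps, key=cost).copy()                 # best solution found so far
    for l in range(1, iters + 1):
        c1 = 2.0 * np.exp(-(4.0 * l / iters) ** 2)     # exploration/exploitation balance
        for i in range(pop):
            if i == 0:                                 # leader moves around the food source
                c2, c3 = rng.random(dim), rng.random(dim)
                step = c1 * ((ub - lb) * c2 + lb)
                salps[i] = np.where(c3 < 0.5, food + step, food - step)
            else:                                      # followers trail the salp in front
                salps[i] = (salps[i] + salps[i - 1]) / 2.0
            salps[i] = np.clip(salps[i], lb, ub)
            if cost(salps[i]) < cost(food):
                food = salps[i].copy()
    return food

if __name__ == "__main__":
    sphere = lambda x: float(np.sum(x ** 2))           # placeholder objective (assumed)
    print(ssa(sphere, lb=[-5] * 4, ub=[5] * 4))
```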
In many video and image processing applications, frames are partitioned into blocks, which are extracted and processed sequentially. In this paper, we propose a fast algorithm for calculating features of overlapping image blocks. We assume the features are projections of the block onto separable 2D basis functions (usually orthogonal polynomials), where we benefit from symmetry with respect to the spatial variables. The main idea is based on the construction of auxiliary matrices that virtually extend the original image and make it possible to avoid time-consuming computation in loops. These matrices can be pre-calculated, stored, and used repeatedly, since they are independent of the image itself. We validated experimentally th…
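To make the baseline concrete, the sketch below computes the separable projections BᵀXB for every overlapping block by brute force, using an orthonormal DCT basis as a stand-in for the orthogonal polynomials. The paper's auxiliary-matrix acceleration is not reproduced; the basis choice and block size are illustrative assumptions.

```python
"""Minimal sketch of the straightforward per-block separable projection that
the paper's auxiliary-matrix method accelerates (brute-force baseline only)."""
import numpy as np

def dct_basis(n):
    """Orthonormal DCT-II basis as a stand-in for orthogonal polynomials; columns are basis functions."""
    k = np.arange(n)
    B = np.cos(np.pi * (2 * k[:, None] + 1) * k[None, :] / (2 * n))
    B[:, 0] *= 1 / np.sqrt(n)
    B[:, 1:] *= np.sqrt(2 / n)
    return B

def block_features(image, n):
    """Feature matrices Bᵀ X B for all overlapping n×n blocks (time-consuming loop version)."""
    B = dct_basis(n)
    H, W = image.shape
    feats = np.empty((H - n + 1, W - n + 1, n, n))
    for r in range(H - n + 1):
        for c in range(W - n + 1):
            X = image[r:r + n, c:c + n]
            feats[r, c] = B.T @ X @ B          # separable 2-D projection of one block
    return feats

if __name__ == "__main__":
    img = np.random.default_rng(0).random((32, 32))
    print(block_features(img, 8).shape)        # (25, 25, 8, 8)
```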
This study offers numerical simulation results, using the ABAQUS/CAE 2019 finite element application, to examine the performance and residual strength of eight recycled-aggregate RC one-way slabs, six of which were strengthened with NSM CFRP plates, in order to study the impact of several parameters on their structural behavior. The experimental results of four selected slabs under monotonic load, plus one slab under repeated load, were validated numerically. The numerical analysis was then extended to investigate different parameters, such as the impact of the added CFRP length on ultimate load capacity and load-deflection response, and the impact of the concrete compressive strength on the structural performance of …