In combinatorial testing, constructing covering arrays is the key challenge because of the many factors that influence it. A wide range of combinatorial problems can be solved using metaheuristic and greedy techniques. Combining a greedy technique with a metaheuristic search technique such as hill climbing (HC) can produce feasible results for combinatorial tests: the greedy strategy eliminates redundancy, the metaheuristic then handles any tuples left uncovered, and the final result is driven toward near-optimality. Consequently, using both greedy and HC algorithms in a single test generation system is a strong candidate approach if constructed correctly. This study presents a hybrid greedy hill climbing algorithm (HGHC) that ensures both effectiveness and near-optimal results while generating a small number of test cases. To verify that the proposed HGHC outperforms the most widely used techniques in terms of test size, it is compared against them. In contrast to recent methods for producing covering arrays (CAs) and mixed covering arrays (MCAs), this hybrid strategy is superior in that it yields smaller arrays while limiting the loss of unique pairings during CA/MCA generation.
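To make the two-phase idea concrete, the following is a minimal sketch of a greedy-plus-hill-climbing pairwise generator; it is not the paper's exact HGHC. A greedy phase adds the sampled row covering the most uncovered pairs, and a hill-climbing phase then tries to drop rows and repair coverage with single-cell mutations. The candidate budget, move limit, and example levels are illustrative assumptions.

```python
import itertools
import random

def uncovered_pairs(tests, levels):
    """All (col_i, val_i, col_j, val_j) 2-way tuples not covered by `tests`."""
    k = len(levels)
    pairs = {(i, vi, j, vj)
             for i, j in itertools.combinations(range(k), 2)
             for vi in range(levels[i]) for vj in range(levels[j])}
    for t in tests:
        for i, j in itertools.combinations(range(k), 2):
            pairs.discard((i, t[i], j, t[j]))
    return pairs

def greedy_phase(levels, candidates=50):
    """AETG-style greedy: repeatedly add the sampled row covering the most pairs."""
    k, tests = len(levels), []
    missing = uncovered_pairs(tests, levels)
    while missing:
        pool = tuple(missing)
        best, best_gain = None, -1
        for _ in range(candidates):
            i, vi, j, vj = random.choice(pool)   # seed so gain is at least 1
            row = [random.randrange(levels[c]) for c in range(k)]
            row[i], row[j] = vi, vj
            gain = sum(row[a] == va and row[b] == vb for a, va, b, vb in missing)
            if gain > best_gain:
                best, best_gain = row, gain
        tests.append(best)
        missing = uncovered_pairs(tests, levels)
    return tests

def hill_climb(tests, levels, moves=3000):
    """Drop a row, then accept non-worsening single-cell mutations until full
    pair coverage is restored; keep the smaller array on success."""
    k = len(levels)
    while len(tests) > 1:
        trial = [row[:] for row in tests]
        trial.pop(random.randrange(len(trial)))
        missing = uncovered_pairs(trial, levels)
        for _ in range(moves):
            if not missing:
                break
            r, c = random.randrange(len(trial)), random.randrange(k)
            old = trial[r][c]
            trial[r][c] = random.randrange(levels[c])
            now = uncovered_pairs(trial, levels)
            if len(now) <= len(missing):
                missing = now          # accept non-worsening move
            else:
                trial[r][c] = old      # revert worsening move
        if missing:
            return tests               # could not repair; stop shrinking
        tests = trial
    return tests

levels = [3, 3, 2, 2]                  # a small mixed-level (MCA-style) example
ca = hill_climb(greedy_phase(levels), levels)
print(len(ca), "rows:", ca)
```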
Stereolithography (SLA) has become an essential photocuring 3D printing process for producing parts of complex shapes from photosensitive resin exposed to UV light. The selection of the best printing parameters for good accuracy and surface quality can be further complicated by the geometric complexity of the models. This work introduces multiobjective optimization of SLA printing of 3D dental bridges based on simple CAD objects. The effect of the best combination of a low-cost resin 3D printer's machine parameter settings, namely normal exposure time, bottom exposure time, and number of bottom layers, on reducing dimensional deviation and surface roughness was studied. A multiobjective optimization method was utilized, combining the Taguchi method …
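As a hedged illustration of the optimization step, the sketch below computes the smaller-the-better Taguchi signal-to-noise ratio, S/N = -10 log10(mean(y^2)), for the two responses and combines them with assumed equal weights; the replicate values and weights are hypothetical, not data from this study.

```python
import numpy as np

def sn_smaller_better(y):
    """Smaller-the-better Taguchi signal-to-noise ratio, in dB."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(y ** 2))

# hypothetical replicate measurements for one experimental run
deviation = [0.12, 0.15, 0.11]   # dimensional deviation, mm
roughness = [2.4, 2.7, 2.5]      # surface roughness Ra, micrometres

# equal weights assumed; a higher combined S/N means a better parameter setting
combined = 0.5 * sn_smaller_better(deviation) + 0.5 * sn_smaller_better(roughness)
print(f"combined S/N = {combined:.2f} dB")
```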
Background: Various fluids in the oral environment can affect the surface roughness of resin composites. This in vitro study was conducted to determine the influence of mouth rinses on the surface roughness of two methacrylate-based resin composites (a nanofilled and a packable composite) and a silorane-based resin composite.
Materials and methods: Disc-shaped specimens (12 mm in diameter and 2 mm in height) were prepared from three types of composites …
This paper compares the accuracy of HF propagation prediction programs for HF circuit links between Iraq and different points worldwide during August 2018, when solar cycle 24 (which began in 2009 and ended in 2020) was at minimum activity, and also identifies the best communication mode to use. Prediction programs such as the Voice of America Coverage Analysis Program (VOACAP) and ITU Recommendation RS 533 (REC533) were used to generate HF circuit link parameters such as the Maximum Usable Frequency (MUF) and the Frequency of Transmission (FOT). Based on the predicted parameters, real radio contacts were made using an Icom IC-7100 transceiver with 100 W RF …
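For context, HF planning tools commonly relate the two predicted parameters by taking the FOT as roughly 85% of the MUF; the sketch below applies this rule of thumb. The 0.85 factor is the conventional assumption, not a value from this study.

```python
def fot_from_muf(muf_mhz: float, factor: float = 0.85) -> float:
    """Estimate the frequency of transmission (FOT) from a predicted MUF."""
    return factor * muf_mhz

print(fot_from_muf(21.0))   # a 21 MHz MUF gives a ~17.9 MHz FOT
```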
Face recognition is a crucial biometric technology used in various security and identification applications. Ensuring accuracy and reliability in facial recognition systems requires robust feature extraction and secure processing methods. This study presents an accurate facial recognition model using a feature extraction approach within a cloud environment. First, the facial images undergo preprocessing, including grayscale conversion, histogram equalization, Viola-Jones face detection, and resizing. Then, features are extracted using a hybrid approach that combines Linear Discriminant Analysis (LDA) and the Gray-Level Co-occurrence Matrix (GLCM). The extracted features are encrypted using the Data Encryption Standard (DES) for security …
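A minimal sketch of such a pipeline is shown below, assuming OpenCV for preprocessing and Viola-Jones detection, scikit-image for GLCM statistics, scikit-learn for LDA, and PyCryptodome for DES. The synthetic gallery, 64x64 face size, and 8-byte demo key are placeholders, not the paper's settings.

```python
import numpy as np
import cv2
from skimage.feature import graycomatrix, graycoprops
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from Crypto.Cipher import DES            # pycryptodome
from Crypto.Util.Padding import pad

def preprocess(img_bgr):
    """Grayscale -> histogram equalization -> Viola-Jones crop -> resize."""
    gray = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.equalizeHist(gray)
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    boxes = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(boxes):                       # fall back to the whole image otherwise
        x, y, w, h = boxes[0]
        gray = gray[y:y + h, x:x + w]
    return cv2.resize(gray, (64, 64))

def glcm_features(face):
    """Texture statistics from a gray-level co-occurrence matrix."""
    glcm = graycomatrix(face, distances=[1], angles=[0],
                        levels=256, symmetric=True, normed=True)
    props = ["contrast", "homogeneity", "energy", "correlation"]
    return np.array([graycoprops(glcm, p)[0, 0] for p in props])

# demo on synthetic data, a stand-in for a real labelled face gallery
rng = np.random.default_rng(0)
images = rng.integers(0, 256, size=(20, 80, 80, 3), dtype=np.uint8)
labels = np.repeat(np.arange(4), 5)

texture = np.array([glcm_features(preprocess(im)) for im in images])
pixels = np.array([preprocess(im).ravel() for im in images], dtype=float)

# LDA projects raw pixels to at most (n_classes - 1) discriminative axes;
# the hybrid feature vector concatenates the LDA projection with GLCM stats
lda = LinearDiscriminantAnalysis(n_components=3).fit(pixels, labels)
features = np.hstack([lda.transform(pixels), texture])

# DES-encrypt the serialized feature vectors (hypothetical 8-byte demo key)
cipher = DES.new(b"8bytekey", DES.MODE_ECB)
encrypted = cipher.encrypt(pad(features.astype(np.float32).tobytes(), 8))
print(features.shape, len(encrypted))
```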
Abstract
Bivariate time series modeling and forecasting have become a promising field of applied study in recent times. For this purpose, the linear Autoregressive Moving Average with exogenous variables (ARMAX) model has been the most widely used technique over the past few years for modeling and forecasting this type of data. The most important assumptions of this model are linearity and homogeneity of the random error variance. In practice, these two assumptions are often violated, so the Autoregressive Conditional Heteroscedasticity (ARCH) and Generalized ARCH (GARCH) models with exogenous variables …
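To illustrate the two-stage idea, the sketch below fits an ARMAX-style model with an exogenous regressor and then a GARCH(1,1) model on its residuals, assuming statsmodels and the arch package; the data are simulated and the orders and coefficients are arbitrary, not this paper's specification.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from arch import arch_model            # pip install arch

rng = np.random.default_rng(1)
n = 500
exog = rng.normal(size=n)

# simulate an AR(1) series driven by the exogenous variable, with
# GARCH(1,1)-style errors so the homogeneity assumption is violated
eps, sigma2 = np.zeros(n), np.ones(n)
shock = rng.normal(size=n)
for t in range(1, n):
    sigma2[t] = 0.1 + 0.2 * eps[t - 1] ** 2 + 0.7 * sigma2[t - 1]
    eps[t] = np.sqrt(sigma2[t]) * shock[t]
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.5 * y[t - 1] + 0.8 * exog[t] + eps[t]

# stage 1: ARMAX(1,1) for the conditional mean
armax = ARIMA(y, exog=exog, order=(1, 0, 1)).fit()
# stage 2: GARCH(1,1) for the conditional variance of the residuals
garch = arch_model(armax.resid, mean="Zero", vol="GARCH", p=1, q=1).fit(disp="off")
print(armax.params.round(3))
print(garch.params.round(3))
```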
Factor analysis is an advanced statistical method used in many fields, including physical education, to analyze the results of tests and measurements and to reduce the correlations among the common variables that form a phenomenon to a smaller number of explanatory factors; achieving this depends on the communality (self-consistency) estimates obtained from the analysis. The goal of this research is to use estimated communality values to choose the best rotation method from among the orthogonal and oblique approaches in physical education factor studies. To this end, a sample of references was selected from master's and doctoral theses as well as scientific journals and conferences …
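As a hedged illustration, the sketch below fits a two-factor model under one orthogonal and one oblique rotation and prints the communalities on which such a comparison can be based; it assumes the factor_analyzer package and toy data, not this study's sample.

```python
import numpy as np
from factor_analyzer import FactorAnalyzer   # pip install factor-analyzer

rng = np.random.default_rng(2)
# toy data: 200 athletes x 8 physical test items driven by 2 latent abilities
latent = rng.normal(size=(200, 2))
load = rng.uniform(0.4, 0.9, size=(8, 2))
X = latent @ load.T + 0.5 * rng.normal(size=(200, 8))

for rotation in ("varimax", "oblimin"):      # orthogonal vs. oblique rotation
    fa = FactorAnalyzer(n_factors=2, rotation=rotation)
    fa.fit(X)
    # communalities: the share of each item's variance explained by the factors
    print(rotation, np.round(fa.get_communalities(), 2))
```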
Data-centric techniques, such as data aggregation via a modified fuzzy clustering algorithm combined with a Voronoi diagram, called the modified Voronoi Fuzzy Clustering Algorithm (VFCA), are presented in this paper. In the modified algorithm, the sensed area is divided into a number of Voronoi cells by applying a Voronoi diagram, and these cells are clustered by the fuzzy C-means (FCM) method to reduce the transmission distance. An appropriate cluster head (CH) for each cluster is then elected using three parameters: energy, the distance between the CH and its neighboring sensors, and packet-loss values. Furthermore, data aggregation is employed in each CH to reduce the amount of data transmission, which leads …
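The sketch below illustrates the clustering and CH-election idea, not the exact VFCA: it assumes scikit-fuzzy for FCM, the node positions, energies, loss rates, and election weights are synthetic assumptions, and the Voronoi partition step is omitted for brevity.

```python
import numpy as np
import skfuzzy as fuzz                 # pip install scikit-fuzzy

rng = np.random.default_rng(3)
nodes = rng.uniform(0, 100, size=(60, 2))   # sensor positions in a 100x100 field
energy = rng.uniform(0.5, 1.0, size=60)     # residual energy per node
loss = rng.uniform(0.0, 0.2, size=60)       # observed packet-loss rate per node

# fuzzy C-means over node positions (data must be shaped features x samples)
cntr, u, *_ = fuzz.cluster.cmeans(nodes.T, c=4, m=2.0, error=1e-4, maxiter=200)
member = np.argmax(u, axis=0)               # hard cluster assignment per node

# elect one CH per cluster: favour high energy, small distance to the cluster
# centre, and low packet loss (the 0.5/0.3/0.2 weights are assumptions)
for c in range(4):
    idx = np.flatnonzero(member == c)
    if idx.size == 0:
        continue
    dist = np.linalg.norm(nodes[idx] - cntr[c], axis=1)
    score = 0.5 * energy[idx] - 0.3 * dist / (dist.max() + 1e-9) - 0.2 * loss[idx]
    print(f"cluster {c}: head = node {idx[np.argmax(score)]}")
```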
Identification of complex communities in biological networks is a critical and ongoing challenge, since many network-related problems correspond to the subgraph isomorphism problem, known in the literature to be NP-hard. Several optimization algorithms have been devised and applied to solve this problem. The main challenge in applying optimization algorithms, specifically to large-scale complex networks, is their relatively long execution time. Thus, this paper proposes a parallel extension of the PSO algorithm to detect communities in complex biological networks. The main contribution of this study is threefold: first, a modified PSO algorithm with a local search operator is proposed …
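A minimal parallel-PSO sketch follows, assuming networkx for the graph and modularity as the fitness function; the discrete move operator, mutation rate, swarm size, and the karate-club graph are illustrative stand-ins for the paper's operators and biological networks.

```python
import multiprocessing as mp
import numpy as np
import networkx as nx
from networkx.algorithms.community import modularity

G = nx.karate_club_graph()             # stand-in for a biological network
N, K, SWARM = G.number_of_nodes(), 4, 12

def fitness(labels):
    """Modularity of the partition encoded by a node-label vector."""
    groups = [set(np.flatnonzero(labels == k)) for k in range(K)]
    return modularity(G, [g for g in groups if g])

def move(args):
    """Discrete PSO-style step: copy label entries from the personal and
    global bests with fixed probabilities, then apply a small mutation."""
    labels, pbest, gbest, seed = args
    r = np.random.default_rng(seed)
    new = labels.copy()
    mask_p = r.random(N) < 0.3
    new[mask_p] = pbest[mask_p]
    mask_g = r.random(N) < 0.3
    new[mask_g] = gbest[mask_g]
    flip = r.random(N) < 0.05          # local-search style mutation
    new[flip] = r.integers(0, K, flip.sum())
    return new

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    swarm = [rng.integers(0, K, N) for _ in range(SWARM)]
    pbest = [s.copy() for s in swarm]
    pfit = [fitness(s) for s in swarm]
    gbest = pbest[int(np.argmax(pfit))].copy()
    with mp.Pool() as pool:
        for it in range(30):
            args = [(s, pb, gbest, it * SWARM + i)
                    for i, (s, pb) in enumerate(zip(swarm, pbest))]
            swarm = pool.map(move, args)       # position updates in parallel
            fits = pool.map(fitness, swarm)    # fitness evaluation in parallel
            for i, f in enumerate(fits):
                if f > pfit[i]:
                    pfit[i], pbest[i] = f, swarm[i].copy()
            gbest = pbest[int(np.argmax(pfit))].copy()
    print("best modularity:", round(max(pfit), 3))
```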
The rapid and enormous growth of the Internet of Things, as well as its widespread adoption, has resulted in massive quantities of data that must be processed and sent to the cloud. The delay in processing these data and the time required to send them to the cloud have led to the emergence of fog computing, a new generation of cloud in which the fog serves as an extension of cloud services at the edge of the network, reducing latency and traffic. Distributing computational resources so as to minimize makespan and running costs is one of the main challenges of fog computing. This paper provides a new approach for improving the task scheduling problem in a cloud-fog environment …
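To make the objective concrete, the sketch below shows a simple greedy heuristic, not the paper's approach: each task goes to the fog or cloud node that minimizes a weighted sum of the resulting finish time (makespan pressure) and execution cost. All rates, costs, and weights are invented.

```python
import numpy as np

rng = np.random.default_rng(5)
tasks = rng.uniform(5.0, 50.0, size=30)    # task lengths (e.g. million instructions)
speed = np.array([10.0, 12.0, 8.0, 40.0])  # three fog nodes and one faster cloud node
cost = np.array([0.20, 0.25, 0.15, 1.00])  # execution cost per unit of task length
ready = np.zeros(len(speed))               # time at which each node becomes free
W = 0.7                                    # weight on makespan vs. running cost

total_cost = 0.0
for length in sorted(tasks, reverse=True):  # longest-task-first heuristic
    finish = ready + length / speed         # finish time on each candidate node
    node = int(np.argmin(W * finish + (1 - W) * cost * length))
    ready[node] = finish[node]
    total_cost += cost[node] * length

print(f"makespan = {ready.max():.1f}, total cost = {total_cost:.1f}")
```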
The aim of this study is to develop a novel framework for managing risks in smart supply chains by enhancing business continuity and resilience against potential disruptions. This research addresses the growing uncertainty in supply chain environments, driven by natural phenomena, such as pandemics and earthquakes, and by human-induced events, including wars, political upheavals, and societal transformations. Recognizing that traditional risk management approaches are insufficient in such dynamic contexts, the study proposes an adaptive framework that integrates proactive and remedial measures for effective risk mitigation. A fuzzy risk matrix is employed to assess and analyze uncertainties, facilitating the identification of disruptions …
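As a hedged illustration of the fuzzy risk-matrix step, the sketch below fuzzifies a likelihood and an impact rating with triangular membership sets and reports the strongest matrix cell; the 0-10 scales, set parameters, and sample ratings are assumptions, not this framework's calibration.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership of x in the fuzzy set (a, b, c)."""
    left = (x - a) / (b - a) if b > a else 1.0
    right = (c - x) / (c - b) if c > b else 1.0
    return float(max(min(left, right, 1.0), 0.0))

# fuzzy sets on an assumed 0-10 scale for both likelihood and impact
sets = {"low": (0, 0, 5), "medium": (2, 5, 8), "high": (5, 10, 10)}

likelihood, impact = 6.5, 8.0   # hypothetical assessment of one disruption
cells = {(lk, ik): min(tri(likelihood, *lv), tri(impact, *iv))
         for lk, lv in sets.items() for ik, iv in sets.items()}
(lk, ik), strength = max(cells.items(), key=lambda kv: kv[1])
print(f"strongest cell: likelihood={lk}, impact={ik} (membership {strength:.2f})")
```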