In today's world, the science of bioinformatics is developing rapidly, especially with regard to the analysis and study of biological networks. Scientists have used various nature-inspired algorithms to find protein complexes in protein-protein interaction (PPI) networks. These networks help scientists predict the molecular function of unknown proteins and reveal how cells are organized. In PPI networks it is very common for a protein to participate in multiple functions and belong to many complexes, and as a result complexes may overlap. However, developing an efficient and reliable method for detecting overlapping protein complexes remains a challenge, since it is a hard optimization problem. One of the main difficulties in identifying overlapping protein complexes is the accuracy of the partitioning results. To accurately identify the overlapping structure of protein complexes, this paper proposes an overlapping complex detection algorithm termed OCDPSO-Net, which is based on PSO-Net (a well-known modified version of the particle swarm optimization algorithm). The OCDPSO-Net framework consists of three main steps: an initialization strategy, a movement strategy for each particle, and a search-enhancement step that expands the solution space. The proposed algorithm employs the partition density concept to measure partitioning quality in PPI network complexes and optimizes this quantity by working on the line graph of the original graph representing the protein interaction network. The OCDPSO-Net algorithm is applied to the Collins PPI network and the obtained results are compared with different state-of-the-art algorithms in terms of precision, recall, and F-measure. Experimental results confirm that the proposed algorithm has good clustering performance and outperforms most existing recent overlapping-complex detection algorithms.
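As a point of reference for the fitness measure named in this abstract, the sketch below computes the standard partition density of a set of link (edge) communities, the quantity OCDPSO-Net is described as optimizing on the line graph. The toy communities and the exact way the measure is wired into OCDPSO-Net are assumptions for illustration only.

```python
# Minimal sketch: partition density of a link (edge) partition, the standard
# link-community quality measure; its use here as the OCDPSO-Net fitness is
# an assumption based on the abstract, not the paper's exact implementation.
def partition_density(edge_communities):
    """edge_communities: dict mapping community id -> list of edges (u, v)."""
    M = sum(len(edges) for edges in edge_communities.values())  # total edges
    if M == 0:
        return 0.0
    total = 0.0
    for edges in edge_communities.values():
        m_c = len(edges)                                   # edges in this community
        n_c = len({node for e in edges for node in e})     # nodes they touch
        if n_c > 2:
            total += m_c * (m_c - n_c + 1) / ((n_c - 2) * (n_c - 1))
    return 2.0 * total / M

# Toy example: two overlapping link communities in a small PPI-like graph
communities = {
    0: [("A", "B"), ("B", "C"), ("A", "C")],
    1: [("C", "D"), ("D", "E"), ("C", "E"), ("D", "F")],
}
print(round(partition_density(communities), 3))
```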
Conclusion The observation of the structural evolution of the international system and its instability in a particular situation, through its transition from multipolarity to bipolarity and then to the unipolarity led by the United States in the early 1990s, and up to the present moment, leads to the conclusion that the structure by which the hierarchy of superpowers directs the system, in terms of the various capacities that qualify it and with the consent of the rest of the states to lead and occupy the world's first place, has no direct relation to the stability of this system; rather, other factors are more influential in its stability. The structure of the new international order will be completely different in terms of the r
This study proposes a control system for regulating the electron lens resistance in order to obtain stabilized electron lens power. The study lays out the fundamental challenges, conceptual design arrangements, and development conditions for the Integrable Optics Test Accelerator (IOTA) in progress at Fermilab. An effective automatic gain control (AGC) unit is introduced that prevents fluctuations in the internal resistance of the electron lens, caused by environmental influences, from affecting the system's current and power values, keeping them at stable levels. Using this unit yields a balanced, stable system that is unaffected by environmental variations around the electron lens.
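As an illustration of the kind of feedback loop such an AGC unit implements, the sketch below holds the electron-lens power near a setpoint while the lens resistance drifts. The nominal values, the proportional correction rule, and the drift model are assumptions; this is not the IOTA control code.

```python
import random

def simulate_agc(target_power=100.0, steps=200, gain=0.05):
    """Keep P = I^2 * R near target_power while the lens resistance R drifts."""
    resistance = 10.0                                  # nominal lens resistance (ohms, assumed)
    current = (target_power / resistance) ** 0.5       # initial drive current
    for _ in range(steps):
        resistance += random.uniform(-0.05, 0.05)      # environmental drift (assumed model)
        power = current ** 2 * resistance              # measured power
        error = target_power - power
        # proportional correction applied to the drive current (relative form)
        current += gain * (error / max(power, 1e-9)) * current
    return power

print(round(simulate_agc(), 2))
```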
In this study, the optimum conditions for COD removal from petroleum refinery wastewater using a combined electrocoagulation-electro-oxidation system were determined by the Taguchi method. An orthogonal array experimental design (L18) with four controllable parameters, namely NaCl concentration, current density (C.D.), pH, and electrolysis time, was employed. The chemical oxygen demand (COD) removal percentage was taken as the quality characteristic to be enhanced. The values of turbidity and total dissolved solids (TDS) were also estimated. The optimum levels of the studied parameters were determined precisely by implementing S/N analysis and analysis of variance (ANOVA). The optimum conditions were found to be NaCl = 2.5
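For reference, the larger-the-better signal-to-noise ratio used in this kind of Taguchi analysis (COD removal percentage is a larger-the-better response) can be computed as in the sketch below; the replicate values shown are hypothetical placeholders, not the study's data.

```python
import math

def sn_larger_is_better(responses):
    """Taguchi S/N ratio for a larger-the-better response (e.g. COD removal %):
    S/N = -10 * log10( mean(1 / y^2) )."""
    return -10.0 * math.log10(sum(1.0 / y**2 for y in responses) / len(responses))

# Hypothetical replicate COD removal percentages for one L18 trial
print(round(sn_larger_is_better([78.5, 81.2, 80.0]), 2))
```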
This paper describes a new finishing process using newly made magnetic abrasives to effectively finish brass plate, which is very difficult to polish by conventional machining processes. The Taguchi experimental design method was adopted to evaluate the effect of the process parameters on the improvement of surface roughness and hardness by magnetic abrasive polishing. The process parameters are the applied current to the inductor, the working gap between the workpiece and the inductor, the rotational speed, and the volume of powder. Analysis of variance (ANOVA) was performed using statistical software to identify the optimal conditions for better surface roughness and hardness. Regression models based on statistical m
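As a hedged illustration of the regression-modelling step mentioned in this abstract, the sketch below fits an ordinary least-squares model of surface roughness against the four process parameters. The design points and response values are made up for demonstration and are not the paper's data.

```python
import numpy as np

# Hypothetical design points: current (A), working gap (mm), speed (rpm), powder volume (g)
X = np.array([
    [2.0, 1.0, 200, 2], [2.0, 1.5, 300, 4], [2.0, 2.0, 400, 6],
    [4.0, 1.0, 300, 6], [4.0, 1.5, 400, 2], [4.0, 2.0, 200, 4],
    [6.0, 1.0, 400, 4], [6.0, 1.5, 200, 6], [6.0, 2.0, 300, 2],
])
y = np.array([1.10, 0.95, 0.80, 0.90, 0.85, 1.00, 0.70, 0.95, 1.05])  # made-up Ra responses (um)

A = np.column_stack([np.ones(len(X)), X])        # add intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)     # ordinary least squares fit
print("intercept and factor coefficients:", np.round(coef, 4))
```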
This research proposes the application of the dragonfly and fruit fly algorithms to enhance estimates generated by the Fama-MacBeth model and compares their performance in this context for the first time. To specifically improve the dragonfly algorithm's effectiveness, three parameter tuning approaches are investigated: manual parameter tuning (MPT), adaptive tuning by methodology (ATY), and a novel technique called adaptive tuning by performance (APT). Additionally, the study evaluates the estimation performance using kernel weighted regression (KWR) and explores how the dragonfly and fruit fly algorithms can be employed to enhance KWR. All methods are tested using data from the Iraq Stock Exchange, based on the Fama-French three-f
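The sketch below illustrates kernel weighted regression (KWR) in its common Nadaraya-Watson form with a Gaussian kernel. The idea that the dragonfly or fruit fly algorithm would tune, for example, the bandwidth is an assumption consistent with the abstract, and the factor/return data are invented.

```python
import numpy as np

def kernel_weighted_regression(x_train, y_train, x_query, bandwidth=1.0):
    """Nadaraya-Watson kernel weighted regression with a Gaussian kernel.
    The bandwidth is the parameter a metaheuristic could tune (assumption)."""
    x_train = np.asarray(x_train, dtype=float)
    y_train = np.asarray(y_train, dtype=float)
    preds = []
    for x0 in np.atleast_1d(x_query):
        w = np.exp(-0.5 * ((x_train - x0) / bandwidth) ** 2)   # Gaussian weights
        preds.append(np.sum(w * y_train) / np.sum(w))          # weighted local average
    return np.array(preds)

# Toy example: excess returns vs. a single risk factor (made-up data)
factor = np.array([0.5, 1.0, 1.5, 2.0, 2.5])
returns = np.array([0.8, 1.1, 1.7, 2.1, 2.6])
print(np.round(kernel_weighted_regression(factor, returns, [1.2, 1.8], 0.5), 3))
```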
The research aims to study and analyze concurrent engineering (CE) and cost optimization (CO), the use of concurrent engineering inputs and outputs to improve cost, and the role of concurrent engineering in improving product quality, achieving savings in design, manufacturing, and assembly time, and reducing costs. It also employs models to determine how much time is saved, including the Lexmark model and the PERT model, to quantify the savings in design-for-manufacturing and assembly time.
To achieve the research objectives, the General Company for Electrical and Electronic Industries \ Refrigerated Engine
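As a reference for the PERT model mentioned above, the sketch below computes the classic PERT expected activity time and variance from optimistic, most-likely, and pessimistic estimates; the activity times used are hypothetical.

```python
def pert_expected_time(optimistic, most_likely, pessimistic):
    """Classic PERT estimate: E = (O + 4M + P) / 6, variance = ((P - O) / 6)^2."""
    expected = (optimistic + 4 * most_likely + pessimistic) / 6
    variance = ((pessimistic - optimistic) / 6) ** 2
    return expected, variance

# Hypothetical design-for-assembly activity times (days)
e, v = pert_expected_time(4, 6, 10)
print(f"expected = {e:.2f} days, variance = {v:.2f}")
```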
Abstract
This study investigated the optimization of the wear behavior of AISI 4340 steel based on the Taguchi method under various testing conditions. In this paper, a neural network and the Taguchi design method have been implemented to minimize the wear rate in 4340 steel. A back-propagation neural network (BPNN) was developed to predict the wear rate. In developing the predictive model, wear parameters such as sliding speed, applied load, and sliding distance were considered as the input variables for AISI 4340 steel. Analysis of variance (ANOVA) was used to determine the significant parameters affecting the wear rate. Finally, the Taguchi approach was applied to determine
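A minimal sketch of a back-propagation neural network of the kind described, mapping sliding speed, applied load, and sliding distance to a wear rate, is given below. The synthetic data, network size, and learning rate are assumptions; this is illustrative only, not the authors' trained BPNN.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic inputs: sliding speed (m/s), load (N), distance (m); synthetic wear-rate targets
X = rng.uniform([1, 10, 500], [3, 50, 2000], size=(20, 3))
y = (0.002 * X[:, 0] + 0.0005 * X[:, 1] + 1e-6 * X[:, 2]).reshape(-1, 1)

Xn = (X - X.mean(axis=0)) / X.std(axis=0)                    # normalize inputs

W1 = rng.normal(0, 0.5, (3, 6)); b1 = np.zeros(6)            # input -> hidden layer
W2 = rng.normal(0, 0.5, (6, 1)); b2 = np.zeros(1)            # hidden -> output layer
lr = 0.05

for _ in range(2000):
    h = np.tanh(Xn @ W1 + b1)                                # forward pass
    pred = h @ W2 + b2
    err = pred - y
    # back-propagation of the mean-squared-error gradient
    gW2 = h.T @ err / len(X);   gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h**2)
    gW1 = Xn.T @ dh / len(X);   gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

print("training MSE:", round(float((err**2).mean()), 6))
```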
Traditionally, path selection within routing is formulated as a shortest-path optimization problem. The objective function for the optimization could be any one of a variety of parameters, such as number of hops, delay, or cost. The problem of least-cost delay-constrained routing is studied in this paper, since a delay constraint is a very common requirement of many multimedia applications and cost minimization captures the need to distribute the network load efficiently. An iterative algorithm is therefore proposed in this paper to solve this problem. The results of applying this algorithm show that it yields the optimal path (optimal solution) from among multiple feasible paths (feasible solutions).
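For context, one standard iterative approach to least-cost delay-constrained routing is Lagrangian relaxation (LARAC-style), sketched below with networkx: shortest-path searches are run on the combined weight cost + k·delay and k is adjusted until the cheapest delay-feasible path is found. This illustrates the problem class; it is not necessarily the paper's algorithm.

```python
import networkx as nx

def path_sum(G, path, attr):
    return sum(G[u][v][attr] for u, v in zip(path, path[1:]))

def larac(G, s, t, delay_bound, max_iter=50):
    """Lagrangian-relaxation sketch for least-cost delay-constrained routing."""
    pc = nx.shortest_path(G, s, t, weight="cost")            # cheapest path
    if path_sum(G, pc, "delay") <= delay_bound:
        return pc
    pd = nx.shortest_path(G, s, t, weight="delay")           # lowest-delay path
    if path_sum(G, pd, "delay") > delay_bound:
        return None                                          # no feasible path exists
    for _ in range(max_iter):
        lam = ((path_sum(G, pc, "cost") - path_sum(G, pd, "cost")) /
               (path_sum(G, pd, "delay") - path_sum(G, pc, "delay")))
        pr = nx.shortest_path(G, s, t,
                              weight=lambda u, v, d: d["cost"] + lam * d["delay"])
        agg = lambda p: path_sum(G, p, "cost") + lam * path_sum(G, p, "delay")
        if abs(agg(pr) - agg(pc)) < 1e-9:                    # no further improvement
            return pd
        if path_sum(G, pr, "delay") <= delay_bound:
            pd = pr                                          # better feasible path
        else:
            pc = pr                                          # better infeasible bound
    return pd

# Toy network: cheap-but-slow, fast-but-costly, and a compromise route
G = nx.Graph()
G.add_edge("A", "B", cost=1, delay=5); G.add_edge("B", "D", cost=1, delay=5)
G.add_edge("A", "C", cost=3, delay=2); G.add_edge("C", "D", cost=3, delay=2)
G.add_edge("A", "D", cost=10, delay=1)
print(larac(G, "A", "D", delay_bound=6))   # -> ['A', 'C', 'D']
```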
The application of test case prioritization is a key part of system testing, intended to detect and sort out issues early in the development stage. Traditional prioritization techniques frequently fail to take into account the complexities of large-scale test suites, evolving systems, and time constraints, and therefore cannot fully solve this problem. The study proposed here deals with a hybrid meta-heuristic method that focuses on addressing these challenges. The strategy combines genetic algorithms with the black hole algorithm to create a smooth trade-off between exploring numerous possibilities and exploiting the best one. The proposed hybrid genetic black hole (HGBH) algorithm uses the
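A hedged sketch of a hybrid genetic/black-hole style prioritization loop is given below: candidate orderings are encoded as random keys, evaluated by APFD (average percentage of faults detected), and evolved with uniform crossover, mutation, and an attraction step toward the best candidate (the "black hole"). The fault matrix, parameters, and operator mix are assumptions for illustration, not the exact HGBH method.

```python
import random

FAULT_MATRIX = [                      # rows: tests, cols: faults (made-up data)
    [1, 0, 0, 1], [0, 1, 0, 0], [1, 1, 0, 0], [0, 0, 1, 1], [0, 0, 1, 0],
]

def decode(keys):                     # smaller key -> earlier in the test order
    return sorted(range(len(keys)), key=lambda i: keys[i])

def apfd(order):
    """Average percentage of faults detected for a test ordering."""
    n, m = len(order), len(FAULT_MATRIX[0])
    first = [next(pos + 1 for pos, t in enumerate(order) if FAULT_MATRIX[t][f])
             for f in range(m)]
    return 1 - sum(first) / (n * m) + 1 / (2 * n)

def hgbh(pop_size=20, generations=50):
    pop = [[random.random() for _ in FAULT_MATRIX] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: apfd(decode(ind)), reverse=True)
        best = pop[0]                                 # the "black hole"
        new_pop = [best[:]]                           # elitism
        while len(new_pop) < pop_size:
            a, b = random.sample(pop[:10], 2)         # selection from the fitter half
            child = [random.choice(g) for g in zip(a, b)]                      # uniform crossover
            child = [c + random.random() * (h - c) for c, h in zip(child, best)]  # attraction step
            if random.random() < 0.2:                 # mutation
                child[random.randrange(len(child))] = random.random()
            new_pop.append(child)
        pop = new_pop
    return decode(pop[0]), apfd(decode(pop[0]))

print(hgbh())
```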
Feed pellets, whether imported or locally manufactured, need to be controlled and regulated because a percentage of cracking and crumbling occurs during transport and distribution to animals using conveyors and mechanical feeders. This study aimed to determine the effect of particle size and of the die hole diameter of the machine on broiler feed pellet quality in terms of pellet durability, direct pellet measurements, pellet expansion, and pellet length. Three particle sizes (2, 4, and 6 mm) and three die hole diameters (3, 4, and 5 mm) were used. The results showed that changing the particle size from 2 to 4 and then to 6 mm led to a significant decrease in pellet durability and pellet lengths, pe