PC-based control is an approach to real-time control systems in which a selected manipulated variable is adjusted to accomplish the control objectives. Shell and tube heat exchangers have been identified as processes that are inherently nonlinear and hard to control because exact model descriptions are unavailable. A PC with an analogue input/output card is used as the controller that drives the heat exchanger hot stream to the desired temperature.
The control methodology uses a four-speed pump as the manipulated variable to cool the hot stream to the desired temperature.
In this work, the dynamics of a cross-flow shell and tube heat exchanger are modeled from step changes in the cold water flow rate (the manipulated variable). The model is identified as First Order Plus Dead Time (FOPDT).
The objective of this work is to design and implement a controller to regulate the outlet temperature of the hot water, which is taken as the controlled variable. A comparison of the designed PI controller with the PC-based controller (in terms of rise time, percentage overshoot, and settling time) shows that the PC-based controller performs well in controlling the system.
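The FOPDT identification-and-tuning workflow described above can be sketched as follows. The gain, time constant, and dead time values are illustrative placeholders, not the parameters identified in the paper, and the Ziegler-Nichols open-loop PI rule stands in for whatever tuning rule the authors applied.

```python
def fopdt_step(K, tau, theta, dt=0.01, t_end=10.0):
    """Open-loop unit-step response of a FOPDT model K*exp(-theta*s)/(tau*s + 1),
    simulated with forward Euler. K, tau, theta are illustrative values."""
    n = int(t_end / dt)
    y = [0.0] * n
    for k in range(1, n):
        t = k * dt
        u = 1.0 if t >= theta else 0.0   # unit step delayed by the dead time
        y[k] = y[k - 1] + dt * (K * u - y[k - 1]) / tau
    return y

def zn_pi(K, tau, theta):
    """Ziegler-Nichols open-loop PI tuning from the identified FOPDT parameters."""
    Kc = 0.9 * tau / (K * theta)   # proportional gain
    Ti = theta / 0.3               # integral time (about 3.33 * theta)
    return Kc, Ti
```

A step test like this is how the FOPDT parameters would be read off a real plant: the dead time is the interval before the output moves, and the gain is the final change divided by the step size.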
This paper focuses on orthogonal function approximation technique (FAT)-based adaptive backstepping control of a geared DC motor coupled with a rotational mechanical load. It is assumed that all actuator parameters are unknown, including the torque-current constant (i.e., an unknown input coefficient); hence a control system with three motor control modes is proposed: 1) motor torque control mode, 2) motor current control mode, and 3) motor voltage control mode. The proposed control algorithm is a powerful tool for controlling a dynamic system with an unknown input coefficient. Each uncertain parameter/term is represented by a linear combination of a weighting vector and an orthogonal basis function vector. Chebyshev polynomials are used
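The FAT representation mentioned above (each uncertain term written as a weighted sum of orthogonal basis functions) can be sketched with a Chebyshev basis. The weight values below are illustrative placeholders, and the adaptive law that would update them online is omitted.

```python
def cheb_basis(x, n):
    """First n Chebyshev polynomials of the first kind evaluated at x in [-1, 1],
    using the recurrence T0 = 1, T1 = x, T_{k+1} = 2x*T_k - T_{k-1}."""
    basis = [1.0, x][:n]
    for _ in range(n - 2):
        basis.append(2 * x * basis[-1] - basis[-2])
    return basis

def fat_approx(weights, x):
    """FAT-style approximation f(x) ~ w^T * phi(x). In the adaptive controller
    the weights would be updated online; here they are fixed placeholders."""
    phi = cheb_basis(x, len(weights))
    return sum(w * p for w, p in zip(weights, phi))
```

The controller never needs the true uncertain function, only the weight estimates, which is what makes the scheme usable when even the input coefficient is unknown.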
Companies compete greatly with each other today, so they must focus on innovation to develop their products and remain competitive. Lean product development is the ideal way to develop products, foster innovation, maximize value, and reduce time. Set-Based Concurrent Engineering (SBCE) is a proven lean product development mechanism that builds on the creation of a number of alternative designs at the subsystem level. These designs are improved and tested simultaneously, and the weaker options are removed gradually until the optimum solution is finally reached. SBCE implementations have been performed extensively in the automotive industry, and there are a few case studies in the aerospace industry. This research describes the use o
The research deals with an evolutionary-based mutation with functional annotation to identify protein complexes within PPI networks. Revealing complexes in protein interaction networks is a difficult and fundamental challenge, and an important field of research in computational biology. The complex detection models developed to tackle this challenge mostly depend on topological properties and rarely use the biological properties of PPI networks. This research aims to push the evolutionary algorithm to its maximum by employing gene ontology (GO) to relate proteins based on biological information similarity for directing genes. The outcomes show that the suggested method can be utilized to improve the
Artificial fish swarm algorithm (AFSA) is one of the important swarm intelligence algorithms. In this paper, the authors enhance AFSA with diversity operators (AFSA-DO). The diversity operators produce more diverse solutions for AFSA, allowing it to reach better ones. AFSA-DO has been used to solve flexible job shop scheduling problems (FJSSP). The FJSSP is a significant problem in the domain of optimization and operations research, and several research papers have dealt with methods of solving it, including swarm intelligence approaches. In this paper, a set of FJSSP target samples is tested with the improved algorithm to confirm its effectiveness and evaluate its ex
In combinatorial testing, the construction of covering arrays is the key challenge, owing to the multiple aspects that influence it. A wide range of combinatorial problems can be solved using metaheuristic and greedy techniques. Combining a greedy technique with a metaheuristic search technique such as hill climbing (HC) can produce feasible results for combinatorial tests. Metaheuristic methods deal with tuples that may be left uncovered after the greedy redundancy stage, so the final result is assured to be near-optimal. As a result, the use of both greedy and HC algorithms in a single test generation system is a good candidate if constructed correctly. T
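The greedy-plus-hill-climbing idea can be sketched for the pairwise (2-way) case as follows. The mutation move, acceptance rule, and iteration budget are illustrative choices, not the paper's exact strategy.

```python
import itertools, random

def uncovered_pairs(suite, levels):
    """All (column_i, value_i, column_j, value_j) pairs not yet covered by the suite."""
    need = set()
    for i, j in itertools.combinations(range(len(levels)), 2):
        for vi in range(levels[i]):
            for vj in range(levels[j]):
                need.add((i, vi, j, vj))
    for row in suite:
        for i, j in itertools.combinations(range(len(row)), 2):
            need.discard((i, row[i], j, row[j]))
    return need

def hill_climb_row(suite, levels, iters=200, seed=0):
    """Hill-climb a single new test row: start random, mutate one cell at a time,
    and keep a mutation if it covers at least as many missing pairs."""
    rng = random.Random(seed)
    missing = uncovered_pairs(suite, levels)
    def gain(row):
        return sum((i, row[i], j, row[j]) in missing
                   for i, j in itertools.combinations(range(len(row)), 2))
    row = [rng.randrange(l) for l in levels]
    best = gain(row)
    for _ in range(iters):
        cand = row[:]
        c = rng.randrange(len(levels))
        cand[c] = rng.randrange(levels[c])
        g = gain(cand)
        if g >= best:
            row, best = cand, g
    return row
```

Rows produced this way would be appended until `uncovered_pairs` is empty; a greedy constructor typically supplies the initial suite and HC cleans up the leftover tuples.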
Social media platforms are known as detectors that can be used to measure the activities of users in the real world. However, the huge and unfiltered feed of messages posted on social media triggers social warnings, particularly when these messages contain hate speech toward a specific individual or community. The negative effect of these messages on individuals or society at large is of great concern to governments and non-governmental organizations. Word clouds provide a simple and efficient means of visually conveying the most common words in text documents. This research aims to develop a word cloud model based on hateful words in online social media environments such as Google News. Several steps are involved, including data acq
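The frequency-extraction core of a word cloud can be sketched as follows. The tokenizer, stopword list, and weight scaling are illustrative choices, and the rendering step (placement, fonts, colors) is left to a drawing library.

```python
import re
from collections import Counter

# Small illustrative stopword list; a real pipeline would use a fuller one.
STOPWORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "is", "are"}

def word_cloud_weights(text, max_words=50):
    """Tokenize, drop stopwords, and map each remaining word to a relative
    weight in (0, 1] that a renderer can use as a font-size scale."""
    words = [w for w in re.findall(r"[a-z']+", text.lower()) if w not in STOPWORDS]
    counts = Counter(words).most_common(max_words)
    if not counts:
        return {}
    top = counts[0][1]
    return {w: c / top for w, c in counts}
```

For the hate-speech use case, the token list would additionally be filtered against a lexicon of hateful terms before counting.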
Governmental establishments maintain historical data on job applicants for future analysis and prediction, improvement of benefits and profits, and development of organizations and institutions. In e-government, a decision about job seekers can be made after mining their information, leading to beneficial insight. This paper proposes the development and implementation of a system that predicts the job appropriate to an applicant's skills using web content classification algorithms (LogitBoost, J48, PART, Hoeffding Tree, Naive Bayes). Furthermore, the results of the classification algorithms are compared on data sets called the "job classification data" sets. Experimental results indicate
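As a stand-in for the Weka classifiers named above, a minimal categorical Naive Bayes illustrates the classification step. The feature values and the add-one smoothing scheme are illustrative assumptions, not the paper's data or settings.

```python
import math
from collections import Counter, defaultdict

def train_nb(rows, labels):
    """Minimal categorical Naive Bayes with add-one smoothing.
    rows: tuples of categorical feature values; labels: class names.
    Returns a predict(row) function."""
    classes = Counter(labels)
    feat = defaultdict(Counter)          # (feature index, class) -> value counts
    for row, y in zip(rows, labels):
        for i, v in enumerate(row):
            feat[(i, y)][v] += 1
    def predict(row):
        def score(y):
            s = math.log(classes[y] / len(labels))   # log prior
            for i, v in enumerate(row):
                counts = feat[(i, y)]
                # add-one smoothed log likelihood
                s += math.log((counts[v] + 1) / (sum(counts.values()) + len(counts) + 1))
            return s
        return max(classes, key=score)
    return predict
```

The Weka algorithms in the paper differ in model family (trees, rule sets, boosting), but all consume the same kind of labeled categorical records shown here.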
This paper addresses the problem of segmenting an image into regions representing objects, delineating the boundary between two regions using connected component labeling (CCL). An efficient segmentation algorithm is then developed based on this method and applied to different kinds of images. The algorithm consists of four steps: first, the image is converted to gray level; second, the gray-level image is converted to a binary image; third, edges are detected using the Canny edge detector; and in the final step the labeled regions are extracted as segmented images. Best segmentation rates of 90% are obtained using the developed algorithm, compared with 77% obtained using CCL before enhancement.
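The connected component labeling step can be sketched with a BFS flood fill over a binary image; the paper's exact CCL variant is not specified, so this is a generic two-dimensional version.

```python
from collections import deque

def label_components(binary, connectivity=4):
    """Label connected components of a binary image via BFS flood fill.
    Returns (label grid with 0 = background, number of components)."""
    h, w = len(binary), len(binary[0])
    labels = [[0] * w for _ in range(h)]
    nbrs = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    if connectivity == 8:
        nbrs += [(-1, -1), (-1, 1), (1, -1), (1, 1)]
    current = 0
    for r in range(h):
        for c in range(w):
            if binary[r][c] and labels[r][c] == 0:
                current += 1                     # start a new component
                labels[r][c] = current
                q = deque([(r, c)])
                while q:
                    y, x = q.popleft()
                    for dy, dx in nbrs:
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w \
                                and binary[ny][nx] and labels[ny][nx] == 0:
                            labels[ny][nx] = current
                            q.append((ny, nx))
    return labels, current
```

In the described pipeline this would run on the binary image produced in step two, with each resulting label treated as one segmented object.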
Zigbee is considered one of the wireless sensor network (WSN) technologies designed for short-range communication applications. It follows the IEEE 802.15.4 specification, which aims at networks with the lowest possible cost and power consumption in addition to a minimal data rate. In this paper, a Zigbee transmitter system is designed based on the PHY layer specifications of this standard. The modulation technique applied in this design is offset quadrature phase shift keying (OQPSK) with half-sine pulse shaping, used to achieve the minimum possible amount of phase transition. In addition, the applied spreading technique is direct sequence spread spectrum (DSSS), which has
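The half-sine pulse-shaped OQPSK modulation named above can be sketched at baseband as follows. The samples-per-chip value and bit pattern are illustrative, and the DSSS spreading stage is omitted; the point of the sketch is the constant envelope that half-sine shaping with a half-chip offset produces.

```python
import math

def half_sine_oqpsk(bits, sps=8):
    """Baseband OQPSK with half-sine pulse shaping (the scheme used by the
    IEEE 802.15.4 O-QPSK PHY). Even-indexed bits drive I, odd-indexed drive Q;
    Q is offset by one chip (half a symbol). sps = samples per chip."""
    assert len(bits) % 2 == 0
    i_sym = [1 if b else -1 for b in bits[0::2]]
    q_sym = [1 if b else -1 for b in bits[1::2]]
    n = len(i_sym)
    total = (2 * n + 1) * sps            # extra chip for the offset Q tail
    I = [0.0] * total
    Q = [0.0] * total
    for k in range(n):
        start_i = 2 * k * sps
        start_q = start_i + sps          # half-symbol offset on Q
        for t in range(2 * sps):         # each half-sine pulse spans two chips
            p = math.sin(math.pi * t / (2 * sps))
            I[start_i + t] += i_sym[k] * p
            Q[start_q + t] += q_sym[k] * p
    return I, Q
```

Because adjacent pulses are a quarter period apart, I^2 + Q^2 stays constant in the steady-state region, which is why this modulation minimizes abrupt phase transitions.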
... Show MoreIn this paper a modified approach have been used to find the approximate solution of ordinary delay differential equations with constant delay using the collocation method based on Bernstien polynomials.