In process industries, variables such as flow, pressure, level, concentration, and temperature are the main parameters that must be controlled under both set-point and load changes.
In this work, a control system for propylene glycol production in a non-isothermal continuous stirred-tank reactor (CSTR) was developed; the dynamic model and the control system were based on basic mass and energy balances.
Inlet concentration and inlet temperature are the two disturbances, while the inlet volumetric flow rate and the coolant temperature are the two manipulated variables. The objective is to maintain constant temperature and concentration within the CSTR.
The dynamic behaviour of the non-isothermal CSTR is described by a first-order-plus-dead-time (FOPDT) model.
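For reference, an FOPDT model has the generic transfer-function form below, where K is the process gain, τ the time constant, and θ the dead time; the symbols are generic and no numerical values from this work are implied:

```latex
G_p(s) = \frac{K\,e^{-\theta s}}{\tau s + 1}
```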
Conventional PI and PID control were studied, and the controller parameters were tuned by the Ziegler-Nichols reaction-curve method to find the best values of the proportional gain (Kc), integral time (τI), and derivative time (τD).
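As a minimal sketch of this tuning step, the classic Ziegler-Nichols reaction-curve rules can be evaluated directly from the FOPDT parameters; the numerical values below are hypothetical placeholders, not results reported in this work:

```python
# Ziegler-Nichols open-loop (process reaction curve) tuning sketch.
# K, tau, theta are the FOPDT gain, time constant, and dead time.

def zn_pi(K, tau, theta):
    """Classic Z-N reaction-curve settings for a PI controller."""
    Kc = 0.9 * tau / (K * theta)
    tau_I = theta / 0.3          # approximately 3.33 * theta
    return Kc, tau_I

def zn_pid(K, tau, theta):
    """Classic Z-N reaction-curve settings for a PID controller."""
    Kc = 1.2 * tau / (K * theta)
    tau_I = 2.0 * theta
    tau_D = 0.5 * theta
    return Kc, tau_I, tau_D

if __name__ == "__main__":
    K, tau, theta = 1.5, 10.0, 2.0   # hypothetical FOPDT parameters (not from the paper)
    print("PI :", zn_pi(K, tau, theta))
    print("PID:", zn_pid(K, tau, theta))
```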
The conventional controller tuning was compared with internal model control (IMC) techniques. The Ziegler-Nichols controller provided the best control for disturbance rejection but the worst for set-point changes, whereas the IMC controller gave satisfactory set-point responses but sluggish disturbance responses, because the approximate FOPDT model has a relatively small time delay.
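For context, one widely used IMC-based PI rule for an FOPDT model (obtained with a first-order approximation of the dead time, with τc the IMC filter time constant) is sketched below; it is shown only to illustrate the type of tuning involved and is not necessarily the exact rule applied in this work:

```latex
K_c = \frac{\tau}{K\,(\tau_c + \theta)}, \qquad \tau_I = \tau
```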
Feedforward control and a combined feedforward-feedback scheme were also applied and compared with the above strategies. Feedforward control gave better disturbance rejection than feedback control, but with a steady-state deviation (offset). A combined feedforward-feedback control system is therefore preferred in practice: feedforward control reduces the effects of measurable disturbances, while feedback trim compensates for inaccuracies in the process model, measurement errors, and unmeasured disturbances. The combined scheme also eliminated the offset left by feedforward control alone.
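As a generic illustration of this structure (the transfer functions are symbolic, not those identified for this reactor), the ideal feedforward compensator for a measured disturbance D(s) with disturbance dynamics G_d and process dynamics G_p, combined with the feedback controller G_c, can be written as:

```latex
G_{ff}(s) = -\,\frac{G_d(s)}{G_p(s)}, \qquad
U(s) = G_c(s)\,\bigl(Y_{sp}(s) - Y(s)\bigr) + G_{ff}(s)\,D(s)
```

When the ideal compensator is not physically realizable, its steady-state gain, -K_d/K_p, is commonly used as a static feedforward term instead.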