An application of the neural network technique to modeling the point efficiency of sieve trays is introduced, based on a data bank of around 331 data points collected from the open literature. Two models are proposed using the back-propagation algorithm. The first network takes as inputs: volumetric liquid flow rate (QL), F-factor for gas (FS), liquid density (ρL), gas density (ρg), liquid viscosity (μL), gas viscosity (μg), hole diameter (dH), weir height (hw), pressure (P), and surface tension between the liquid and gas phases (σ). The second network uses six dimensionless groups: flow factor (F), liquid Reynolds number (ReL), gas Reynolds number through the hole (Reg), ratio of weir height to hole diameter (hw/dH), ratio of process pressure to atmospheric pressure (P/Pa), and Weber number (We). Statistical analysis showed that the proposed models have an average absolute relative error (AARE) of 9.3% and a standard deviation (SD) of 9.7% for the first model, an AARE of 9.35% and an SD of 10.5% for the second model, and an AARE of 9.8% and an SD of 7.5% for the third model.
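A minimal sketch of how such a back-propagation network could be set up for the dimensionless-group model is given below, using scikit-learn's MLPRegressor on placeholder data; the single hidden layer, activation, and training settings are assumptions and not the authors' architecture.

```python
# Hypothetical sketch of the dimensionless-group model: a back-propagation
# feed-forward network mapping [F, Re_L, Re_g, hw/dH, P/Pa, We] to point
# efficiency. Data here are random placeholders for the ~331-point data bank.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.random((331, 6))                 # six dimensionless inputs per data point
y = 0.3 + 0.7 * rng.random(331)          # stand-in point-efficiency values

model = MLPRegressor(hidden_layer_sizes=(10,),   # assumed single hidden layer
                     activation="logistic",      # sigmoid, typical for BP networks
                     solver="lbfgs", max_iter=5000, random_state=0)
model.fit(X, y)

pred = model.predict(X)
rel_err = np.abs((pred - y) / y)
print(f"AARE = {100 * rel_err.mean():.1f}%, SD = {100 * rel_err.std():.1f}%")
```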
The Khor Mor gas-condensate processing plant in Iraq is currently facing operational challenges due to foaming in the sweetening tower caused by highly soluble hydrocarbon liquids entering the tower. The root cause of the problem could be liquid carry-over, as the separation vessels within the plant fail to remove liquid droplets from the gas phase. This study employs Aspen HYSYS v.11 software to investigate the performance of the industrial three-phase horizontal separator, Bravo #2, located upstream of the Khor Mor sweetening tower, under both current and future operational conditions. The simulation results, regarding the size distribution of liquid droplets in the gas product and the gas/liquid separation efficiency, r
In this paper, one of the machine scheduling problems is studied: scheduling a number of products (n jobs) on one (single) machine with a multi-criteria objective function combining completion time, tardiness, earliness, and late work. The branch and bound (BAB) method is used as the main method for solving the problem, where four upper bounds and one lower bound are proposed and a number of dominance rules are considered to reduce the number of branches in the search tree. The genetic algorithm (GA) and particle swarm optimization (PSO) are used to obtain two of the upper bounds. The computational results are calculated by coding (progr
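As a rough illustration of the scheduling objective, the sketch below evaluates a job sequence on a single machine, assuming the four criteria are simply summed; the job data and the exact combination of criteria are illustrative, not the paper's formulation.

```python
# A minimal sketch: cost of one job sequence on a single machine, assuming the
# objective is the sum of total completion time, tardiness, earliness, and late work.
def multicriteria_cost(sequence, p, d):
    t = 0
    total_C = total_T = total_E = total_V = 0
    for j in sequence:
        t += p[j]                               # completion time C_j
        total_C += t
        total_T += max(0, t - d[j])             # tardiness T_j
        total_E += max(0, d[j] - t)             # earliness E_j
        total_V += min(p[j], max(0, t - d[j]))  # late work V_j
    return total_C + total_T + total_E + total_V

# Illustrative processing times and due dates (not from the paper).
p = {1: 3, 2: 5, 3: 2}
d = {1: 4, 2: 6, 3: 9}
print(multicriteria_cost([3, 1, 2], p, d))
```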
Most recognition systems for human facial emotions are assessed solely on accuracy, even though other performance criteria, such as sensitivity, precision, F-measure, and G-mean, are also considered important in the evaluation process. Moreover, the most common problem that must be resolved in facial emotion recognition systems is feature extraction, which is commonly performed with traditional manual methods. These traditional methods cannot extract features efficiently; in other words, they produce a redundant set of features that are not significant, which degrades classification performance. In this work, a new system to recognize human facial emotions from images is proposed. The HOG (Histograms of Or
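A hedged sketch of an HOG-based feature extraction stage is shown below, using scikit-image's hog and a generic linear SVM on placeholder images; the cell/block parameters and the classifier are assumptions, not the system described here.

```python
# Hypothetical HOG feature extraction for facial-emotion images plus a generic
# classifier. Images and labels are random placeholders.
import numpy as np
from skimage.feature import hog
from sklearn.svm import SVC

def hog_features(gray_face):
    """HOG descriptor of a grayscale face image (2-D array); parameters assumed."""
    return hog(gray_face, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2), block_norm="L2-Hys")

rng = np.random.default_rng(0)
faces = rng.random((20, 64, 64))                 # stand-in 64x64 face crops
labels = rng.integers(0, 4, size=20)             # stand-in emotion labels

X = np.array([hog_features(f) for f in faces])
clf = SVC(kernel="linear").fit(X, labels)
print(clf.predict(X[:3]))
```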
Strong and ∆-convergence of a two-step iteration process for asymptotically nonexpansive and total asymptotically nonexpansive nonself mappings in CAT(0) spaces have been studied. Several strong convergence theorems under semi-compactness and condition (M) have also been proved. Our results improve and extend numerous familiar results from the existing literature.
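For orientation only, a representative two-step (Ishikawa-type) iteration for a mapping T on a CAT(0) space, written with the geodesic convex combination ⊕, is sketched below; the paper's actual scheme (for nonself mappings, typically involving a retraction) and parameter conditions may differ.

```latex
\begin{aligned}
y_n     &= (1-\beta_n)\,x_n \oplus \beta_n\, T x_n,\\
x_{n+1} &= (1-\alpha_n)\,x_n \oplus \alpha_n\, T y_n, \qquad \alpha_n,\ \beta_n \in (0,1).
\end{aligned}
```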
In the present work, an image compression method has been modified by combining the Absolute Moment Block Truncation Coding (AMBTC) algorithm with VQ-based image coding. First, the AMBTC algorithm, with a condition based on Weber's law, is used to distinguish low-detail and high-detail blocks in the original image. The coder transmits only the mean of a low-detail block (i.e. a uniform block such as background) over the channel, instead of transmitting the two reconstruction mean values and the bit map for that block. A high-detail block is coded by the proposed fast encoding algorithm for vector quantization based on the Triangular Inequality Theorem (TIE), and the coder then transmits the two reconstruction mean values (i.e. H&L)
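The core AMBTC step can be sketched as follows for a single 4×4 block, assuming an illustrative uniformity threshold in place of the Weber's-law condition and leaving out the TIE-accelerated VQ stage; none of the parameter values are from the paper.

```python
# Minimal AMBTC sketch for one block: block mean, absolute moment, bit map,
# and the two reconstruction levels H and L. Threshold value is an assumption.
import numpy as np

def ambtc_encode(block, uniform_threshold=2.0):
    mean = block.mean()
    amd = np.abs(block - mean).mean()      # first absolute moment of the block
    if amd < uniform_threshold:            # "low-detail" block: send the mean only
        return ("uniform", mean)
    bitmap = block >= mean                 # 1 where pixel >= mean, else 0
    q, m = bitmap.sum(), block.size
    H = mean + amd * m / (2 * q)           # reconstruction level for '1' pixels
    L = mean - amd * m / (2 * (m - q))     # reconstruction level for '0' pixels
    return ("detail", H, L, bitmap)

block = np.array([[12, 14, 200, 202],
                  [13, 15, 198, 205],
                  [11, 16, 199, 203],
                  [12, 14, 201, 204]], dtype=float)
print(ambtc_encode(block)[0])
```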
This paper investigates some exact and local search methods for solving the traveling salesman problem (TSP). The Branch and Bound technique (BABT) is proposed as an exact method, with two models. In addition, the classical Genetic Algorithm (GA) and Simulated Annealing (SA) are discussed and applied as local search methods. To improve the performance of the GA, we propose two kinds of improvement: the first is called the improved GA (IGA) and the second is the Hybrid GA (HGA).
The IGA gives better results than the GA and SA, while the HGA is the best local search method overall within a reasonable time for 5 ≤ n ≤ 2000, where n is the number of visited cities. An effective method for reducing the size of the TSP matrix was proposed with
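A hedged sketch of one of the local search components, simulated annealing on a TSP distance matrix with a generic 2-opt style move and geometric cooling, is given below; the move, cooling schedule, and parameters are illustrative and not the settings used in this work.

```python
# Generic simulated annealing for the TSP on a small distance matrix.
import math
import random

def tour_length(tour, dist):
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def simulated_annealing(dist, T0=100.0, cooling=0.995, steps=20000, seed=1):
    random.seed(seed)
    n = len(dist)
    tour = list(range(n))
    random.shuffle(tour)
    best, best_len = tour[:], tour_length(tour, dist)
    T = T0
    for _ in range(steps):
        i, j = sorted(random.sample(range(n), 2))
        cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]   # 2-opt style reversal
        delta = tour_length(cand, dist) - tour_length(tour, dist)
        if delta < 0 or random.random() < math.exp(-delta / T):
            tour = cand
            if tour_length(tour, dist) < best_len:
                best, best_len = tour[:], tour_length(tour, dist)
        T *= cooling
    return best, best_len

dist = [[0, 2, 9, 10], [2, 0, 6, 4], [9, 6, 0, 3], [10, 4, 3, 0]]
print(simulated_annealing(dist))
```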
In this review paper, several studies were surveyed to help future researchers identify available techniques in the field of classification of Synthetic Aperture Radar (SAR) images. SAR images are becoming increasingly important in a variety of remote sensing applications because SAR sensors can operate in all weather conditions, day and night, over long ranges and wide coverage areas. Their suitability for wide-area planning, search and rescue, mine detection, and target identification makes them very attractive for surveillance and observation of Earth resources. With the increasing popularity and availability of these images, the need for machines has emerged to enhance t
This research investigates solid waste management in Al-Kut City. It covers the medical and general solid waste generated in five hospitals of different specializations and capacities over one week, starting from 03/02/2012. Samples were collected and analyzed periodically to determine their generation rate, composition, and physical properties. The results indicated that the generation rate ranged between 212 and 1102 kg/bed/day, while moisture content and density were 19.0% and 197 kg/m3, respectively, for medical waste and 41% and 255 kg/m3, respectively, for general waste. Theoretically, the medical solid waste generated in Al-Kut City (like any other city) is affected by hospital capacity, the number of patients per day, and hosp
Dust is a frequent contributor to health risks and climate change, and one of the most dangerous issues facing people today. Desertification, drought, agricultural practices, and sand and dust storms from neighboring regions all contribute to this problem. Deep learning (DL) regression based on long short-term memory (LSTM) was proposed as a solution to increase the accuracy of dust forecasting and monitoring. The proposed system has two parts: in the first step, LSTM and dense layers are used to build a dust-detection system, while in the second step, the proposed Wireless Sensor Network (WSN) and Internet of Things (IoT) model is used for forecasting and monitoring. The experimental DL system
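A minimal sketch of the LSTM-plus-dense regression part is shown below in Keras, assuming a short window of multichannel sensor readings as input; the window length, features, layer sizes, and training settings are assumptions rather than the proposed system's design.

```python
# Hypothetical LSTM + Dense regression for next-step dust-level estimation from a
# window of past sensor readings (e.g. particulate level, humidity, wind speed).
import numpy as np
import tensorflow as tf

timesteps, n_features = 24, 3            # assumed: 24 past readings, 3 sensor channels
model = tf.keras.Sequential([
    tf.keras.Input(shape=(timesteps, n_features)),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1),            # next-step dust-level estimate
])
model.compile(optimizer="adam", loss="mse")

# Placeholder data standing in for logged WSN/IoT measurements.
rng = np.random.default_rng(0)
X = rng.random((200, timesteps, n_features)).astype("float32")
y = rng.random((200, 1)).astype("float32")
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
print(model.predict(X[:1], verbose=0).shape)
```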
Multi-document summarization is an optimization problem demanding the simultaneous optimization of more than one objective function. The proposed work concerns balancing two significant objectives, content coverage and diversity, when generating summaries from a collection of text documents.
Any automatic text summarization system faces the challenge of producing a high-quality summary. Despite existing efforts on designing and evaluating the performance of many text summarization techniques, their formulations lack any model that gives an explicit representation of coverage and diversity, the two contradictory semantics of any summary. In this work, the design of
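The coverage/diversity trade-off can be illustrated with the hedged sketch below, which scores a candidate sentence subset by TF-IDF cosine similarity to the collection centroid (coverage) and by pairwise dissimilarity among selected sentences (diversity); the weighting and the similarity measure are illustrative assumptions, not the paper's model.

```python
# Illustrative two-objective score for a candidate extractive summary.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def coverage_diversity(sentences, selected_idx, alpha=0.5):
    tfidf = TfidfVectorizer().fit_transform(sentences)
    centroid = np.asarray(tfidf.mean(axis=0))          # collection centroid vector
    sel = tfidf[selected_idx]
    coverage = cosine_similarity(sel, centroid).mean() # how well the subset covers the collection
    pairwise = cosine_similarity(sel)
    n = len(selected_idx)
    diversity = 1.0 - (pairwise.sum() - n) / (n * (n - 1)) if n > 1 else 1.0
    return alpha * coverage + (1 - alpha) * diversity  # assumed linear weighting

sentences = ["The city expanded its recycling program.",
             "Recycling services now reach more districts.",
             "A new landfill site was approved last year.",
             "Officials reported lower waste volumes."]
print(coverage_diversity(sentences, [0, 2]))
```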