Determination of Best Location for Elevated Tank in Branched Network

The research focuses on determining the best location for an elevated tank, using the required pump head as the measure for this purpose. Five types of network were used to examine the effect of variation in discharge and node elevation on the best location. The weakest point was determined for each network. Preliminary tank locations were chosen for testing along the primary pipe at equal intervals. For each location, the water elevation in the tank and the pump head were calculated at each hour, based on the pump head required to achieve the minimum pressure at the weakest point. Then, the sum of the pump heads over the day was determined. The results proved that there is a most economical location at which energy consumption is minimum. This location coincided with the branch line containing the weakest point. The best location did not coincide with the highest-demand location unless that location contained the weakest point. The results indicated that moving the tank away from the best location toward the pump increases the pump head more than moving the tank the same distance in the opposite direction. Locating the tank beside the pump station was the worst choice. The results also showed that the shorter the distance between the pump and the highest demand, the smaller the required pump head. The uniform demand distribution required the least pump head, with a minimum of 554 m, while the networks with the highest demand at distances of 200 m, 400 m, and 1000 m from the pump station required minimum heads of 651 m, 682 m, and 726 m respectively.
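The selection procedure described in the abstract — compute the required pump head at each hour for every candidate tank location, sum over the day, and keep the location with the smallest total — can be sketched as follows. The candidate locations and hourly head values below are hypothetical placeholders; in practice they would come from a hydraulic simulation of the network that enforces the minimum pressure at the weakest point:

```python
# Pick the elevated-tank location that minimizes total daily pump head.
# Hourly head values here are hypothetical stand-ins for simulation output.

def best_tank_location(hourly_heads_by_location):
    """hourly_heads_by_location: dict mapping candidate location
    (distance from pump, m) to a list of 24 hourly pump heads (m)."""
    return min(hourly_heads_by_location,
               key=lambda loc: sum(hourly_heads_by_location[loc]))

candidates = {
    200: [28.0] * 24,   # near the pump station
    600: [23.5] * 24,   # on the branch containing the weakest point
    1000: [26.0] * 24,  # far end of the primary pipe
}
print(best_tank_location(candidates))  # → 600
```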

 

Publication Date
Thu Jun 01 2017
Journal Name
Journal Of Economics And Administrative Sciences
A Comparison Between Maximum Likelihood Method And Bayesian Method For Estimating Some Non-Homogeneous Poisson Processes Models

Abstract

The Non-Homogeneous Poisson process is considered one of the statistical subjects that is important in other sciences and has wide application in different areas, such as queueing systems, repairable systems, computer and communication systems, and reliability theory, among many others. It is also used in modeling phenomena that occur in a non-constant way over time (all events that change with time).

This research deals with some of the basic concepts related to the Non-Homogeneous Poisson process. It applies two models of the Non-Homogeneous Poisson process, the power law model and the Musa-Okumoto model, to estimate th…
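As an illustration of the power law model named above (the parameterization and parameter values here are assumptions for illustration, not taken from the paper): a non-homogeneous Poisson process with cumulative intensity Λ(t) = (t/θ)^β can be simulated by generating unit-rate Poisson arrival times and mapping them through Λ⁻¹:

```python
import random

def simulate_power_law_nhpp(theta, beta, t_max, rng=None):
    """Event times of an NHPP with cumulative intensity
    Lambda(t) = (t / theta) ** beta on [0, t_max], obtained by
    inverting the time scale of a unit-rate Poisson process."""
    rng = rng or random.Random(0)
    times = []
    s = 0.0  # cumulative unit-rate arrival time
    while True:
        s += rng.expovariate(1.0)       # next unit-rate inter-arrival
        t = theta * s ** (1.0 / beta)   # t = Lambda^{-1}(s)
        if t > t_max:
            return times
        times.append(t)

# With theta = 1, beta = 2, the expected count on [0, 2] is Lambda(2) = 4.
events = simulate_power_law_nhpp(theta=1.0, beta=2.0, t_max=2.0)
print(len(events))
```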
Publication Date
Fri Jul 27 2012
Journal Name
Journal Of Prosthodontics
A Three-Dimensional Finite Element Analysis for Overdenture Attachments Supported by Teeth and/or Mini Dental Implants

Publication Date
Sun Sep 01 2013
Journal Name
Baghdad Science Journal
Synthesis and biological studies for some heterocyclic compounds derived from 2-Morpholino-1,8- naphthyridine-4-carboxylic acid

New heterocyclic compounds derived from 2-Morpholino-1,8-naphthyridine-4-carboxylic acid, such as oxadiazolo, thiadiazolo-thione and triazolo-thione derivatives, have been prepared and characterized on the basis of IR and 1H NMR spectral data. The hydrazide compound was utilized as a starting material for preparing these compounds. The second part of this study involves biological studies of some of these naphthyridine derivatives using three different kinds of bacteria, namely Staphylococcus aureus, Pseudomonas aeruginosa and Escherichia coli. The data indicated that some of these compounds have good activity against the tested bacteria in comparison to antibiotics.

Publication Date
Sun Feb 25 2024
Journal Name
Baghdad Science Journal
Simplified Novel Approach for Accurate Employee Churn Categorization using MCDM, De-Pareto Principle Approach, and Machine Learning

Churn of employees from organizations is a serious problem. Turnover, or churn, of employees within an organization needs to be addressed, since it has a negative impact on the organization. Manual detection of employee churn is quite difficult, so machine learning (ML) algorithms have frequently been used for employee churn detection as well as for employee categorization according to turnover. To date, only one study using machine learning has looked into the categorization of employees. A novel multi-criterion decision-making approach (MCDM) coupled with the DE-PARETO principle has been proposed to categorize employees; this is referred to as the SNEC scheme. An AHP-TOPSIS DE-PARETO PRINCIPLE model (AHPTOPDE) has been designed that uses a 2-stage MCDM s…
Publication Date
Sat Apr 01 2017
Journal Name
2017 International Conference On Environmental Impacts Of The Oil And Gas Industries: Kurdistan Region Of Iraq As A Case Study (eiogi)
Inverse fluidized bed for chromium ions removal from wastewater and produced water using peanut shells as adsorbent

Publication Date
Thu Feb 07 2019
Journal Name
Iraqi Journal Of Laser
Tapered Splicing Points SMF-PCF-SMF Structure based on Mach-Zehnder interferometer for Enhanced Refractive Index Sensing

Photonic crystal fiber interferometers (PCFIs) are widely used for sensing applications. This work presents a solid-core PCF based on a Mach-Zehnder modal interferometer for refractive index sensing. The general sensor structure was fabricated by splicing short lengths of PCF on both sides to conventional single-mode fiber (SMF-28). To realize the modal interferometer, a collapsing technique based on fusion splicing was used to excite higher-order modes (LP01 and LP11). A highly sensitive optical spectrum analyzer (OSA) was used to monitor and record the transmitted wavelength. This work studied a Mach-Zehnder interferometer refractive index sensor based on a tapered splicing point SMF-PCF-SMF structure. The relation between refractive index sensitivity and tape…
Publication Date
Sat Oct 31 2020
Journal Name
International Journal Of Intelligent Engineering And Systems
Automatic Computer Aided Diagnostic for COVID-19 Based on Chest X-Ray Image and Particle Swarm Intelligence

Publication Date
Wed Mar 01 2017
Journal Name
Archive Of Mechanical Engineering
Using the Lid-Driven Cavity Flow to Validate Moment-Based Boundary Conditions for the Lattice Boltzmann Equation
Abstract

The accuracy of the Moment Method for imposing no-slip boundary conditions in the lattice Boltzmann algorithm is investigated numerically using lid-driven cavity flow. Boundary conditions are imposed directly upon the hydrodynamic moments of the lattice Boltzmann equations, rather than the distribution functions, to ensure the constraints are satisfied precisely at grid points. Both single and multiple relaxation time models are applied. The results are in excellent agreement with data obtained from state-of-the-art numerical methods and are shown to converge with second order accuracy in grid spacing.
Publication Date
Tue Jun 01 2021
Journal Name
Baghdad Science Journal
Comparing Weibull Stress – Strength Reliability Bayesian Estimators for Singly Type II Censored Data under Different loss Functions

The stress(Y)-strength(X) reliability model, which defines the life of a component with strength X subjected to stress Y (the component fails if and only if, at any time, the applied stress exceeds its strength), has been studied; the reliability R = P(Y < X) can then be considered a measure of the component's performance. In this paper, a Bayesian analysis is considered for R when the two variables X and Y are independent Weibull random variables with common parameter α, in order to study the effect of each of the two different scale parameters β and λ, respectively, using three different loss functions [weighted, quadratic and entropy] under two different prior functions [Gamma and extension of Jeffery…
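The abstract is truncated before the parameterization is defined, so as an assumption take survival functions exp(−βx^α) for strength X and exp(−λy^α) for stress Y; under that parameterization R = P(Y < X) has the closed form λ/(β + λ), which a short Monte Carlo sketch can check:

```python
import random

def weibull_sample(rate, shape, rng):
    # Inverse-transform sampling: S(x) = exp(-rate * x**shape)
    # gives x = (Exp(1) / rate) ** (1 / shape).
    return (rng.expovariate(1.0) / rate) ** (1.0 / shape)

def mc_reliability(beta, lam, alpha, n=100_000, seed=1):
    """Monte Carlo estimate of R = P(Y < X) for strength X with rate beta
    and stress Y with rate lam, sharing the common shape alpha."""
    rng = random.Random(seed)
    hits = sum(weibull_sample(lam, alpha, rng) < weibull_sample(beta, alpha, rng)
               for _ in range(n))
    return hits / n

beta, lam, alpha = 1.0, 3.0, 2.0
print(mc_reliability(beta, lam, alpha))  # closed form: lam / (beta + lam) = 0.75
```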
Publication Date
Thu Dec 01 2022
Journal Name
Baghdad Science Journal
Comparison between RSA and CAST-128 with Adaptive Key for Video Frames Encryption with Highest Average Entropy

Encryption of data is the translation of data into another shape or symbol so that only people with access to the secret key, or a password, can read it. Data that are encrypted are generally referred to as cipher text, while data that are unencrypted are known as plain text. Entropy can be used as a measure giving the number of bits needed for coding the data of an image: as the pixel values within an image are dispersed across more gray levels, the entropy increases. The aim of this research is to compare CAST-128 with a proposed adaptive key against the RSA encryption method for video frames, to determine the more accurate method with the highest entropy. The first method is achieved by applying the "CAST-128" and…
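The entropy measure described above — the number of bits needed to code an image's data — is Shannon entropy over the gray-level histogram. A minimal sketch (the sample pixel data are hypothetical):

```python
import math
from collections import Counter

def shannon_entropy(pixels):
    """Shannon entropy (bits per pixel) of a sequence of gray-level values."""
    counts = Counter(pixels)
    n = len(pixels)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A uniform spread over all 256 gray levels gives the maximum of 8 bits/pixel;
# a constant image carries no information.
print(shannon_entropy(list(range(256))))  # 8.0
print(shannon_entropy([128] * 256))       # zero entropy
```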