Many households are moving from traditional combustion fuels toward cleaner energy sources. Liquefied petroleum gas (LPG) has become the fuel of choice for domestic use, especially cooking, because it is a light gas, relatively inexpensive, and clean-burning. In residential complexes, LPG is piped from storage tanks to the equipment in each housing unit. This research simulates the design and performance of a building LPG system, taking a residential complex of eight buildings in Baghdad as a case study. Each building has 11 floors, and each floor has four apartments. The design was carried out in two parts: part one covers the LPG system for a single building, and part two covers the LPG system for the complex of eight buildings. The results were obtained using mathematical equations and the Pipe Flow Expert v7.30 program, with the design and analysis steps in the program explained.
In the presence of deep-submicron noise, providing reliable and energy-efficient network-on-chip operation is becoming a challenging objective. In this study, the authors propose a hybrid automatic repeat request (HARQ)-based coding scheme that simultaneously reduces crosstalk-induced bus delay and provides multi-bit error protection while achieving high energy savings. This is achieved by calculating two-dimensional parities and duplicating all the bits, which together provide single-error correction and six-error detection. The error correction reduces the performance degradation caused by retransmissions; combined with voltage-swing reduction, enabled by the scheme's strong error detection, this yields high energy savings.
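The two-dimensional-parity idea above can be sketched in a few lines. This is a minimal illustration, not the authors' exact codec: the grid layout, de-duplication policy, and decoding rule are assumptions. The payload is arranged in a rows-by-columns grid, row and column parities are appended, and every bit is duplicated; a single flipped data bit is then located at the intersection of the failing row and column parities.

```python
# Hedged sketch of a 2D-parity code with bit duplication.
# Grid layout and decoding policy are illustrative assumptions,
# not the exact scheme from the paper.

def encode(bits, rows, cols):
    assert len(bits) == rows * cols
    grid = [bits[r * cols:(r + 1) * cols] for r in range(rows)]
    row_par = [sum(row) % 2 for row in grid]
    col_par = [sum(grid[r][c] for r in range(rows)) % 2 for c in range(cols)]
    word = bits + row_par + col_par
    return [b for b in word for _ in (0, 1)]   # duplicate every bit

def correct_single(received, rows, cols):
    # With only two copies of each bit, a copy mismatch can flag an error
    # but cannot out-vote it; here we simply take the first copy and let
    # the row/column parities locate and fix one flipped data bit.
    n = rows * cols
    word = received[::2]
    bits, row_par, col_par = word[:n], word[n:n + rows], word[n + rows:]
    grid = [bits[r * cols:(r + 1) * cols] for r in range(rows)]
    bad_r = [r for r in range(rows) if sum(grid[r]) % 2 != row_par[r]]
    bad_c = [c for c in range(cols)
             if sum(grid[r][c] for r in range(rows)) % 2 != col_par[c]]
    if len(bad_r) == 1 and len(bad_c) == 1:
        grid[bad_r[0]][bad_c[0]] ^= 1          # single-error correction
    return [b for row in grid for b in row]
```

A flipped bit in the first copy of the data word is corrected because exactly one row syndrome and one column syndrome fire; multi-bit patterns beyond the code's reach would instead trigger a HARQ retransmission.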
This study found that one of the most effective and economical ways to meet the challenge of rising energy prices is to improve energy quality and efficiency. Such improvements have been carried out in many buildings around the world. Thermal insulation has become the primary tool for improving energy efficiency in buildings and educational facilities, making it possible to improve the quality of the indoor thermal environment recommended for users (students and teachers). An empirical study was conducted to calculate the fundamental values involved.
The main aim of this paper is to study how different estimators of the two unknown parameters (shape and scale) of the generalized exponential distribution behave for different sample sizes and parameter values. In particular, Maximum Likelihood, Percentile, and Ordinary Least Squares estimators were implemented for small, medium, and large sample sizes under several contrasting initial values of the two parameters. Two performance indicators, Mean Square Error and Mean Percentile Error, were used, and the estimation methods were compared using the Monte Carlo simulation technique.
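The Monte Carlo comparison methodology can be sketched as follows. For brevity this stand-in uses the ordinary exponential case (shape fixed at 1) with two simple estimators of the rate, the maximum-likelihood estimator and a median-based percentile estimator; the paper's actual estimators and the generalized exponential likelihood are more involved.

```python
import math
import random
import statistics

# Hedged sketch of a Monte Carlo MSE comparison between two estimators.
# The distribution (plain exponential) and the estimators shown are
# simplifications chosen for illustration, not the paper's exact setup.

def simulate_mse(n, lam=1.5, reps=2000, seed=1):
    rng = random.Random(seed)
    err_mle, err_pct = [], []
    for _ in range(reps):
        sample = [rng.expovariate(lam) for _ in range(n)]
        lam_mle = 1.0 / statistics.mean(sample)        # MLE of the rate
        lam_pct = math.log(2) / statistics.median(sample)  # percentile-based
        err_mle.append((lam_mle - lam) ** 2)
        err_pct.append((lam_pct - lam) ** 2)
    return statistics.mean(err_mle), statistics.mean(err_pct)
```

Running `simulate_mse` for each sample size reproduces the kind of table the study reports: the estimator with the smaller empirical Mean Square Error at a given sample size is preferred.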
Wireless Body Area Sensor Networks (WBASNs) are gaining significant attention due to their applications in smart health, offering cost-effective, efficient, ubiquitous, and unobtrusive telemedicine. WBASNs face challenges including interference, Quality of Service, transmit power, and resource constraints. Recognizing these challenges, this paper presents an energy- and Quality of Service-aware routing algorithm. The proposed algorithm uses each node's Collaboratively Evaluated Value (CEV) to select the most suitable cluster head (CH). The Collaborative Value (CV) is derived from three factors: the node's residual energy, the distance vector between nodes and the personal device, and the sensor density around each CH.
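The three-factor selection can be sketched as a weighted score. The weights, the neighbourhood radius, and the exact combination rule below are assumptions for illustration; the abstract only names the three factors.

```python
import math

# Hedged sketch of cluster-head election from a Collaboratively
# Evaluated Value (CEV). Weights and the scoring formula are
# illustrative assumptions, not the paper's exact definition.

def cev(node, pd_pos, neighbours, w=(0.5, 0.3, 0.2)):
    dist = math.dist(node["pos"], pd_pos)
    return (w[0] * node["energy"]          # residual energy (normalised 0..1)
            + w[1] / (1.0 + dist)          # closer to the personal device is better
            + w[2] * len(neighbours))      # local sensor density

def elect_ch(nodes, pd_pos, radius=10.0):
    def neigh(n):
        return [m for m in nodes if m is not n
                and math.dist(m["pos"], n["pos"]) <= radius]
    return max(nodes, key=lambda n: cev(n, pd_pos, neigh(n)))
```

A node that combines high residual energy, proximity to the personal device, and a dense neighbourhood wins the election, which is the qualitative behaviour the CEV scheme aims for.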
This article proposes a new technique for determining the rate of contamination. First, a generative adversarial network (GAN) with parallel processing is constructed and trained using real and secret images. Then, after the model has stabilized, the real image is passed to the generator. Finally, the generator creates an image that is visually similar to the secret image, achieving the same effect as transmitting the secret image itself. Experimental results show that this technique secures the transmission of secret information and increases the information-hiding capacity. A signal-to-noise metric and the structural similarity index measure were used to evaluate the success of the colour image-hiding technique.
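The fidelity metrics mentioned above can be illustrated with a minimal peak signal-to-noise ratio (PSNR) helper; treating images as flat lists of 8-bit pixel values is a simplification, and the paper's exact metric definitions may differ.

```python
import math

# Minimal PSNR sketch for flat 8-bit pixel sequences (an assumption
# about how the paper's signal-to-noise figure might be computed).

def psnr(img_a, img_b, peak=255.0):
    diff = [(a - b) ** 2 for a, b in zip(img_a, img_b)]
    mse = sum(diff) / len(diff)
    if mse == 0:
        return float("inf")   # identical images
    return 10 * math.log10(peak ** 2 / mse)
```

Higher PSNR means the generated image is closer to its reference; identical images give infinite PSNR, and maximally different 8-bit images give 0 dB.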
This research aims to develop a new spectrophotometric method for determining the drug compound salbutamol by reacting it with ferric chloride in the presence of potassium ferricyanide in an acidic medium to form a Prussian blue complex, determined by UV-Vis spectrophotometry in the wavelength range 700-750 nm. The optimal experimental conditions for determining the drug were found to be as follows: 1- volume of 10 M H2SO4: 1.5 ml; 2- volume and concentration of K3Fe(CN)6: 1.5 ml, 0.2%; 3- volume and concentration of FeCl3: 2.5 ml, 0.2%; 4- temperature: 80 °C; 5- reaction time: 15 minutes; 6- order of addition: drug + K3Fe(CN)6 + FeCl3 + acid. The working concentration range is 0.025-5 ppm.
Wellbore instability and sand production onset modeling are strongly affected by Sonic Shear Wave Time (SSW). In any field, SSW is not available for all wells because of the high cost of measurement. Many authors have developed empirical correlations for SSW prediction using information from selected fields worldwide, and researchers have recently applied various Artificial Intelligence methods to estimate SSW. In this paper, three existing empirical correlations, those of Carroll, Freund, and Brocher, are used to estimate SSW, and a fourth, new empirical correlation is established. For comparison with the empirical correlation results, an Artificial Neural Network (ANN) from another study was used.
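As an example of how such correlations are applied, the Brocher-style polynomial regression predicts shear velocity from compressional velocity, which is then inverted to a shear slowness (SSW). The coefficients below follow the commonly cited Brocher (2005) fit in km/s; treat them, and the unit-conversion helper, as an illustrative assumption rather than the paper's calibrated correlation.

```python
# Hedged sketch: Brocher-style Vs-from-Vp regression (km/s), then
# conversion of a compressional slowness log (us/ft) to shear slowness.

def vs_brocher(vp_kms):
    # Polynomial regression, commonly quoted as valid for ~1.5-8 km/s.
    return (0.7858 - 1.2344 * vp_kms + 0.7949 * vp_kms ** 2
            - 0.1238 * vp_kms ** 3 + 0.0064 * vp_kms ** 4)

def shear_slowness(dtc_us_ft):
    vp = 304.8 / dtc_us_ft            # compressional slowness -> Vp in km/s
    return 304.8 / vs_brocher(vp)     # shear slowness in us/ft
```

Since shear waves travel more slowly than compressional waves, the predicted shear slowness is always larger than the input compressional slowness, a quick sanity check on any SSW correlation.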
The traditional centralized network management approach suffers severe efficiency and scalability limitations in large-scale networks. Data collection and analysis typically involve huge transfers of management data to the manager, which consume considerable network throughput and create bottlenecks at the manager side. Agent technology addresses these problems by distributing the management functionality over the network elements. The proposed system consists of a server agent that works together with client agents to monitor clients' computers logging on and off and which user is working on each machine. A file-system-watcher mechanism is used to detect any change in files.
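A file-system watcher of the kind the agents rely on can be sketched with simple polling. This stand-in compares directory snapshots by modification time; a production watcher (e.g. OS change notifications) would avoid polling, so treat this as an assumption about the mechanism, not the system's implementation.

```python
import os

# Hedged polling sketch of a file-system watcher: snapshot a directory,
# then diff two snapshots to report added, removed, and changed files.

def snapshot(path):
    return {name: os.path.getmtime(os.path.join(path, name))
            for name in os.listdir(path)}

def diff(old, new):
    added = [f for f in new if f not in old]
    removed = [f for f in old if f not in new]
    changed = [f for f in new if f in old and new[f] != old[f]]
    return added, removed, changed
```

The client agent would take a snapshot periodically and forward only the diff to the server agent, which is exactly the bandwidth-saving behaviour the distributed design is after.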