Many consumers of electric power exceed the consumption limits permitted by the electrical power distribution stations, so we propose a validation approach that applies machine learning (ML) technology to teach electrical consumers how to consume power properly without wasting energy. The validation approach is one of a larger set of intelligent processes for energy consumption known as efficient energy consumption management (EECM) approaches, and it is connected through Internet of Things (IoT) technology to the Google Firebase Cloud, where a utility center checks whether efficient energy consumption is being achieved. The approach divides the measured actual power (A_p) data of the electrical model into two portions: a training portion selected for different maximum actual powers, and a validation portion determined from the minimum output power consumption and then compared with the actual required input power. Simulation results show that the energy expenditure problem can be solved with good accuracy by reducing the maximum rate of A_p over a 24-hour period for a single house, which also reduces the electricity bill.
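The following is a minimal sketch of the train/validate split and over-consumption check described in this abstract. The 24 hourly readings, the linear regression model, and the permissible limit are illustrative assumptions, not the authors' exact pipeline.

```python
# Sketch: split measured actual power (A_p) into training and validation
# portions, fit a simple model on the high-consumption hours, and flag
# validation hours that exceed an assumed permissible limit.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
hours = np.arange(24).reshape(-1, 1)                       # one day, single house
actual_power = (1.5 + 0.8 * np.sin(hours.ravel() / 24 * 2 * np.pi)
                + rng.normal(0, 0.1, 24))                  # A_p readings (kW), synthetic

train_idx = np.argsort(actual_power)[-16:]   # training portion: highest actual powers
valid_idx = np.argsort(actual_power)[:8]     # validation portion: lowest consumption

model = LinearRegression().fit(hours[train_idx], actual_power[train_idx])
predicted = model.predict(hours[valid_idx])

permissible_limit = 2.0                      # kW, assumed utility limit
for h, a, p in zip(valid_idx, actual_power[valid_idx], predicted):
    flag = "over limit" if a > permissible_limit else "ok"
    print(f"hour {h:2d}: actual {a:.2f} kW, predicted {p:.2f} kW, {flag}")
```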
The essential contribution of this research is an analysis of the queueing properties of the service system in Baghdad Teaching Hospital using the network technique Q-GERT, an acronym of the words:
Queueing theory and Graphical Evaluation and Review Technique.
With this graphical evaluation and review method, the flow of patients within the system can be observed. The system is then represented as a probabilistic network diagram for analysis, and the statistical distributions appropriate for arrival and departure times are identified using the ready-made WinQSB program and simulation.
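As a minimal sketch of the kind of queue simulation such an analysis relies on, the fragment below simulates a single-server queue. Exponential inter-arrival and service times, the rates, and the patient count are illustrative assumptions; the paper fits its own distributions to the hospital data.

```python
# Sketch: single-server queue simulation with assumed exponential
# inter-arrival and service times, reporting the mean patient waiting time.
import random

random.seed(1)
ARRIVAL_RATE, SERVICE_RATE, N_PATIENTS = 4.0, 5.0, 1000   # per hour, assumed

t, server_free_at, waits = 0.0, 0.0, []
for _ in range(N_PATIENTS):
    t += random.expovariate(ARRIVAL_RATE)          # next patient arrives
    start = max(t, server_free_at)                 # wait if the server is busy
    waits.append(start - t)
    server_free_at = start + random.expovariate(SERVICE_RATE)

print(f"mean wait: {sum(waits) / len(waits):.3f} h")
```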
Image compression is a suitable technique to reduce the storage space of an image, increase the available storage on a device, and speed up transmission. In this paper, a new idea for image compression is proposed to improve the performance of the Absolute Moment Block Truncation Coding (AMBTC) method, using Weber's law condition to distinguish uniform blocks (i.e., blocks with low and constant detail) from non-uniform blocks in the original image. All elements in the bitmap of each uniform block are then set to zero. After that, the lossless Run Length method is used to further compress the bits representing the bitmaps of these uniform blocks. Via this simple idea, the compression performance is improved.
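Below is a minimal sketch of the uniform-block idea on top of AMBTC: a Weber-type contrast test decides whether a block is uniform, and if so its bitmap is forced to all zeros so a run-length coder can compress it further. The Weber constant, the 4x4 block size, and the sample block are illustrative assumptions.

```python
# Sketch: AMBTC block coding with a Weber's-law test for uniform blocks.
import numpy as np

WEBER_K = 0.03   # assumed Weber-fraction threshold

def ambtc_block(block):
    mean = block.mean()
    bitmap = (block >= mean).astype(np.uint8)
    hi = block[bitmap == 1].mean() if bitmap.any() else mean
    lo = block[bitmap == 0].mean() if (bitmap == 0).any() else mean
    return bitmap, lo, hi

def encode_block(block):
    bitmap, lo, hi = ambtc_block(block)
    mean = block.mean()
    # Weber's law condition: intensity variation small relative to background.
    if mean > 0 and (hi - lo) / mean < WEBER_K:
        bitmap = np.zeros_like(bitmap)             # uniform block -> all-zero bitmap
    return bitmap, lo, hi

block = np.array([[120, 121, 119, 120],
                  [121, 120, 120, 119],
                  [119, 120, 121, 120],
                  [120, 119, 120, 121]], dtype=float)
print(encode_block(block))
```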
One of the most important challenges facing the designers of sewerage systems is the corrosion of sewers caused by sewage contaminants, which leads to failure of the main sewer lines. In this study, a reference mix of 1:1.5:3 was used, and 4% Flocrete PC200 by weight of cement was added to the same mixing ratio in a second mixture. Twenty-four samples were tested for each mixture: 12 were used for the compressive strength test at ages of 7, 14, and 28 days, and six samples were submerged, after 28 days of wet curing, in sulfuric acid at concentrations of 5% and 10%. The other six samples were coated with polyurethane after 28 days of wet curing and, after 24 hours, were flooded with a concentration of sulfuric acid.
The quality of groundwater in the Al-Hawija area was assessed using a water quality index. Data on nine physico-chemical parameters from 28 groundwater wells were used to calculate the water quality index (WQI). A heterogeneous water quality was reported: in close proximity to the Lesser Zab River (LZR), the groundwater has low WQI values and is permissible for human consumption due to dilution by fresh water, whereas it deteriorates in areas located far away from the river. The WQI values range from 22 to 336, indicating good to very poor groundwater quality.
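The abstract does not state which WQI formulation was used; the sketch below assumes the common weighted arithmetic index, WQI = sum(w_i * q_i) / sum(w_i). The parameters, standards, and measured values are illustrative assumptions, not the paper's data.

```python
# Sketch: weighted arithmetic water quality index from a few parameters.
standards = {"pH": 8.5, "TDS": 1000.0, "Cl": 250.0, "SO4": 250.0}   # mg/L except pH
measured  = {"pH": 7.8, "TDS": 1450.0, "Cl": 310.0, "SO4": 280.0}

weights = {p: 1.0 / s for p, s in standards.items()}                  # w_i ~ 1/standard
quality = {p: 100.0 * measured[p] / standards[p] for p in standards}  # q_i rating

wqi = sum(weights[p] * quality[p] for p in standards) / sum(weights.values())
print(f"WQI = {wqi:.1f}")   # classification bands (good, poor, ...) vary by study
```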
The current research presents a comparative analysis of the estimation of Meixner process parameters via the wavelet packet transform. Of particular relevance, it compares the moment method and the wavelet packet estimator for the four parameters of the Meixner process. The research focuses on finding the best threshold value using the square root log and modified square root log methods with wavelet packets in the presence of noise, to enhance the efficiency and effectiveness of the denoising process for the financial asset market signal. In this regard, a simulation study compares the performance of moment estimation and wavelet packets for different sample sizes. The results show that the wavelet packet
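A minimal sketch of wavelet packet denoising with the "square root log" (universal) threshold sigma * sqrt(2 ln N) is given below, using PyWavelets. The wavelet, decomposition level, and noisy test signal are illustrative assumptions; the paper's Meixner-process signal and its modified threshold variant are not reproduced here.

```python
# Sketch: wavelet packet denoising with the universal (square root log) threshold.
import numpy as np
import pywt

rng = np.random.default_rng(0)
n = 1024
clean = np.sin(np.linspace(0, 8 * np.pi, n))
noisy = clean + rng.normal(0, 0.2, n)

wp = pywt.WaveletPacket(data=noisy, wavelet="db4", mode="symmetric", maxlevel=4)
leaves = wp.get_level(4, order="natural")

# Estimate the noise level from the level-1 detail coefficients (median/0.6745 rule).
sigma = np.median(np.abs(wp["d"].data)) / 0.6745
lam = sigma * np.sqrt(2 * np.log(n))             # square root log threshold

for node in leaves:
    node.data = pywt.threshold(node.data, lam, mode="soft")

denoised = wp.reconstruct(update=True)[:n]
print("RMSE after denoising:", np.sqrt(np.mean((denoised - clean) ** 2)))
```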
Adsorption techniques are widely used to remove certain classes of pollutants from wastewater, and phenolic compounds represent one of the problematic groups. Na-Y zeolite was synthesized from locally available Iraqi kaolin clay. The prepared zeolite was characterized by XRD and by surface area measurement using N2 adsorption. Both the synthetic Na-Y zeolite and the kaolin clay were tested for adsorption of 4-nitrophenol in batch-mode experiments. Maximum removal efficiencies of 90% and 80% were obtained using the prepared zeolite and the kaolin clay, respectively. Kinetics and equilibrium adsorption isotherms were investigated, and both the Langmuir and Freundlich isotherms fit the experimental data quite well.
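The sketch below shows how Langmuir and Freundlich isotherms can be fitted to batch adsorption data by least squares. The equilibrium concentrations C_e and uptakes q_e are synthetic illustrative values, not the paper's measurements.

```python
# Sketch: fitting Langmuir and Freundlich isotherm parameters to batch data.
import numpy as np
from scipy.optimize import curve_fit

def langmuir(Ce, qmax, KL):
    return qmax * KL * Ce / (1.0 + KL * Ce)

def freundlich(Ce, KF, n):
    return KF * Ce ** (1.0 / n)

Ce = np.array([5.0, 10.0, 20.0, 40.0, 80.0])    # mg/L, assumed
qe = np.array([8.2, 13.5, 19.8, 25.1, 28.4])    # mg/g, assumed

(lq, lk), _ = curve_fit(langmuir, Ce, qe, p0=[30.0, 0.05])
(fk, fn), _ = curve_fit(freundlich, Ce, qe, p0=[3.0, 2.0])
print(f"Langmuir:   qmax={lq:.2f} mg/g, KL={lk:.3f} L/mg")
print(f"Freundlich: KF={fk:.2f}, n={fn:.2f}")
```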
The assessment of data quality from different sources can be considered a key challenge in supporting effective geospatial data integration and in promoting collaboration in mapping projects. This paper presents a methodology for assessing positional and shape quality for authoritative large-scale data, such as Ordnance Survey (OS) UK data and General Directorate for Survey (GDS) Iraq data, and for Volunteered Geographic Information (VGI), such as OpenStreetMap (OSM) data, with the intention of assessing possible integration. It is based on measuring discrepancies among the datasets, addressing positional accuracy and shape fidelity using standard procedures and also directional statistics. Line feature comparison has been undertaken.
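As a minimal sketch of the directional-statistics side of such a comparison, the fragment below computes the circular mean and resultant length of bearing differences between matched line segments in two datasets. The sample bearing differences are illustrative assumptions, not values from the OS, GDS, or OSM data.

```python
# Sketch: circular mean and concentration of bearing differences between
# matched line segments from two geospatial datasets.
import numpy as np

diffs_deg = np.array([2.0, -1.5, 3.2, 0.4, -2.1, 1.8])   # assumed bearing differences

theta = np.deg2rad(diffs_deg)
C, S = np.cos(theta).sum(), np.sin(theta).sum()
mean_dir = np.degrees(np.arctan2(S, C))        # mean angular discrepancy
R = np.hypot(C, S) / len(theta)                # resultant length in [0, 1]

print(f"mean directional discrepancy: {mean_dir:.2f} deg, concentration R = {R:.3f}")
```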