The cost of pile foundations is part of the substructure cost, and reducing it requires studying the available pile types and then selecting the optimal type in terms of cost, production time, and quality. The main objective of this study is therefore to solve the time–cost–quality trade-off (TCQT) problem by finding an optimal pile type, with the target of minimizing cost and time while maximizing quality. Many pile types exist, but in this paper the researcher proposed five, one of which is non-traditional, developed a model of the problem, and then employed the particle swarm optimization (PSO) algorithm, an evolutionary algorithm implemented in MATLAB, as a decision-making tool for choosing the best of the candidate pile types. The paper proposes a multi-objective optimization model that aims to optimize the time, cost, and quality of the pile types and to assist in selecting the most appropriate one. The researcher selected ten senior engineers for interviews and prepared interview questions and an open questionnaire. The individuals were drawn from the private and state sectors, each with ten years or more of experience in pile foundation work. The personal interviews and field survey showed that most of the experts and engineers are not fully aware of new software techniques that could help them choose among alternatives, despite their belief in the usefulness of modern technology and software. Because this is a multi-objective optimization problem, running the PSO algorithm typically yields more than one optimal solution for the five proposed pile types. Finally, the researcher evaluated and discussed the output results and found that the pre-high-tension spun (PHC) pile was the optimal pile type.
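The PSO selection step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation (which was in MATLAB): the five pile-type names, their normalized time/cost/quality scores, and the weighted-sum scalarization of the three objectives are all hypothetical assumptions introduced here.

```python
# Minimal PSO sketch for a pile-type TCQT selection problem.
# All scores and weights below are illustrative, NOT the study's data.
import random

random.seed(1)

# Hypothetical normalized (time, cost, quality) scores for five pile types;
# lower time/cost is better, higher quality is better.
ALTERNATIVES = [
    ("bored cast-in-place", 0.80, 0.70, 0.75),
    ("driven precast RC",   0.70, 0.60, 0.65),
    ("steel H-pile",        0.50, 0.90, 0.70),
    ("CFA pile",            0.55, 0.65, 0.60),
    ("PHC spun pile",       0.40, 0.50, 0.85),
]
W_TIME, W_COST, W_QUALITY = 0.4, 0.4, 0.2  # illustrative objective weights

def fitness(x):
    """Weighted-sum TCQT objective; a continuous position in [0, 5) is
    rounded down to select one of the five discrete alternatives."""
    _, t, c, q = ALTERNATIVES[int(x) % len(ALTERNATIVES)]
    return W_TIME * t + W_COST * c - W_QUALITY * q  # minimize

def pso(n_particles=20, iters=50, w=0.7, c1=1.5, c2=1.5):
    pos = [random.uniform(0, 5) for _ in range(n_particles)]
    vel = [0.0] * n_particles
    pbest = pos[:]                       # personal best positions
    pbest_f = [fitness(p) for p in pos]  # personal best fitness values
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g], pbest_f[g]  # global best
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = random.random(), random.random()
            # standard velocity update: inertia + cognitive + social terms
            vel[i] = (w * vel[i]
                      + c1 * r1 * (pbest[i] - pos[i])
                      + c2 * r2 * (gbest - pos[i]))
            pos[i] = min(max(pos[i] + vel[i], 0.0), 4.999)  # keep in range
            f = fitness(pos[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i], f
                if f < gbest_f:
                    gbest, gbest_f = pos[i], f
    return ALTERNATIVES[int(gbest)][0], gbest_f

best_name, best_f = pso()
print(best_name)  # with these illustrative scores, the PHC spun pile wins
```

With these made-up scores the PHC pile happens to dominate, which merely illustrates the mechanism; the study's actual ranking came from its own field data and model.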
The need to create an optimal water quality management process has motivated researchers to pursue the development of prediction models. One widely used forecasting model is the seasonal autoregressive integrated moving average (SARIMA) model. In the present study, a SARIMA model was developed in the R software to fit a time series of monthly fluoride content collected from six stations on the Tigris River for the period from 2004 to 2014. The adequate SARIMA model, having the least Akaike information criterion (AIC) and mean squared error (MSE), was found to be SARIMA (2, 0, 0)(0, 1, 1). The model parameters were identified and diagnosed to derive the forecasting equations at each selected location. The correlat…
In this paper, we investigate a simple donor–acceptor model of charge transfer formation using quantum transition theory. The transfer parameters that enhance the charge transfer, together with the rate of the charge transfer, have been calculated. We then study the net charge transfer through the interface of Cu/F8 contact devices and evaluate all transfer coefficients. The charge transfer rate is found to be dominant at low orientation free energy and to increase slightly at a decreased interface potential compared with a high interface potential. Increasing the transition energy increases the orientation of Cu to F8. The transfer in the system was more active when the system has a large driving for…
Most Internet of Things (IoT), cell phone, and Radio Frequency Identification (RFID) applications need high speed in the execution and processing of data. This is achieved by reducing the system's energy consumption, latency, and processing time while increasing throughput. Such constraints, however, can weaken the security of these devices and leave them open to attack by malicious programs. Lightweight cryptographic algorithms are among the most suitable methods for securing such IoT applications. Cryptography obfuscates data, removes the ability to capture key information patterns, and ensures that all data transfers are safe, accurate, verified, legal, and non-repudiable. Fortunately, various lightweight encryption algorithms can be used to increase the defense against various at…
Speech is the essential way to interact between humans or between human and machine. However, it is always contaminated with different types of environmental noise. Therefore, speech enhancement algorithms (SEA) have emerged as a significant approach in the speech processing field for suppressing background noise and recovering the original speech signal. In this paper, a new efficient two-stage SEA with low distortion is proposed, based on the minimum mean square error sense. The clean signal is estimated by exploiting Laplacian modeling of speech and noise based on the distribution of orthogonal transform (discrete Krawtchouk–Tchebichef transform) coefficients. The Discrete Kra…
For several applications, it is very important to have an edge detection technique that matches human visual contour perception and is less sensitive to noise. The edge detection algorithm described in this paper is based on the results obtained by maximum a posteriori (MAP) and maximum entropy (ME) deblurring algorithms. The technique makes a trade-off between sharpening and smoothing the noisy image. One advantage of the described algorithm is that it is less sensitive to noise than the Marr and Geuen techniques, which are considered the best edge detection algorithms in terms of matching human visual contour perception.
The centers of cities and historical quarters face a severe threat to the values of the physical and legal urban environment as a result of value deterioration and the emergence and spread of new values in the intellectual and urban context. This causes the urban environment to lose its spatio-temporal continuity, flexibility, and adaptability, leading to urban obsolescence. Hence the research problem lies in "the lack of comprehensive studies on the phenomenon of urban obsolescence and its impact on the decline in the values of the quality of the built environment in historic…