Project delays are among the most persistent problems confronting the construction industry, owing to the sector's complexity and the interdependence of its underlying delay-risk factors. Machine learning offers a well-suited set of techniques for attacking such complex systems. This study aimed to identify and develop a well-organized predictive data tool that examines and learns from delay sources, based on historical data of construction projects, using decision tree and naïve Bayesian classification algorithms. An intensive review of the available data was conducted to explore the real reasons and causes of construction project delays. The results show that the postpo
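As an illustration of the modeling step described above, the following is a minimal sketch of training and comparing a decision tree and a naïve Bayes classifier on historical delay data; the file name, column names, and preprocessing are hypothetical assumptions, not the study's actual dataset or tool.

```python
# A minimal sketch of a delay-cause classifier of the kind the study describes,
# assuming a hypothetical tabular file "delays.csv" with numeric delay-factor
# columns and a binary "delayed" label; names are illustrative only.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import accuracy_score

df = pd.read_csv("delays.csv")              # hypothetical historical project data
X = df.drop(columns=["delayed"])            # delay-factor features (assumed numeric)
y = df["delayed"]                           # 1 = project delayed, 0 = on time

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

for name, model in [("decision tree", DecisionTreeClassifier(max_depth=5, random_state=0)),
                    ("naive Bayes", GaussianNB())]:
    model.fit(X_train, y_train)
    print(name, "accuracy:", accuracy_score(y_test, model.predict(X_test)))
```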
In this study, a fast block-matching search algorithm based on block descriptors and multilevel block filtering is introduced. The descriptors used are the mean and a set of centralized low-order moments. Hierarchical filtering and the MAE similarity measure were adopted to nominate the most similar blocks lying within the pool of neighboring blocks. As a next step after block nomination, the similarity of the mean and moments is used to classify the nominated blocks into one of three sub-pools, each representing a certain nomination priority level (i.e., most, less, and least). The main reason for introducing the nomination and classification steps is a significant reduction in the number of matching instances of the pixels belonging to the c
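The following is a minimal sketch of the descriptor-based nomination idea: candidate blocks are first filtered by closeness of their mean and low-order central moments, and only the survivors are compared with the full MAE measure. The thresholds, block size, and moment orders are illustrative assumptions, not the paper's settings.

```python
# Sketch of descriptor filtering for block matching (illustrative parameters).
import numpy as np

def descriptors(block):
    """Mean plus centralized low-order moments (2nd and 3rd) of a block."""
    m = block.mean()
    c = block - m
    return np.array([m, (c**2).mean(), (c**3).mean()])

def mae(a, b):
    """Mean absolute error between two equal-sized blocks."""
    return np.abs(a.astype(float) - b.astype(float)).mean()

def match(target, candidates, t_mean=4.0, t_moment=8.0):
    """Nominate candidates whose mean and moments are close, then pick the MAE-best."""
    d_t = descriptors(target)
    nominated = []
    for c in candidates:
        d_c = descriptors(c)
        if abs(d_c[0] - d_t[0]) < t_mean and np.abs(d_c[1:] - d_t[1:]).max() < t_moment:
            nominated.append(c)
    pool = nominated if nominated else candidates   # fall back if the filter empties the pool
    return min(pool, key=lambda c: mae(target, c))

# Example: best match for a random 8x8 block among random candidate blocks.
rng = np.random.default_rng(0)
target = rng.integers(0, 256, (8, 8))
candidates = [rng.integers(0, 256, (8, 8)) for _ in range(64)]
best = match(target, candidates)
```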
The Non-Homogeneous Poisson process is a statistical subject of importance to other sciences, with wide application in areas such as queueing (waiting-line) systems, repairable systems, computer and communication systems, reliability theory, and many others; it is also used to model phenomena whose rate of occurrence varies over time (events that change with time).
This research deals with some of the basic concepts related to the Non-Homogeneous Poisson process. Two models of the Non-Homogeneous Poisson process were carried out, the power-law model and the Musa-Okumoto model, to estimate th
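For reference, the two models named above are usually written with the following intensity and mean-value functions (standard notation, which may differ slightly from the paper's):

```latex
\begin{align}
\text{Power law:} \quad \lambda(t) &= \alpha\beta t^{\beta-1}, &
m(t) &= \alpha t^{\beta},\\
\text{Musa--Okumoto:} \quad \lambda(t) &= \frac{\lambda_0}{1+\lambda_0\theta t}, &
m(t) &= \frac{1}{\theta}\ln\!\left(1+\lambda_0\theta t\right).
\end{align}
```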
Bayesian estimation of reliability for the stress (Y) – strength (X) model, which describes the life of a component with strength X under stress Y (the component fails if and only if, at any time, the applied stress exceeds its strength), has been studied; the reliability R = P(Y < X) can then be considered a measure of the component's performance. In this paper, a Bayesian analysis is considered for R when the two variables X and Y are independent Weibull random variables with common parameter α, in order to study the effect of each of the two different scale parameters β and λ, respectively, using three different loss functions [weighted, quadratic, and entropy] under two different prior functions [Gamma and extension of Jeffery
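For context, under the rate parameterization f(x; α, β) = αβx^(α-1)e^(-βx^α) (an assumption; the paper's parameterization may differ), the stress-strength reliability has a closed form:

```latex
\begin{equation}
R = P(Y<X)
  = \int_{0}^{\infty}\bigl(1-e^{-\lambda x^{\alpha}}\bigr)\,\alpha\beta x^{\alpha-1}e^{-\beta x^{\alpha}}\,dx
  = \frac{\lambda}{\beta+\lambda}.
\end{equation}
```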
The research aims to describe internal debt options during shocks and the impact of this borrowing on the volume of foreign reserves, using induction and deduction together with analysis of the available data. During the period (2004-2013) there was no need to borrow through (financial institutions, discounted transfers, bonds); it was sufficient to rely on transfers with commercial banks, which can finance temporary budget deficits: the volume of foreign reserves rose and declined according to changes in oil prices and in the volume of purchases and sales of the Central Bank of Iraq. The Central Bank of Iraq (CBI) has contributed significantly to internal debt through bonds and discounted transfers in the secondary market, thus funding the
In this paper, a compact genetic algorithm (CGA) is enhanced by integrating its selection strategy with a steepest descent algorithm (SDA) as a local search method, giving I-CGA-SDA. This system is an attempt to avoid the large CPU time and computational complexity of the standard genetic algorithm. Here, the CGA dramatically reduces the number of bits required to store the population and converges faster. Consequently, the integrated system is used to optimize the maximum likelihood function lnL(φ1, θ1) of the mixed model. Simulation results based on MSE were compared with those obtained from the SDA and showed that the hybrid genetic algorithm (HGA) and I-CGA-SDA can give a good estimator of (φ1, θ1) for the ARMA(1,1) model. Anot
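The compact-GA-with-local-search idea can be sketched as follows: a probability vector replaces the stored population, two candidates are sampled and compete, the probabilities shift toward the winner, and a few steepest-descent steps refine the decoded parameters. The toy quadratic objective below stands in for -lnL(φ1, θ1); it is not the ARMA(1,1) likelihood, and all tuning constants are illustrative.

```python
# Illustrative compact GA + steepest-descent local search (not the paper's I-CGA-SDA).
import numpy as np

rng = np.random.default_rng(0)
BITS_PER_PARAM, N_VIRTUAL = 16, 50        # bits per parameter, virtual population size

def decode(bits):
    """Map a 32-bit string to (phi1, theta1) in (-1, 1)."""
    half = len(bits) // 2
    to_real = lambda b: (int("".join(map(str, b)), 2) / (2**half - 1)) * 2 - 1
    return np.array([to_real(bits[:half]), to_real(bits[half:])])

def objective(p):
    """Toy stand-in for -lnL(phi1, theta1); NOT the ARMA(1,1) likelihood."""
    return (p[0] - 0.4) ** 2 + (p[1] + 0.3) ** 2

def steepest_descent(p, lr=0.1, steps=5, h=1e-4):
    """A few steepest-descent steps using a central-difference gradient."""
    for _ in range(steps):
        grad = np.array([(objective(p + e) - objective(p - e)) / (2 * h)
                         for e in np.eye(len(p)) * h])
        p = p - lr * grad
    return p

prob = np.full(2 * BITS_PER_PARAM, 0.5)    # compact GA: one probability per bit
best_p, best_f = None, np.inf
for _ in range(1000):
    a = (rng.random(prob.size) < prob).astype(int)   # sample two virtual individuals
    b = (rng.random(prob.size) < prob).astype(int)
    pa, pb = steepest_descent(decode(a)), steepest_descent(decode(b))
    fa, fb = objective(pa), objective(pb)
    winner, loser = (a, b) if fa < fb else (b, a)
    prob += (winner - loser) / N_VIRTUAL             # shift probabilities toward the winner
    prob = prob.clip(1 / N_VIRTUAL, 1 - 1 / N_VIRTUAL)
    if min(fa, fb) < best_f:
        best_p, best_f = (pa, fa) if fa < fb else (pb, fb)

print("estimated (phi1, theta1):", best_p)
```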
The aim of the current study was to develop a nanostructured double-layer delivery system for hydrophobic molecules. The developed double layer consisted of a polyethylene glycol (PEG)-based polymeric coat followed by a gelatin sub-coating of the core hydrophobic molecules containing sodium citrate. The polymeric composition ratio of PEG and the amount of sub-coating gelatin were optimized using the two-level fractional method. The nanoparticles were characterized using AFM and FT-IR techniques. The size of these nanocapsules was in the range of 39-76 nm, depending on the drug loading concentration. The drug was effectively loaded into the PEG-gelatin nanoparticles (≈47%). The hydrophobic-molecule release characteristics in terms of controlled-releas
The presented work describes a preliminary analytic method for estimating load and pressure distributions on low-speed wings with flow separation and wake rollup phenomena. A higher-order vortex panel method is coupled with numerical lifting-line theory through an iterative procedure that includes models of separation and wake rollup. The computer programs are written in FORTRAN and are stable and efficient.
The capability of the present method is investigated through a number of test cases with different types of wing sections (NACA 0012 and GA(W)-1) at different aspect ratios and angles of attack; the results include the lift and drag curves, and the lift and pressure distributions along the wing s
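The classical lifting-line relation underlying the coupling is recalled below (in the present method the sectional aerodynamic data presumably come from the vortex panel solution rather than thin-airfoil theory); here b is the span, c(y) the local chord, Γ(y) the bound circulation, and V∞ the freestream speed:

```latex
\begin{equation}
\alpha(y_0) = \frac{\Gamma(y_0)}{\pi V_\infty c(y_0)} + \alpha_{L=0}(y_0)
+ \frac{1}{4\pi V_\infty}\int_{-b/2}^{b/2}\frac{d\Gamma/dy}{y_0-y}\,dy .
\end{equation}
```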
Many numerical approaches have been suggested for solving nonlinear problems. In this paper, we propose a new two-step iterative method for solving nonlinear equations. The method has cubic convergence. Several numerical examples illustrating its efficiency, by comparison with other similar methods, are given.
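Since the abstract does not spell out its scheme, the sketch below shows a well-known two-step Newton variant with cubic convergence (the Potra-Pták/Traub form), purely to illustrate what such a method looks like; it is not claimed to be the paper's method.

```python
# Two-step, cubically convergent Newton variant (illustrative, not the paper's scheme):
#   y_n = x_n - f(x_n)/f'(x_n),   x_{n+1} = y_n - f(y_n)/f'(x_n).
def two_step(f, df, x0, tol=1e-12, max_iter=50):
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break
        d = df(x)
        y = x - fx / d          # predictor: one Newton step
        x = y - f(y) / d        # corrector: reuse f'(x) at the old point
    return x

# Example: root of x^3 - 2x - 5 near x = 2 (the classical Wallis test equation).
root = two_step(lambda x: x**3 - 2*x - 5, lambda x: 3*x**2 - 2, 2.0)
print(root)   # approximately 2.0945514815
```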
This paper reports a fiber Bragg grating (FBG) used as a biosensor. The FBGs were etched using a chemical agent, namely hydrofluoric acid (HF), which removes part of the cladding layer. Consequently, the evanescent field propagating out of the core lies closer to the environment and becomes more sensitive to changes in the surroundings. The proposed FBG sensor was utilized to detect toxic heavy metal ions in an aqueous medium, namely copper ions (Cu2+). Two FBG sensors, etched to diameters of 20 and 40 μm, were fabricated. The sensors were studied with Cu2+ at different concentrations using the wavelength shift resulting from the interaction between the evanescent field and the copper ions. The FBG sensors showed
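The underlying sensing principle is the standard Bragg condition: binding of Cu2+ near the etched cladding changes the effective index n_eff, which shifts the reflected wavelength (Λ is the grating period):

```latex
\begin{equation}
\lambda_B = 2\,n_{\mathrm{eff}}\,\Lambda, \qquad
\Delta\lambda_B \approx 2\,\Lambda\,\Delta n_{\mathrm{eff}} .
\end{equation}
```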