Risk identification and assessment can be analysed using many risk management tools. The Fishbone diagram is one technique that can be employed to identify the causes behind construction failures, a phenomenon often repeated across projects. If these failures are not understood and handled scientifically, they may lead to disputes between the project parties. Construction failure also increases the project budget, which in turn delays project completion. Punching shear in a reinforced slab may be one of the reasons for construction failure; however, there are many doubts about the other causes that lead to this failure and about the role each cause plays. Some of these causes fall on the designer and others on the contractor. Thus, this research aims to determine the causes of punching shear failure in concrete slabs and their role in the failure using a logical managerial analysis. For this purpose, the applicability of the Fishbone diagram has been extended to analyse both the probability and the impact of the punching shear risk, thus elucidating the risk score of each category without ignoring the global risk. Interviews and questionnaires were conducted with numerous experts specialising in both the design and execution of construction projects to identify the most important causes of punching shear failure. The Fishbone diagram of the punching shear risk revealed that the impact of some primary and secondary causes, such as planning, designing, and maintenance, is greater than expected. Therefore, attention should be concentrated on these areas by adopting an appropriate risk response plan to prevent or mitigate these risks.
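The risk scoring described above can be sketched as a simple computation: each Fishbone category gets a score of probability times impact, and a global risk is aggregated across categories. The category names and the 1-5 probability/impact values below are illustrative assumptions, not data from the study.

```python
def risk_score(probability, impact):
    """Simple risk score: probability x impact, each on a 1-5 scale."""
    return probability * impact

# (category, probability, impact) for a few hypothetical primary causes
categories = [
    ("planning",    4, 5),
    ("designing",   4, 4),
    ("maintenance", 3, 4),
    ("execution",   2, 3),
]

scores = {name: risk_score(p, i) for name, p, i in categories}
global_risk = sum(scores.values()) / len(scores)  # mean category score

for name, s in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name:12s} risk score = {s}")
print(f"global risk = {global_risk}")
```

Ranking the categories by score, as above, is what lets a risk response plan concentrate on the highest-impact branches of the diagram.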
In this research, the Artificial Neural Network (ANN) technique was applied to study the filtration process in water treatment. Eight models were developed and tested using data from a pilot filtration plant operating under different process design criteria: influent turbidity, bed depth, grain size, filtration rate, and running time (length of the filtration run), while recording effluent turbidity and head losses. The ANN models were constructed to predict different performance criteria of the filtration process: effluent turbidity, head losses, and running time. The results indicate that it is quite possible to use artificial neural networks to predict effluent turbidity, head losses, and running time in the filtration process.
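A minimal sketch of the kind of ANN regression described above: a one-hidden-layer network trained by stochastic gradient descent to predict a target from two scaled inputs. The data here are synthetic (an assumed smooth relationship standing in for effluent turbidity as a function of, say, influent turbidity and filtration rate), not the pilot-plant measurements.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Synthetic, assumed relationship: target as a smooth function of two
# inputs scaled to [0, 1] (a stand-in for the real plant data).
xs = [[random.random(), random.random()] for _ in range(40)]
data = [(x, 0.6 * x[0] + 0.3 * x[1]) for x in xs]

H = 3  # hidden units
w1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-1, 1) for _ in range(H)]
b2 = 0.0

def forward(x):
    h = [sigmoid(sum(wi * xi for wi, xi in zip(w1[j], x)) + b1[j])
         for j in range(H)]
    return h, sum(w2[j] * h[j] for j in range(H)) + b2

def mse():
    return sum((forward(x)[1] - t) ** 2 for x, t in data) / len(data)

lr = 0.05
loss_before = mse()
for _ in range(500):           # epochs of per-sample gradient descent
    for x, t in data:
        h, y = forward(x)
        err = y - t
        for j in range(H):
            grad_h = err * w2[j] * h[j] * (1.0 - h[j])  # uses pre-update w2[j]
            w2[j] -= lr * err * h[j]
            for i in range(2):
                w1[j][i] -= lr * grad_h * x[i]
            b1[j] -= lr * grad_h
        b2 -= lr * err
loss_after = mse()
print(loss_before, "->", loss_after)
```

In practice the study's eight models would differ in which of the five design criteria they take as inputs and which performance criterion they predict.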
This investigation was carried out to study the treatment and recycling of wastewater in the cotton textile industry for an effluent containing three dyes: direct blue, sulphur black, and vat yellow. The reuse of such effluent can only be made possible by an appropriate treatment method such as chemical coagulation. Ferrous and ferric sulphate, with and without calcium hydroxide, were employed in this study as the chemical coagulants.
The results showed that the percentage removal of direct blue ranged between 91.4% and 94%, for sulphur black between 98.7% and 99.5%, while for vat yellow it was between 97% and 99%.
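Percentage removal is the standard measure behind these figures: the drop in dye concentration across treatment, relative to the influent concentration. The concentrations below are hypothetical values chosen only to land inside the reported ranges.

```python
def percent_removal(influent_mg_l, effluent_mg_l):
    """Removal efficiency of a dye after coagulation, in percent."""
    return 100.0 * (influent_mg_l - effluent_mg_l) / influent_mg_l

# Hypothetical influent/effluent concentrations (mg/L), for illustration only.
print(round(percent_removal(50.0, 3.0), 1))   # direct blue   -> 94.0
print(round(percent_removal(40.0, 0.4), 1))   # sulphur black -> 99.0
```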
In this paper, a new method has been investigated using evolutionary algorithms (EAs) to cryptanalyse one of the nonlinear stream cipher cryptosystems that depends on the Linear Feedback Shift Register (LFSR) unit, using a ciphertext-only attack. The Genetic Algorithm (GA) and Ant Colony Optimization (ACO) are used to attack one of the nonlinear cryptosystems, called the "shrinking generator", using different lengths of ciphertext and different lengths of combined LFSRs. GA and ACO proved their good performance in finding the initial values of the combined LFSRs. This work can be considered a warning for stream cipher designers to avoid these weak points.
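The components of the attack can be sketched as follows: a Fibonacci LFSR, the shrinking generator that combines two LFSRs, and a fitness function of the kind a GA or ACO search would maximise, scoring a candidate pair of initial states by how much of the observed keystream it reproduces. The register lengths, tap positions, and states below are toy assumptions, not the parameters used in the paper.

```python
def lfsr(init_state, taps, n):
    """Fibonacci LFSR: output the last stage each step, feed back the XOR
    of the tapped stages. init_state and output are lists of 0/1 bits."""
    state = list(init_state)
    out = []
    for _ in range(n):
        out.append(state[-1])
        fb = 0
        for t in taps:
            fb ^= state[t]
        state = [fb] + state[:-1]
    return out

def shrink(a_bits, s_bits):
    """Shrinking generator: keep A's bit only where the selector S is 1."""
    return [a for a, s in zip(a_bits, s_bits) if s == 1]

def fitness(cand_a, cand_s, taps_a, taps_s, keystream):
    """GA/ACO fitness: fraction of observed keystream bits reproduced by a
    candidate pair of initial LFSR states."""
    a = lfsr(cand_a, taps_a, 4 * len(keystream))
    s = lfsr(cand_s, taps_s, 4 * len(keystream))
    z = shrink(a, s)[:len(keystream)]
    hits = sum(1 for x, y in zip(z, keystream) if x == y)
    return hits / len(z) if z else 0.0

# Toy example: the true initial states score a perfect fitness of 1.0.
true_a, true_s, taps = [1, 0, 1], [1, 1, 0], [0, 2]
keystream = shrink(lfsr(true_a, taps, 40), lfsr(true_s, taps, 40))[:10]
print(fitness(true_a, true_s, taps, taps, keystream))  # -> 1.0
```

A GA would evolve a population of candidate state pairs toward fitness 1.0; ACO would bias its pheromone trails toward bit assignments that raise the same score.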
Over the past few years, ear biometrics has attracted a lot of attention. It is a trusted biometric for the identification and recognition of humans due to its consistent shape and rich texture variation. The ear presents an attractive solution since it is visible, ear images are easily captured, and the ear structure remains relatively stable over time. In this paper, a comprehensive review of prior research was conducted to establish the efficacy of utilizing ear features for individual identification through both manually crafted features and deep-learning approaches. The objective of this model is to present the accuracy rate of person identification systems based on either manually crafted features or deep-learning approaches.
The area of character recognition has received considerable attention from researchers all over the world during the last three decades. This research explores the best sets of feature extraction techniques and studies the accuracy of well-known classifiers for Arabic numerals using statistical methods, making a comparative study between two of them. The first method, a linear discriminant function, yields results with accuracy as high as 90% of original grouped cases correctly classified. The second method applies a proposed algorithm; the results show its efficiency, achieving recognition accuracies of 92.9% and 91.4%, which is more efficient than the first method.
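A minimal sketch of the linear discriminant idea mentioned above: for two classes, Fisher's discriminant computes a projection direction from the pooled within-class scatter and classifies by a midpoint threshold. The 2-D feature vectors below are toy stand-ins for extracted numeral features, not the study's data.

```python
def fisher_lda(X0, X1):
    """Two-class Fisher linear discriminant in 2-D:
    w = Sw^-1 (m1 - m0); classify by which side of the midpoint
    threshold the projection w.x falls on."""
    def mean(X):
        return [sum(r[j] for r in X) / len(X) for j in range(2)]
    m0, m1 = mean(X0), mean(X1)
    # Pooled within-class scatter matrix Sw (2x2).
    Sw = [[0.0, 0.0], [0.0, 0.0]]
    for X, m in ((X0, m0), (X1, m1)):
        for r in X:
            d = [r[0] - m[0], r[1] - m[1]]
            for i in range(2):
                for j in range(2):
                    Sw[i][j] += d[i] * d[j]
    det = Sw[0][0] * Sw[1][1] - Sw[0][1] * Sw[1][0]
    inv = [[Sw[1][1] / det, -Sw[0][1] / det],
           [-Sw[1][0] / det, Sw[0][0] / det]]
    dm = [m1[0] - m0[0], m1[1] - m0[1]]
    w = [inv[0][0] * dm[0] + inv[0][1] * dm[1],
         inv[1][0] * dm[0] + inv[1][1] * dm[1]]
    c = (w[0] * (m0[0] + m1[0]) + w[1] * (m0[1] + m1[1])) / 2.0
    return lambda x: 1 if w[0] * x[0] + w[1] * x[1] > c else 0

# Toy 2-D features for two well-separated classes.
X0 = [[1, 2], [2, 1], [1, 1], [2, 2]]
X1 = [[6, 7], [7, 6], [6, 6], [7, 7]]
classify = fisher_lda(X0, X1)
print(classify([1.5, 1.0]), classify([6.5, 7.0]))  # -> 0 1
```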
Sound forecasts are essential elements of planning, especially for dealing with seasonality, sudden changes in demand levels, strikes, large fluctuations in the economy, and price-cutting manoeuvres by competitors. Forecasting can help decision makers manage these problems by identifying which technologies are appropriate for their needs. The proposed forecasting model extracts the trend and cyclical components individually by developing the Hodrick–Prescott filter technique. Then, fitted models of these two real components are estimated to predict the future behaviour of the electricity peak load. Accordingly, the optimal model for the periodic component is estimated using spectrum analysis and a Fourier model.
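The Hodrick–Prescott decomposition step can be sketched directly: the trend t minimises sum (y - t)^2 + lam * sum (second differences of t)^2, which reduces to solving the linear system (I + lam * D'D) t = y. The short peak-load series and the smoothing parameter below are assumed values for illustration; libraries such as statsmodels provide this filter ready-made.

```python
def hp_filter(y, lam=1600.0):
    """Hodrick-Prescott filter: split y into (trend, cycle) by solving
    (I + lam * D'D) t = y, where D is the second-difference matrix.
    Uses naive Gaussian elimination, adequate for short series."""
    n = len(y)
    A = [[float(i == j) for j in range(n)] for i in range(n)]
    for r in range(n - 2):               # add lam * D'D row by row
        d = [0.0] * n
        d[r], d[r + 1], d[r + 2] = 1.0, -2.0, 1.0
        for i in range(n):
            for j in range(n):
                A[i][j] += lam * d[i] * d[j]
    b = [float(v) for v in y]
    for col in range(n):                 # elimination with partial pivoting
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for j in range(col, n):
                A[r][j] -= f * A[col][j]
            b[r] -= f * b[col]
    t = [0.0] * n
    for i in range(n - 1, -1, -1):       # back-substitution
        t[i] = (b[i] - sum(A[i][j] * t[j] for j in range(i + 1, n))) / A[i][i]
    return t, [yi - ti for yi, ti in zip(y, t)]

# Hypothetical monthly peak-load series: upward trend plus a zigzag cycle.
y = [10, 13, 11, 14, 12, 15, 13, 16, 14, 17, 15, 18]
trend, cycle = hp_filter(y, lam=100.0)
print([round(t, 2) for t in trend])
```

The extracted trend and cycle would then each be modelled separately, the periodic part with spectrum analysis and a Fourier model as described above.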
Smart water flooding (low salinity water flooding) has mainly been investigated in sandstone reservoirs. The main reasons for using low salinity water flooding are to improve oil recovery and to support the reservoir pressure.
In this study, two sandstone core plugs with different permeabilities, taken from southern Iraq, were used to explain the effect of water injection with different ion concentrations on oil recovery. The water types used were formation water, seawater, modified low salinity water, and deionized water.
The effects of water salinity, the flow rate of the injected water, and the permeability of the core plugs were studied in order to determine the best conditions for low salinity water flooding.
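The usual metric for comparing such core floods is the recovery factor: oil produced as a fraction of the oil initially in place in the plug. The volumes below are hypothetical, for illustration only.

```python
def recovery_factor(oil_produced_cc, oil_in_place_cc):
    """Oil recovery factor of a core flood, as a percentage of the
    oil initially in place (OOIP)."""
    return 100.0 * oil_produced_cc / oil_in_place_cc

# Hypothetical volumes (cc) for one flood, chosen only for illustration.
print(recovery_factor(3.2, 5.0))  # -> 64.0
```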
In this paper, we used four classification methods to classify objects and compared among them: K Nearest Neighbors (KNN), Stochastic Gradient Descent learning (SGD), Logistic Regression (LR), and Multi-Layer Perceptron (MLP). We used the MCOCO dataset for classifying and detecting objects; the dataset images were randomly divided into training and testing sets at a ratio of 7:3. The randomly selected training and testing images were converted from colour to grey level, enhanced using the histogram equalization method, and resized to 20 x 20. Principal component analysis (PCA) was used for feature extraction, and finally the four classification methods were applied.
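The tail of the pipeline above (PCA for feature extraction, then a classifier) can be sketched compactly: project features onto the top principal component, then classify with k-NN. The 2-D "feature vectors" below are toy stand-ins for the preprocessed 20 x 20 images, and only one of the four classifiers (KNN) is shown.

```python
import math

def pca_first_component(X, iters=100):
    """Top principal component via power iteration on the covariance
    matrix; enough for a 1-D projection in this toy sketch."""
    n, d = len(X), len(X[0])
    mean = [sum(row[j] for row in X) / n for j in range(d)]
    Xc = [[row[j] - mean[j] for j in range(d)] for row in X]
    cov = [[sum(Xc[k][i] * Xc[k][j] for k in range(n)) / n
            for j in range(d)] for i in range(d)]
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(d)) for i in range(d)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return mean, v

def project(mean, v, row):
    """Scalar projection of one feature vector onto the component."""
    return sum((row[j] - mean[j]) * v[j] for j in range(len(row)))

def knn_predict(train, x, k=3):
    """k-NN on the 1-D projected features, by majority vote."""
    nearest = sorted(train, key=lambda p: abs(p[0] - x))[:k]
    votes = [label for _, label in nearest]
    return max(set(votes), key=votes.count)

# Toy 2-D features standing in for flattened, preprocessed images.
X = [[0, 0], [1, 0], [0, 1], [5, 5], [6, 5], [5, 6]]
labels = [0, 0, 0, 1, 1, 1]
mean, v = pca_first_component(X)
train = [(project(mean, v, row), lab) for row, lab in zip(X, labels)]
print(knn_predict(train, project(mean, v, [0.5, 0.5])))  # expect class 0
print(knn_predict(train, project(mean, v, [5.5, 5.5])))  # expect class 1
```

SGD, LR, and MLP classifiers would consume the same PCA features; only the decision rule after projection changes.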