Peak ground acceleration (PGA) is one of the critical factors that affect the determination of earthquake intensity. PGA is generally used to describe ground motion in a particular zone and can efficiently predict the site ground-motion parameters needed for the design of engineering structures. Therefore, novel models that utilize the particle swarm optimization (PSO) approach are developed to forecast PGA for the Iraqi database. A data set of 187 historical ground-motion recordings in Iraq's tectonic regions was used to build the explicit proposed models. The proposed PGA models relate to different seismic parameters, including the earthquake magnitude (Mw), average shear-wave velocity (VS30), focal depth (FD), and epicentral distance (REPi) to the nearest seismic station. The derived PGA models are remarkably simple and straightforward and can be used reliably for pre-design purposes. The proposed PGA models (i.e., models I and II), obtained via explicit formulas produced using the PSO method, are highly correlated with the actual PGA records, with low coefficients of variation (CoV) of approximately 2.12% and 2.06% and mean values close to 1.0 (approximately 1.005 and 1.004). Lastly, a high frequency of low absolute relative error (ARE) values, below 5%, is recorded for the proposed models, showing an acceptable error distribution.
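As a rough illustration of how PSO can calibrate an explicit PGA formula against recorded motions, the sketch below fits the coefficients of an assumed attenuation form in Mw, REPi, VS30, and FD to synthetic records. The functional form, coefficient ranges, PSO constants, and data are illustrative assumptions, not the paper's actual models or the 187-record Iraqi data set.

```python
import numpy as np

rng = np.random.default_rng(0)

def pga_model(coef, Mw, Repi, Vs30, FD):
    # Illustrative attenuation form only; the paper derives its own explicit formula.
    c1, c2, c3, c4, c5 = coef
    return np.exp(c1 + c2 * Mw - c3 * np.log(Repi + 10.0)
                  - c4 * np.log(Vs30) - c5 * FD)

def cost(coef, Mw, Repi, Vs30, FD, pga_obs):
    # Mean squared error in log space between predicted and recorded PGA.
    pred = pga_model(coef, Mw, Repi, Vs30, FD)
    return np.mean((np.log(pred) - np.log(pga_obs)) ** 2)

def pso_fit(data, n_particles=30, n_iter=200, w=0.7, c_p=1.5, c_g=1.5):
    dim = 5
    pos = rng.uniform(-2, 2, (n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_cost = np.array([cost(p, *data) for p in pos])
    gbest = pbest[pbest_cost.argmin()].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, dim))
        vel = w * vel + c_p * r1 * (pbest - pos) + c_g * r2 * (gbest - pos)
        pos = pos + vel
        costs = np.array([cost(p, *data) for p in pos])
        improved = costs < pbest_cost
        pbest[improved] = pos[improved]
        pbest_cost[improved] = costs[improved]
        gbest = pbest[pbest_cost.argmin()].copy()
    return gbest

# Synthetic stand-in for the recorded data set (not the real Iraqi records).
n = 187
Mw = rng.uniform(4.0, 7.0, n)
Repi = rng.uniform(10, 300, n)
Vs30 = rng.uniform(200, 800, n)
FD = rng.uniform(5, 40, n)
pga_obs = pga_model([-1.0, 0.9, 1.1, 0.4, 0.01], Mw, Repi, Vs30, FD) * np.exp(rng.normal(0, 0.2, n))

coef = pso_fit((Mw, Repi, Vs30, FD, pga_obs))
print("fitted coefficients:", np.round(coef, 3))
```

A library such as pyswarms could replace the hand-rolled update loop; the inertia and acceleration constants (w, c_p, c_g) used here are typical textbook values.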
The Bouguer gravity and magnetic RTP anomaly data were used to detect the main tectonic boundaries of the middle and south of Diyala Province, east Iraq. The window method was used to separate the residual anomalies, using different spatial windows for the Bouguer and magnetic RTP maps. The residual anomalies were processed to reduce noise and give a more comprehensive picture of subsurface lineament structures. The results of the descriptive interpretation are presented as contour maps in order to locate the directions and extensions of lineament features, which may be interpreted as faults. The gradient technique is used for depth estimation of some gravity sources, which shows that the source depths range between (13.65 …
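A minimal sketch of the moving-window regional-residual separation idea follows: the regional field is approximated by averaging over a square window, and the residual is what remains after subtracting it. The gridded field and window sizes are assumptions for illustration, not the study's actual Bouguer or RTP grids.

```python
import numpy as np
from scipy.ndimage import uniform_filter

# Illustrative gridded potential-field map (a smooth random surface),
# standing in for the Bouguer gravity or RTP magnetic grid.
bouguer = np.random.default_rng(1).normal(size=(200, 200)).cumsum(0).cumsum(1)

def window_separation(field, window_cells):
    """Moving-average (window) regional-residual separation."""
    regional = uniform_filter(field, size=window_cells, mode="nearest")
    residual = field - regional
    return regional, residual

# Trying several window sizes, as in the study, emphasises anomalies of
# different wavelengths (and hence different source depths).
for win in (5, 11, 21):
    regional, residual = window_separation(bouguer, win)
    print(f"window {win}: residual std {residual.std():.2f}")
```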
A 3D velocity model was created using the stacking velocities of 9 seismic lines and the average velocities of 6 wells drilled in Iraq. The model was built by creating a time model of 25 surfaces, with an interval of about 100 msec between each two successive surfaces. The total time of all surfaces reached about 2400 msec, adopted according to the West Kifl-1 well, which penetrated to a depth of 6000 m and represents the deepest well in the study area. The seismic line and well data were converted to build a 3D time-cube model, and the velocity was distributed over the model. Seismic inversion modeling of the elastic properties of the horizon and well data was applied to achieve a corrected veloci…
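As a small aside on the time-depth relation implied above (about 2400 msec of two-way time reaching roughly the 6000 m depth of West Kifl-1), the sketch below converts two-way travel time to depth with an average velocity; the velocity value is an illustrative assumption, not one taken from the wells.

```python
import numpy as np

def twt_to_depth(twt_ms, v_avg_ms):
    """Convert two-way travel time (ms) to depth (m) using an average velocity (m/s):
    depth = v_avg * (TWT / 2), with TWT converted from milliseconds to seconds."""
    return v_avg_ms * (twt_ms / 1000.0) / 2.0

# 25 surfaces spaced ~100 ms apart down to ~2400 ms, as described above.
twt_surfaces = np.arange(100, 2500, 100)
v_avg = 5000.0  # illustrative average velocity (m/s), not a measured value
print(twt_to_depth(twt_surfaces[-1], v_avg))  # ~6000 m at 2400 ms
```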
In this paper, the problem of resource allocation at Al-Raji Company for soft drinks and juices was studied. The company carries out several types of tasks to produce juices and soft drinks, and these tasks require machines: the company has 6 machines to be allocated to 4 different tasks. The machines assigned to each task are subject to failure and are repaired so that they can participate again in the production process. From the company's past records, the probability of machine failure at each task was calculated. The time required for each machine to complete each task was also recorded. The aim of this paper is to determine the minimum expected ti…
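A minimal sketch of one way to pose this as an assignment problem follows: build an expected-time cost matrix and solve it with the Hungarian method. The data are random placeholders, and inflating each time by 1/(1 − p_fail) is an assumption about how failure enters the expected time, not necessarily the paper's formulation.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(2)

# Illustrative data: completion times (hours) and failure probabilities for
# 6 machines x 4 tasks (not the company's actual records).
time = rng.uniform(2, 10, size=(6, 4))
p_fail = rng.uniform(0.05, 0.3, size=(6, 4))

# One simple expected-time cost: inflate the nominal time by the chance that
# the machine fails and must be repaired and re-run (an assumption here).
expected_time = time / (1.0 - p_fail)

# Hungarian-style assignment: each task gets exactly one machine; with 6
# machines and 4 tasks, two machines remain unassigned.
machines, tasks = linear_sum_assignment(expected_time)
for m, t in zip(machines, tasks):
    print(f"machine {m} -> task {t}, expected time {expected_time[m, t]:.2f} h")
print("total expected time:", expected_time[machines, tasks].sum())
```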
In this study, multi-objective optimization of an aluminum oxide nanofluid in a water and ethylene glycol (40:60) mixture is studied. To reduce the viscosity and increase the thermal conductivity of the nanofluid, the NSGA-II algorithm is used to vary the temperature and the volume fraction of nanoparticles. Neural network modeling of the experimental data is used to express viscosity and thermal conductivity as functions of temperature and nanoparticle volume fraction. To evaluate the optimization objective functions, the neural network is coupled to the NSGA-II algorithm, and the network model is called at every assessment of the fitness function. Finally, the Pareto front and the corresponding optimum points are provided and …
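The sketch below illustrates the surrogate-plus-Pareto idea on a small scale: placeholder functions stand in for the trained neural networks, candidate (temperature, volume fraction) designs are evaluated, and the non-dominated set is extracted. It is not an NSGA-II implementation (a library such as pymoo would normally supply that), and the surrogate expressions and variable ranges are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Placeholder surrogates standing in for the trained neural-network models;
# the real study calls the network at every fitness evaluation.
def viscosity(T, phi):      # to be minimized
    return 1.0 / (1.0 + 0.03 * T) + 2.5 * phi

def conductivity(T, phi):   # to be maximized
    return 0.4 + 0.002 * T + 1.8 * phi

# Candidate designs over illustrative ranges of temperature (degC) and
# particle volume fraction.
T = rng.uniform(25, 65, 2000)
phi = rng.uniform(0.0, 0.02, 2000)
F = np.column_stack([viscosity(T, phi), -conductivity(T, phi)])  # both minimized

def pareto_mask(F):
    """Boolean mask of non-dominated points (all objectives minimized)."""
    n = len(F)
    nd = np.ones(n, dtype=bool)
    for i in range(n):
        if not nd[i]:
            continue
        # points strictly dominated by point i leave the front
        dominated = np.all(F >= F[i], axis=1) & np.any(F > F[i], axis=1)
        nd[dominated] = False
    return nd

mask = pareto_mask(F)
print(f"{mask.sum()} Pareto-optimal designs out of {len(F)}")
for t, p in list(zip(T[mask], phi[mask]))[:5]:
    print(f"T = {t:5.1f} degC, phi = {p:.4f}")
```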
Medical ultrasound (US) has many features that make it widely used around the world: safety, availability, and low cost. Despite these features, ultrasound suffers from speckle noise and artifacts. In this paper, a new method is proposed to improve US images by removing speckle noise and reducing artifacts to enhance image contrast. The proposed method involves algorithms for image pre-processing and segmentation. A median filter is used to smooth the image in the pre-processing stage; to obtain the best results, the median filter is applied with different kernel sizes. The better output of the median filter is then fed into a Gaussian filter, which then …
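A minimal sketch of that pre-processing chain, using OpenCV, is shown below. The test image, the candidate kernel sizes, and the criterion for picking the "better" median output (lowest Laplacian variance as a rough residual-noise proxy) are assumptions; the abstract does not state how the best kernel is chosen.

```python
import cv2
import numpy as np

# Illustrative noisy ultrasound-like image; in practice, load the US scan,
# e.g. img = cv2.imread("scan.png", cv2.IMREAD_GRAYSCALE).
rng = np.random.default_rng(4)
img = rng.normal(128, 40, (256, 256)).clip(0, 255).astype(np.uint8)

# Try several median kernel sizes (must be odd) and keep the smoothest result.
candidates = {k: cv2.medianBlur(img, k) for k in (3, 5, 7)}
best_k = min(candidates, key=lambda k: cv2.Laplacian(candidates[k], cv2.CV_64F).var())
median_out = candidates[best_k]

# Feed the chosen median output into a Gaussian filter for further smoothing.
smoothed = cv2.GaussianBlur(median_out, (5, 5), sigmaX=1.0)
print("chosen median kernel:", best_k)
```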
Free-Space Optical (FSO) communication can provide high-speed links when the effect of turbulence is not serious, and Space-Time Block Coding (STBC) is a good candidate to mitigate this effect when it is. This paper proposes a hybrid of Optical Code Division Multiple Access (OCDMA) and STBC in FSO communication for last-mile solutions, where access to remote areas is complicated. The main weakness affecting an FSO link is atmospheric turbulence, and STBC is employed in OCDMA to mitigate these effects. The current work evaluates the Bit-Error-Rate (BER) performance of OCDMA operating under the scintillation effect, which can be described by the gamma-gamma model. The most obvious finding to emerge from the analysis …
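For context on the gamma-gamma scintillation model mentioned above, the sketch below computes its shape parameters (alpha, beta) from the plane-wave Rytov variance using the standard Andrews-Phillips expressions; the refractive-index structure constant, wavelength, and link length are illustrative values, not the paper's link budget.

```python
import numpy as np

def gamma_gamma_params(Cn2, wavelength_m, link_m):
    """Large- and small-scale scintillation parameters (alpha, beta) of the
    gamma-gamma model from the plane-wave Rytov variance."""
    k = 2 * np.pi / wavelength_m
    sigma_R2 = 1.23 * Cn2 * k ** (7 / 6) * link_m ** (11 / 6)
    alpha = 1.0 / (np.exp(0.49 * sigma_R2 /
                          (1 + 1.11 * sigma_R2 ** (12 / 5)) ** (7 / 6)) - 1)
    beta = 1.0 / (np.exp(0.51 * sigma_R2 /
                         (1 + 0.69 * sigma_R2 ** (12 / 5)) ** (5 / 6)) - 1)
    return sigma_R2, alpha, beta

# Illustrative link: 1550 nm over 1 km in moderate turbulence
# (the Cn2 value is an assumption, not taken from the paper).
sigma_R2, alpha, beta = gamma_gamma_params(Cn2=1e-14, wavelength_m=1550e-9, link_m=1000)
print(f"Rytov variance {sigma_R2:.3f}, alpha {alpha:.2f}, beta {beta:.2f}")
```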
Steganography is defined as hiding confidential information in some other chosen medium without leaving any clear evidence of changing the medium's features. Most traditional hiding methods embed the message directly in the cover medium (text, image, audio, or video). Some hiding techniques have a negative effect on the cover image, so the change in the carrier medium can sometimes be detected by humans and machines. The purpose of the suggested hiding method is to make this change undetectable. The current research focuses on a more complex method, based on a spiral search, to prevent the detection of hidden information by humans and machines; the Structural Similarity Index (SSIM) measure is used to assess the accuracy and quality …
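Since the abstract does not spell out the spiral scheme, the sketch below shows only a generic variant: pixels are visited in spiral order and message bits are written into their least significant bits, with SSIM used to confirm the cover and stego images remain visually similar. The traversal order, the LSB embedding, and the toy message are assumptions, not the paper's actual algorithm.

```python
import numpy as np
from skimage.metrics import structural_similarity as ssim

def spiral_coords(h, w):
    """Yield (row, col) pixel coordinates in a clockwise spiral from the
    outer border toward the centre."""
    top, bottom, left, right = 0, h - 1, 0, w - 1
    while top <= bottom and left <= right:
        for c in range(left, right + 1):
            yield top, c
        for r in range(top + 1, bottom + 1):
            yield r, right
        if top < bottom:
            for c in range(right - 1, left - 1, -1):
                yield bottom, c
        if left < right:
            for r in range(bottom - 1, top, -1):
                yield r, left
        top, bottom, left, right = top + 1, bottom - 1, left + 1, right - 1

def embed_lsb_spiral(cover, bits):
    """Embed a bit string into the least significant bits of pixels visited
    in spiral order (a generic sketch, not the paper's exact scheme)."""
    stego = cover.copy()
    for bit, (r, c) in zip(bits, spiral_coords(*cover.shape)):
        stego[r, c] = (stego[r, c] & 0xFE) | int(bit)
    return stego

rng = np.random.default_rng(5)
cover = rng.integers(0, 256, (128, 128), dtype=np.uint8)
message_bits = format(int.from_bytes(b"secret", "big"), "048b")
stego = embed_lsb_spiral(cover, message_bits)

# SSIM close to 1.0 indicates the hidden data is visually imperceptible.
print("SSIM:", ssim(cover, stego, data_range=255))
```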
Classification of imbalanced data is an important issue. Many algorithms have been developed for classification, such as Back Propagation (BP) neural networks, decision trees, and Bayesian networks, and they have been used repeatedly in many fields. These algorithms suffer from the problem of imbalanced data, where some classes have far more instances than others. Imbalanced data result in poor performance and a bias toward one class at the expense of the others. In this paper, we proposed three techniques based on over-sampling (O.S.) for processing an imbalanced dataset, redistributing it and converting it into a balanced dataset. These techniques are the Improved Synthetic Minority Over-Sampling Technique (Improved SMOTE), Border…
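For reference, the sketch below applies the standard SMOTE and Borderline-SMOTE over-samplers from imbalanced-learn to a synthetic imbalanced set; these are the baseline techniques that the proposed "Improved" variants build on, not the paper's modified versions.

```python
from collections import Counter
from sklearn.datasets import make_classification
from imblearn.over_sampling import SMOTE, BorderlineSMOTE

# Illustrative imbalanced data set (95% / 5%), not the paper's data.
X, y = make_classification(n_samples=1000, n_features=10,
                           weights=[0.95, 0.05], random_state=0)
print("original class counts:", Counter(y))

# Standard SMOTE: synthesise minority samples by interpolating between a
# minority instance and one of its minority-class nearest neighbours.
X_res, y_res = SMOTE(random_state=0).fit_resample(X, y)
print("after SMOTE:", Counter(y_res))

# Borderline variant: only minority samples near the class boundary are used
# as seeds for the synthetic points.
X_res, y_res = BorderlineSMOTE(random_state=0).fit_resample(X, y)
print("after Borderline-SMOTE:", Counter(y_res))
```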
Stereolithography (SLA) has become an essential photocuring 3D printing process for producing parts of complex shapes from photosensitive resin exposed to UV light. The selection of the best printing parameters for good accuracy and surface quality can be further complicated by the geometric complexity of the models. This work introduces multiobjective optimization of SLA printing of 3D dental bridges based on simple CAD objects. The effect of the best combination of a low-cost resin 3D printer's machine parameter settings, namely normal exposure time, bottom exposure time, and number of bottom layers, on dimensional deviation and surface roughness was studied. A multiobjective optimization method was utilized, combining the Taguchi me…
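As background on the Taguchi part of that combination, the sketch below computes smaller-the-better signal-to-noise ratios for dimensional deviation over an L9-style set of runs and averages them per factor level; the run table and response values are invented for illustration, and the same calculation would be repeated for surface roughness before combining the objectives.

```python
import numpy as np
import pandas as pd

# Illustrative L9 orthogonal-array style results (not the paper's measurements):
# three levels each for normal exposure time (s), bottom exposure time (s) and
# number of bottom layers, with measured dimensional deviation (mm).
runs = pd.DataFrame({
    "exp_time":   [6, 6, 6, 8, 8, 8, 10, 10, 10],
    "bot_time":   [30, 40, 50, 30, 40, 50, 30, 40, 50],
    "bot_layers": [4, 6, 8, 6, 8, 4, 8, 4, 6],
    "deviation":  [0.12, 0.10, 0.11, 0.09, 0.08, 0.10, 0.07, 0.09, 0.08],
})

# Taguchi smaller-the-better signal-to-noise ratio per run: SN = -10*log10(y^2).
runs["sn"] = -10 * np.log10(runs["deviation"] ** 2)

# Mean S/N per factor level; the level with the highest S/N is preferred.
for factor in ("exp_time", "bot_time", "bot_layers"):
    print(runs.groupby(factor)["sn"].mean(), "\n")
```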