Previous studies have shown that placing large-scale geometric roughness elements on the bed of an open channel changes the hydraulic conditions of the flow, since these elements impose additional resistance on it. The geometry of the roughness elements, their number, and their configuration are parameters that can affect the hydraulic flow characteristics. The aim here is to use inclined block elements to control the salt wedge propagation observed in most estuaries and thereby prevent its negative effects. Computational Fluid Dynamics (CFD) software was used to simulate the two-phase flow in an estuary model. In this model, the block elements have a 2 cm by 3 cm cross-section with the face inclined toward the flow direction. The elements were placed with constant spacing in two rows, one at a set distance from each side of the bed of the channel model. Six simulation runs were conducted with two different discharges and three different inclinations of the element centerline with respect to the flow direction. The applied discharges were 30 and 45.3 l/min, and the inclinations of the roughness elements were 15°, 30°, and 45°. The spacing between elements in each row was kept at 3 cm. The results showed that with no roughness elements, the salt wedge propagated 3.9 m and 3.1 m at discharges of 30 l/min and 45.3 l/min, respectively. The propagation of the salt wedge was reduced when the inclined block roughness elements were used, and the reduction depends on the applied discharge and the angle of inclination. At the lower discharge of 30 l/min, the salt wedge propagation was reduced by 74% at 45° inclination, 69% at 30°, and 64% at 15°. At a discharge of 45.3 l/min, the propagation was reduced by 85% at 45° inclination, 84% at 30°, and 70% at 15°. The roughness elements enhance flow turbulence, which disperses the salt wedge and slows its propagation beneath the fresh water.
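For orientation only, the percentage reductions reported above can be back-calculated into approximate wedge lengths. The small sketch below assumes reduced length = baseline length × (1 − reduction) and uses only the baseline values of 3.9 m and 3.1 m quoted in the abstract.

```python
# Illustrative back-calculation of approximate salt-wedge lengths from the reported
# percentage reductions (assumed formula: reduced length = baseline * (1 - reduction)).
baselines = {30.0: 3.9, 45.3: 3.1}            # discharge (l/min) -> wedge length with no roughness (m)
reductions = {                                # discharge -> {inclination (deg): fractional reduction}
    30.0: {15: 0.64, 30: 0.69, 45: 0.74},
    45.3: {15: 0.70, 30: 0.84, 45: 0.85},
}

for q, base in baselines.items():
    for angle, r in reductions[q].items():
        print(f"Q = {q} l/min, inclination {angle} deg: ~{base * (1 - r):.2f} m")
```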
Malicious software (malware) performs a malicious function that compromises a computer system's security. Many methods have been developed to improve the security of computer system resources, among them firewalls, encryption, and Intrusion Detection Systems (IDS). An IDS can detect new, previously unrecognized attack attempts and raise an early alarm to inform the system about the suspicious intrusion attempt. This paper proposes a hybrid IDS for intrusion detection, especially malware detection, that considers both network packet and host features. The hybrid IDS is designed using Data Mining (DM) classification methods, chosen for their ability to detect new, previously unseen intrusions accurately and automatically. It uses both anomaly and misuse detection …
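As a rough illustration of the hybrid (misuse plus anomaly) idea, and not the paper's actual design, the following sketch assumes scikit-learn and synthetic numeric feature vectors standing in for packet/host features.

```python
# Minimal sketch of a hybrid misuse + anomaly detector; NOT the paper's exact design.
# Assumes scikit-learn and made-up numeric features.
import numpy as np
from sklearn.ensemble import IsolationForest
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X_train = rng.normal(size=(500, 10))            # hypothetical packet/host feature vectors
y_train = rng.integers(0, 2, size=500)          # 0 = benign, 1 = known malware (labels)

# Misuse detection: a classifier trained on labeled known attacks.
misuse = DecisionTreeClassifier().fit(X_train, y_train)
# Anomaly detection: a model of benign-only traffic; deviations raise an alarm.
anomaly = IsolationForest(random_state=0).fit(X_train[y_train == 0])

def classify(x):
    """Flag a sample if either the misuse model or the anomaly model raises an alarm."""
    x = x.reshape(1, -1)
    is_known_attack = misuse.predict(x)[0] == 1
    is_anomalous = anomaly.predict(x)[0] == -1   # IsolationForest returns -1 for outliers
    return "intrusion" if (is_known_attack or is_anomalous) else "benign"

print(classify(rng.normal(size=10)))
```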
Both the double-differenced and zero-differenced GNSS positioning strategies have been widely used by geodesists for geodetic applications that demand reliable and precise positions. On closer inspection of the requirements of these two GNSS positioning techniques, zero-differenced positioning, known as Precise Point Positioning (PPP), has gained special importance for three main reasons. Firstly, the effective application of PPP for geodetic purposes and precise applications depends entirely on the availability of precise satellite products, which consist of precise satellite orbital elements, precise satellite clock corrections, and Earth orientation parameters. Secondly, …
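For context (this is the standard textbook formulation, not necessarily the one used in the paper), PPP typically processes the ionosphere-free combination of dual-frequency pseudoranges, into which the precise orbit and clock products enter directly:

```latex
% Textbook ionosphere-free pseudorange combination used in PPP (illustrative only).
P_{IF} = \frac{f_1^{2} P_1 - f_2^{2} P_2}{f_1^{2} - f_2^{2}}
       = \rho + c\,(dt_r - dt^{s}) + T + \varepsilon_{IF}
```

Here \(\rho\) is the geometric range computed from the precise satellite orbit, \(dt_r\) and \(dt^{s}\) are the receiver and satellite clock errors (the latter taken from the precise clock products), \(T\) is the tropospheric delay, and \(\varepsilon_{IF}\) is the combined noise.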
Anomaly detection is still a difficult task. To address this problem, we propose to strengthen the DBSCAN algorithm by converting all data to the graph concept frame (CFG). As is well known, DBSCAN groups data points of the same kind into clusters, while points that fall outside the behavior of any cluster are treated as noise or anomalies. The DBSCAN algorithm can thus detect abnormal points that lie farther than a set threshold from any cluster (extreme values). However, not all anomalies are of this kind, that is, unusual or far from a specific group; there is also a type of data that does not recur but is still considered abnormal with respect to the known group. The analysis showed that DBSCAN, using the …
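The sketch below shows the baseline behavior described above, in which points that DBSCAN labels as noise (label −1) are flagged as anomalies; it assumes scikit-learn and synthetic data, and the proposed graph-based (CFG) extension is not reproduced here.

```python
# Flagging DBSCAN noise points (label -1) as anomalies, using scikit-learn on toy data.
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(1)
cluster = rng.normal(loc=0.0, scale=0.5, size=(200, 2))   # dense "normal" points
outliers = rng.uniform(low=-6, high=6, size=(10, 2))      # sparse points far from the cluster
X = np.vstack([cluster, outliers])

labels = DBSCAN(eps=0.5, min_samples=5).fit_predict(X)
anomalies = X[labels == -1]                               # DBSCAN marks noise with label -1
print(f"{len(anomalies)} points flagged as noise/anomalies")
```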
A prosthesis is an artificial device that replaces a part of the human body that is absent because of disease, injury, or deformity. Current research activity in Iraq is drawn to the upper-limb field because of the growth in the number of amputees. Thus, it becomes necessary to expand research on this subject to help reduce the patients' struggles. This paper describes the design and development of a prosthetic hand for persons who have hand amputations. The design is composed of a hand with five fingers moved by means of a gearbox mechanism, and the artificial hand has 5 degrees of freedom. This artificial hand works based on the principle of …
Image compression plays an important role in reducing the size and storage of data while significantly increasing the speed of its transmission through the Internet. Image compression has been an important research topic for several decades, and recently, with the great successes achieved by deep learning in many areas of image processing, its use in image compression has been increasing gradually. Deep neural networks have also achieved great success in processing and compressing various images of different sizes. In this paper, we present a structure for image compression based on a Convolutional AutoEncoder (CAE) for deep learning, inspired by the diversity of the human eye …
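To make the CAE idea concrete, here is a minimal, generic convolutional autoencoder sketch in PyTorch; the paper's actual architecture, layer sizes, and training setup are not specified here, so all of those choices are assumptions.

```python
# Generic convolutional autoencoder (CAE) sketch in PyTorch; illustrative only.
import torch
import torch.nn as nn

class ConvAutoEncoder(nn.Module):
    def __init__(self):
        super().__init__()
        # Encoder: downsample the image into a compact latent representation.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
        )
        # Decoder: reconstruct the image from the latent representation.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(32, 16, kernel_size=4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(16, 3, kernel_size=4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

model = ConvAutoEncoder()
x = torch.rand(1, 3, 64, 64)                 # dummy RGB image batch
recon = model(x)
loss = nn.functional.mse_loss(recon, x)      # reconstruction loss used during training
print(recon.shape, loss.item())
```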
Variable selection is an essential and necessary task in the statistical modeling field. Several studies have tried to develop and standardize the process of variable selection, but it is difficult to do so. The first question a researcher needs to ask is which variables are most significant for describing a given dataset's response. In this paper, a new method for variable selection using Gibbs sampler techniques has been developed. First, the model is defined, and the posterior distributions for all the parameters are derived. The new variable selection method is tested using four simulation datasets. The new approach is compared with some existing techniques: Ordinary Least Squares (OLS), Least Absolute Shrinkage and Selection Operator (LASSO), …
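For illustration, the sketch below runs two of the baseline techniques named in the abstract (OLS and LASSO) on a simulated sparse-coefficient dataset; the Gibbs-sampler method proposed in the paper is not reproduced, and all data and parameter values are made up.

```python
# Baseline comparison only: OLS vs. LASSO on simulated data with a sparse coefficient vector.
import numpy as np
from sklearn.linear_model import LinearRegression, Lasso

rng = np.random.default_rng(42)
n, p = 200, 10
X = rng.normal(size=(n, p))
true_beta = np.array([3.0, 0.0, 0.0, -2.0, 0.0, 0.0, 0.0, 1.5, 0.0, 0.0])  # only 3 active variables
y = X @ true_beta + rng.normal(scale=1.0, size=n)

ols = LinearRegression().fit(X, y)
lasso = Lasso(alpha=0.1).fit(X, y)

print("OLS coefficients:  ", np.round(ols.coef_, 2))     # typically all nonzero
print("LASSO coefficients:", np.round(lasso.coef_, 2))   # shrinks irrelevant ones toward zero
```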
A database is an organized collection of data, arranged and distributed in a way that allows the user to access the stored data simply and conveniently. However, in the era of big data, traditional data-analytics methods may not be able to manage and process such large amounts of data. In order to develop an efficient way of handling big data, this work studies the use of the MapReduce technique to handle big data distributed on the cloud. This approach was evaluated using a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear enhancement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG …
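As a toy illustration of the MapReduce programming model (plain Python only, not an actual Hadoop job), the sketch below computes a per-channel mean over EEG-like (channel, sample) records; the record format is a made-up example.

```python
# Toy map -> shuffle -> reduce pipeline in plain Python, illustrating the MapReduce model.
from collections import defaultdict
from functools import reduce

records = [("ch1", 12.5), ("ch2", 7.1), ("ch1", 13.0), ("ch2", 6.8)]   # (channel, sample) pairs

# Map: emit (key, (partial_sum, count)) pairs.
mapped = [(ch, (val, 1)) for ch, val in records]

# Shuffle: group intermediate values by key.
grouped = defaultdict(list)
for ch, pair in mapped:
    grouped[ch].append(pair)

# Reduce: combine partial sums and counts, then take the mean per channel.
def combine(a, b):
    return (a[0] + b[0], a[1] + b[1])

means = {}
for ch, pairs in grouped.items():
    total, count = reduce(combine, pairs)
    means[ch] = total / count

print(means)   # {'ch1': 12.75, 'ch2': 6.95}
```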
An optical fiber chemical sensor based on surface plasmon resonance (SPR) for sensing and measuring the refractive index and concentration of acetic acid was designed and implemented in this work. Optical-grade plastic optical fibers with a diameter of 1000 μm were used, with a core diameter of 980 μm and a cladding of 20 μm. The sensor was fabricated by embedding a small section (10 mm) in the middle of the fiber in a resin block and then polishing it; a gold layer about 40 nm thick was then deposited, and the acetic acid was placed on the sensing probe.
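For background only (standard SPR theory, with symbols and values not taken from the paper), resonance occurs when the evanescent wave at the exposed, gold-coated core matches the surface-plasmon propagation constant:

```latex
% Generic SPR phase-matching condition at a metal/analyte interface (illustrative only).
\frac{2\pi}{\lambda}\, n_{co} \sin\theta
  \;=\; \operatorname{Re}\!\left\{ \frac{2\pi}{\lambda}
  \sqrt{\frac{\varepsilon_m\, n_s^{2}}{\varepsilon_m + n_s^{2}}} \right\}
```

Here \(n_{co}\) is the core refractive index, \(\theta\) the angle of the guided ray at the gold layer, \(\varepsilon_m\) the permittivity of the gold film, and \(n_s\) the refractive index of the analyte (acetic acid); a change in \(n_s\) shifts the resonance, which is what the sensing probe detects.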