The North Central Coast of Vietnam has a wide distribution of loose sand that is often exposed at the surface, with thicknesses ranging from a few meters to more than ten meters. In this loose state the sand is sensitive to dynamic loads such as earthquakes, traffic, and machine foundations; under such loading it can liquefy, damaging the ground and the structures it supports. The Standard Penetration Test (SPT) is widely used in engineering practice, and its values are useful for assessing soil liquefaction potential. This article therefore presents ground profiles from several sites in the North Central Coast of Vietnam and evaluates the liquefaction potential of the sand from SPT data using three parameters: the Factor of Safety against Liquefaction (FSLIQ), the Liquefaction Potential Index (LPI), and the Liquefaction Severity Number (LSN). The results show that the FSLIQ, LPI, and LSN values depend on the depth of the sand samples and on the SPT values. In this study, sand distributed from 2.0 to 18.0 m depth with an (N1)60cs value below 20 has high liquefaction potential, with FSLIQ < 1, LPI typically above 0.73, and LSN typically above 10. Many of the soil profiles examined show high liquefaction potential, and these results should be taken into account in construction activities in this area.
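A minimal sketch of how these SPT-based screening quantities are commonly computed (a simplified Seed-Idriss cyclic stress ratio, the Youd et al. 2001 clean-sand CRR curve, and Iwasaki's LPI); this is not the paper's own procedure, and the peak acceleration, unit weight, water-table depth, and (N1)60cs value below are illustrative assumptions. LSN additionally requires volumetric-strain relations, so it is omitted:

```python
# Hedged sketch: SPT-based liquefaction screening (simplified Seed-Idriss /
# Youd et al. 2001 forms). a_max, unit weight, water table, and (N1)60cs
# are illustrative assumptions, not data from the study.
import numpy as np

def stress_reduction(z):
    """Depth reduction factor rd (Liao-Whitman approximation)."""
    return 1.0 - 0.00765 * z if z <= 9.15 else 1.174 - 0.0267 * z

def csr(z, a_max_g=0.15, gamma=18.0, gwt=2.0):
    """Cyclic stress ratio at depth z (m); gamma in kN/m^3, water table at gwt (m)."""
    sigma_v = gamma * z                      # total vertical stress
    u = 9.81 * max(z - gwt, 0.0)             # pore pressure
    sigma_v_eff = sigma_v - u                # effective vertical stress
    return 0.65 * a_max_g * (sigma_v / sigma_v_eff) * stress_reduction(z)

def crr75(n1_60cs):
    """Clean-sand CRR for M7.5 (Youd et al. 2001); valid for (N1)60cs < 30."""
    n = n1_60cs
    return 1 / (34 - n) + n / 135 + 50 / (10 * n + 45) ** 2 - 1 / 200

def lpi(depths, fs_values):
    """Iwasaki Liquefaction Potential Index, summed per 1 m layer over 20 m."""
    total = 0.0
    for z, fs in zip(depths, fs_values):
        f = max(1.0 - fs, 0.0) if z <= 20.0 else 0.0
        total += f * (10.0 - 0.5 * z)        # depth weight w(z)
    return total

depths = np.arange(2.0, 18.0, 1.0)
fs = [crr75(12) / csr(z) for z in depths]    # assume (N1)60cs = 12 throughout
print(f"min FS_LIQ = {min(fs):.2f}, LPI = {lpi(depths, fs):.1f}")
```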
Iris research focuses on techniques for identifying and locating relevant biometric features, on accurate segmentation, and on efficient computation, while lending itself to compression methods. Most iris segmentation methods rely on complex modelling of traits and characteristics, which in turn reduces their effectiveness in real-time systems. This paper introduces a novel parameterized technique for iris segmentation. The method consists of several steps, starting with converting the grayscale eye image to a bit-plane representation and selecting the most significant bit planes, followed by a parameterization of the iris location, resulting in an accurate segmentation of the iris from the original image.
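A minimal sketch of the bit-plane step only, under the assumption of an 8-bit grayscale input; the paper's parameterization of the iris location is not reproduced here:

```python
# Hedged sketch of bit-plane slicing: decompose an 8-bit grayscale eye
# image into bit planes and keep the most significant ones. The random
# array stands in for a real eye image.
import numpy as np

def bit_planes(gray):
    """Return the 8 bit planes of an 8-bit grayscale image; plane 7 is the MSB."""
    return [(gray >> k) & 1 for k in range(8)]

def msb_image(gray, keep=2):
    """Recombine only the `keep` most significant planes into a coarse image."""
    planes = bit_planes(gray)
    combined = np.zeros_like(gray)
    for k in range(8 - keep, 8):
        combined |= planes[k] << k
    return combined

eye = (np.random.rand(64, 64) * 255).astype(np.uint8)  # stand-in for a real image
coarse = msb_image(eye, keep=2)
print(coarse.shape, coarse.dtype)
```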
The quality of Global Navigation Satellite System (GNSS) networks is considerably influenced by the configuration of the observed baselines. This study aims to find an optimal configuration for GNSS baselines, in terms of the number and distribution of baselines, in order to improve the quality criteria of GNSS networks. The first-order design (FOD) problem was applied in this research to optimize the GNSS network baseline configuration, using the sequential adjustment method to solve its objective functions. The proposed model, FOD for optimum precision (FOD-p), is based on the design criteria of A-optimality and E-optimality. These design criteria were selected as objective functions of precision, which …
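A minimal sketch of the two design criteria named above, for an assumed design matrix A and weight matrix P; the network geometry here is a toy stand-in, not the study's data:

```python
# Hedged sketch of A- and E-optimality for a network design matrix A
# (observations x unknowns) with weight matrix P. A-optimality minimizes
# trace(Qxx); E-optimality minimizes the largest eigenvalue of Qxx,
# where Qxx = (A^T P A)^-1 is the cofactor matrix of the unknowns.
import numpy as np

def design_criteria(A, P):
    Qxx = np.linalg.inv(A.T @ P @ A)        # cofactor matrix of the unknowns
    a_opt = np.trace(Qxx)                   # A-optimality score (smaller is better)
    e_opt = np.linalg.eigvalsh(Qxx).max()   # E-optimality score (smaller is better)
    return a_opt, e_opt

# Toy example: 6 baseline observations, 3 unknown coordinates, unit weights.
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 3))
a_opt, e_opt = design_criteria(A, np.eye(6))
print(f"A-optimality: {a_opt:.3f}, E-optimality: {e_opt:.3f}")
```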
Authentication is the process of determining whether someone or something is, in fact, who or what it is declared to be. As dependence on computers and computer networks grows, the need for user authentication increases. A user's claimed identity can be verified by one of several methods, one of the most popular being something the user knows, such as a password or Personal Identification Number (PIN). Biometrics is the science and technology of authentication by identifying a living individual's physiological or behavioral attributes. Keystroke authentication is a newer behavioral access control approach that identifies legitimate users by their typing behavior. The objective of this …
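A minimal sketch of the core idea behind keystroke authentication as described above; the dwell/flight features and the fixed threshold are illustrative assumptions, not the paper's method:

```python
# Hedged sketch: compare a user's typing-timing pattern against an
# enrolled template. Feature choice and threshold are illustrative.
import numpy as np

def features(press, release):
    """Dwell times (hold per key) and flight times (gap between keys)."""
    press, release = np.asarray(press), np.asarray(release)
    dwell = release - press
    flight = press[1:] - release[:-1]
    return np.concatenate([dwell, flight])

def verify(template, attempt, threshold=0.08):
    """Accept if the mean absolute timing deviation (seconds) is small enough."""
    return np.mean(np.abs(template - attempt)) < threshold

# Toy enrollment/login for a 4-key password (timestamps in seconds).
template = features([0.00, 0.21, 0.45, 0.70], [0.09, 0.31, 0.54, 0.81])
attempt  = features([0.00, 0.20, 0.47, 0.71], [0.10, 0.30, 0.55, 0.83])
print("accepted" if verify(template, attempt) else "rejected")
```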
A session is a period of time linked to a user: it begins when he or she arrives at a web application and ends when the browser is closed or after a certain period of inactivity. Attackers can hijack a user's session by exploiting session management vulnerabilities, through session fixation and cross-site request forgery attacks.
Very often, session IDs are not only identification tokens but also authenticators. This means that upon login, users are authenticated based on their credentials (e.g., usernames/passwords or digital certificates) and issued session IDs that effectively serve as temporary static passwords for accessing their sessions. This makes session IDs a very appealing target for attackers. In many cases …
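A minimal sketch of two standard mitigations implied by this threat model: generating session IDs from a cryptographically secure source and rotating the ID at login so a fixated pre-login ID never survives authentication. The in-memory store and function names are illustrative assumptions:

```python
# Hedged sketch: unguessable session IDs plus rotation at login.
import secrets

sessions = {}  # session_id -> user (a real app would use a server-side store)

def new_session(user=None):
    sid = secrets.token_urlsafe(32)   # ~256 bits of entropy, unguessable
    sessions[sid] = user
    return sid

def login(old_sid, user):
    """Rotate the session ID on privilege change to defeat session fixation."""
    sessions.pop(old_sid, None)       # invalidate any pre-authentication ID
    return new_session(user)

anon = new_session()                  # ID handed out before login
authed = login(anon, "alice")
assert anon not in sessions and sessions[authed] == "alice"
```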
The corrosion of metals is of great economic importance: estimates indicate that about a quarter of the iron and steel produced is destroyed in this way. Rubber lining has long been used for severe corrosion protection because NR and certain synthetic rubbers have inherent resistance to highly corrosive chemicals, particularly acids. The present work covers producing ebonite from both natural and synthetic rubbers; the following materials were therefore chosen to produce ebonite rubber: a) natural rubber (NR); b) styrene butadiene rubber (SBR); c) nitrile rubber (NBR); d) neoprene rubber (CR) [WRT]. The best ebonite vulcanizates are obtained in the presence of 30 phr sulfur, with carbon black as reinforcing filler. The relation between …
Merging images is one of the most important technologies in remote sensing applications and geographic information systems. In this study, a camera-based simulation process was used to produce fused images by resizing them with different interpolation methods (nearest-neighbour, bilinear, and bicubic). Statistical techniques were used as an efficient merging approach in the image integration process, employing two models, Local Mean Matching (LMM) and Regression Variable Substitution (RVS), alongside spatial-frequency techniques including the High-Pass Filter Additive (HPFA) method. Statistical measures were then used to check the quality of the merged images, carried out by calculating the correlation a…
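A minimal sketch of the HPFA idea named above: add the high-pass detail of a high-resolution panchromatic image to an upsampled multispectral band. The box-filter size and the toy arrays are illustrative assumptions:

```python
# Hedged sketch of High-Pass Filter Additive (HPFA) fusion.
import numpy as np
from scipy.ndimage import uniform_filter, zoom

def hpfa(ms_band, pan, box=5):
    """Fuse one multispectral band with a panchromatic image via HPFA."""
    ms_up = zoom(ms_band, pan.shape[0] / ms_band.shape[0], order=1)  # bilinear resize
    detail = pan - uniform_filter(pan, size=box)  # high-pass = pan minus low-pass
    return ms_up + detail

rng = np.random.default_rng(1)
pan = rng.random((128, 128))   # stand-in high-resolution pan image
ms = rng.random((32, 32))      # stand-in low-resolution MS band
fused = hpfa(ms, pan)
print(fused.shape, float(np.corrcoef(fused.ravel(), pan.ravel())[0, 1]))
```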
The transition of customers from one telecom operator to another has a direct impact on a company's growth and revenue, and traditional classification algorithms fail to predict churn effectively. This research introduces a deep learning model for predicting which customers plan to leave for another operator. The model works on a high-dimensional, large-scale data set. Its performance was measured against other classification algorithms, such as Gaussian Naive Bayes, Random Forest, and Decision Tree, in predicting churn. The evaluation was based on accuracy, precision, recall, F-measure, Area Under Curve (AUC), and the Receiver Operating Characteristic (ROC) curve. The proposed deep learning model performs better than the other…
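A minimal sketch of the baseline comparison and metrics listed above, run on synthetic imbalanced data; the paper's dataset and deep model are not reproduced here:

```python
# Hedged sketch: evaluate the three named baselines with the listed metrics.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import (accuracy_score, f1_score, precision_score,
                             recall_score, roc_auc_score)
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for a churn dataset (80/20 class imbalance).
X, y = make_classification(n_samples=2000, n_features=20,
                           weights=[0.8], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

for model in (GaussianNB(), RandomForestClassifier(random_state=0),
              DecisionTreeClassifier(random_state=0)):
    proba = model.fit(X_tr, y_tr).predict_proba(X_te)[:, 1]
    pred = (proba >= 0.5).astype(int)
    print(type(model).__name__,
          f"acc={accuracy_score(y_te, pred):.3f}",
          f"prec={precision_score(y_te, pred):.3f}",
          f"rec={recall_score(y_te, pred):.3f}",
          f"f1={f1_score(y_te, pred):.3f}",
          f"auc={roc_auc_score(y_te, proba):.3f}")
```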
Predicting the network traffic of web pages is an area that has received increasing attention in recent years. Modeling traffic helps in devising strategies for distributing network load, identifying user behavior and malicious traffic, and predicting future trends. Many statistical and intelligent methods have been studied for predicting web traffic from network traffic time series. In this paper, the use of machine learning algorithms to model Wikipedia traffic using Google's time series dataset is studied. Two data sets were used for time series and data generalization, building a set of machine learning models (XGBoost, Logistic Regression, Linear Regression, and Random Forest) and comparing their performance using the symmetric Mean Absolute Percentage Error (SMAPE) and …
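A minimal sketch of the SMAPE metric named above, in its common 0-200% formulation; the paper may use a slightly different variant:

```python
# Hedged sketch of SMAPE; zero-valued pairs contribute 0 by convention here.
import numpy as np

def smape(actual, forecast):
    actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
    denom = (np.abs(actual) + np.abs(forecast)) / 2.0
    diff = np.abs(forecast - actual)
    safe = np.where(denom == 0, 1.0, denom)       # avoid division by zero
    return 100.0 * np.mean(np.where(denom == 0, 0.0, diff / safe))

print(smape([100, 200, 0], [110, 180, 0]))        # ~6.7 (%)
```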
Speech is the essential way for humans to interact with each other or with machines, but it is always contaminated by different types of environmental noise. Speech enhancement algorithms (SEAs) have therefore emerged as a significant approach in the speech processing field for suppressing background noise and recovering the original speech signal. In this paper, a new efficient two-stage SEA with low distortion is proposed in the minimum mean square error sense. The clean signal is estimated by exploiting Laplacian speech and noise modeling of the coefficient distribution of an orthogonal transform (the Discrete Krawtchouk-Tchebichef transform). The Discrete Krawtchouk-Tchebichef transform …
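A minimal sketch of transform-domain enhancement in the spirit described above, using the DCT as a stand-in because the Discrete Krawtchouk-Tchebichef transform is not available in standard libraries; the gain rule here is a simple Wiener-like shrinkage, not the paper's two-stage MMSE estimator:

```python
# Hedged sketch: shrink orthogonal-transform coefficients of a noisy frame.
# DCT stands in for the Krawtchouk-Tchebichef transform; the gain is a
# crude Wiener-like rule, not the paper's Laplacian MMSE estimator.
import numpy as np
from scipy.fft import dct, idct

def enhance_frame(noisy, noise_var, floor=0.1):
    """Apply a per-coefficient Wiener-like gain in the transform domain."""
    coeffs = dct(noisy, norm="ortho")
    power = coeffs ** 2
    gain = np.maximum(power / (power + noise_var), floor)
    return idct(gain * coeffs, norm="ortho")

rng = np.random.default_rng(2)
clean = np.sin(2 * np.pi * 5 * np.linspace(0, 1, 256))   # toy "speech" frame
noisy = clean + 0.3 * rng.standard_normal(256)
enhanced = enhance_frame(noisy, noise_var=0.3 ** 2)
snr = lambda x: 10 * np.log10(np.mean(clean ** 2) / np.mean((x - clean) ** 2))
print(f"noisy SNR ~ {snr(noisy):.1f} dB, enhanced SNR ~ {snr(enhanced):.1f} dB")
```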