Analysis of the hyperlink structure of the web has led to significant improvements in web information retrieval. This survey evaluates and analyzes relevant research publications on link analysis in web information retrieval, comparing them along several factors: the publication year, the aims of each article, the algorithms used to carry out the study, and the findings obtained after applying those algorithms. The survey found that PageRank, Weighted PageRank, and Weighted Page Content Rank are the algorithms most widely employed by researchers to analyze hyperlinks in web information retrieval. Finally, the paper discusses the previous studies.
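The PageRank algorithm named above can be illustrated with a minimal power-iteration sketch on a toy four-page link graph; the graph and damping factor 0.85 are illustrative assumptions, not taken from the survey.

```python
import numpy as np

# Toy link graph: adjacency[i][j] = 1 if page i links to page j (illustrative).
adjacency = np.array([
    [0, 1, 1, 0],
    [0, 0, 1, 0],
    [1, 0, 0, 1],
    [0, 0, 1, 0],
], dtype=float)

def pagerank(adj, damping=0.85, tol=1e-10):
    """Power iteration on the column-stochastic transition matrix."""
    n = adj.shape[0]
    out_degree = adj.sum(axis=1, keepdims=True)
    transition = (adj / out_degree).T  # column j = out-links of page j
    rank = np.full(n, 1.0 / n)
    while True:
        new_rank = (1 - damping) / n + damping * transition @ rank
        if np.abs(new_rank - rank).sum() < tol:
            return new_rank
        rank = new_rank

ranks = pagerank(adjacency)
print(ranks)  # ranks sum to 1; page 2, with the most in-links, scores highest
```

Weighted PageRank and Weighted Page Content Rank refine this iteration by weighting links by in/out-degree and by page content relevance, respectively.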
Estimation of the standard regression model requires several assumptions to be satisfied, such as linearity. One problem arises when the regression curve is partitioned into two (or more) parts that are then joined at threshold point(s); this situation is a violation of the linearity assumption. The multiphase regression model has therefore received increasing attention as an alternative approach that describes changes in the behavior of a phenomenon through threshold-point estimation. The maximum likelihood estimator (MLE) has been used for both the model estimation and the threshold-point estimation. However, the MLE is not resistant to violations such as the presence of outliers or heavy-tailed error distributions. The main goal of t
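The threshold-point estimation described above can be sketched as a profile search: fit separate least-squares lines on each side of every candidate threshold and keep the split with the smallest total squared error, which coincides with the MLE under Gaussian errors. The synthetic data and true change point at x = 5 are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic two-phase data with a true threshold at x = 5 (illustrative only).
x = np.linspace(0, 10, 200)
y = np.where(x < 5, 1.0 + 0.5 * x, 3.5 - 0.8 * (x - 5)) + rng.normal(0, 0.1, x.size)

def fit_two_phase(x, y, candidates):
    """Profile search over candidate thresholds; returns the best threshold."""
    best_sse, best_c = np.inf, None
    for c in candidates:
        left, right = x < c, x >= c
        if left.sum() < 2 or right.sum() < 2:
            continue
        sse = 0.0
        for mask in (left, right):
            _, res, *_ = np.polyfit(x[mask], y[mask], 1, full=True)
            sse += res[0] if res.size else 0.0
        if sse < best_sse:
            best_sse, best_c = sse, c
    return best_c

threshold = fit_two_phase(x, y, np.linspace(1, 9, 161))
print(threshold)  # recovers a value close to the true change point 5
```

A robust variant of the kind motivated by the abstract would replace the least-squares fits with a resistant loss (e.g. least absolute deviations) so that outliers do not distort the estimated threshold.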
One of the costliest problems facing hydrocarbon production in unconsolidated sandstone reservoirs is the production of sand once hydrocarbon production starts. A sanding-onset prediction model is very important for deciding on future sand control, including whether or when sand control should be used. This research developed an easy-to-use computer program to determine where sanding begins in the drainage area. The model is based on estimating the critical pressure drop at which sand production begins. The outcomes have been drawn as a function of free sand production with the critical flow rates for reservoir pressure decline. The results show that the pressure drawdown required to
The Internet of Things (IoT) helps improve quality of life by supporting many applications, especially healthcare systems. Data generated by IoT devices are sent to Cloud Computing (CC) for processing and storage, despite the latency caused by the distance. With the rapid growth of IoT devices, the volume of data sent to the CC has kept increasing, so congestion on the cloud network became a problem in addition to latency. Fog Computing (FC) was introduced to solve these problems because of its proximity to IoT devices: FC is a middle layer located between the IoT devices and the CC layer, and it filters the data passed on to the CC. Due to the massive data generated by IoT devices on FC, Dynamic Weighted Round Robin (DWRR)
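The weighted round-robin dispatching underlying DWRR can be sketched as follows; the fog node names and static weights are hypothetical stand-ins (a dynamic variant would recompute the weights from each node's measured load).

```python
from itertools import cycle

# Hypothetical fog nodes; weights reflect relative capacity. DWRR would
# update these weights dynamically from node load, fixed here for clarity.
weights = {"fog-a": 3, "fog-b": 2, "fog-c": 1}

def weighted_round_robin(weights):
    """Yield node names so each node appears `weight` times per cycle."""
    schedule = [node for node, w in weights.items() for _ in range(w)]
    yield from cycle(schedule)

dispatcher = weighted_round_robin(weights)
assignments = [next(dispatcher) for _ in range(6)]
print(assignments)  # one full cycle: fog-a x3, fog-b x2, fog-c x1
```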
This study used spectral clustering (SC) and three-constraint affinity matrix spectral clustering (3CAM-SC) to determine the number of clusters and the cluster membership of the COST 2100 channel model (C2CM) multipath dataset simultaneously. Various multipath clustering approaches solve only for the number of clusters, without taking cluster membership into consideration. The problem with providing only the number of clusters is that there is no assurance that the membership of the multipath clusters is accurate, even when the number of clusters is correct. SC and 3CAM-SC aim to solve this problem by also determining the membership of the clusters. The clusters and the cluster count were then computed through the cluster-wise J
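A minimal sketch of the spectral clustering step, using a Gaussian affinity matrix and the sign of the graph Laplacian's Fiedler vector to bipartition synthetic two-group data; the data and bandwidth are illustrative, not the C2CM dataset or the 3CAM-SC constraints.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two synthetic "multipath" groups in a 2-D feature space (illustrative).
pts = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(3, 0.3, (20, 2))])

# Gaussian (RBF) affinity matrix from pairwise squared distances.
d2 = ((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1)
affinity = np.exp(-d2 / (2 * 0.5 ** 2))

# Unnormalized graph Laplacian L = D - W; the sign of its second-smallest
# eigenvector (the Fiedler vector) bipartitions the affinity graph.
laplacian = np.diag(affinity.sum(axis=1)) - affinity
eigvals, eigvecs = np.linalg.eigh(laplacian)  # eigenvalues in ascending order
labels = (eigvecs[:, 1] > 0).astype(int)
print(labels)  # the two synthetic groups receive two distinct labels
```

For k > 2 clusters, the standard recipe stacks the first k eigenvectors and runs k-means on the rows; 3CAM-SC additionally constrains how the affinity matrix is built.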
In this research, several estimators of the hazard function are introduced using a nonparametric method, namely kernel estimation for censored data with varying bandwidth and boundary kernels. Two types of bandwidth are used: a local bandwidth and a global bandwidth. Four types of boundary kernel are used, namely the Rectangle, Epanechnikov, Biquadratic, and Triquadratic kernels, and the proposed function was employed with all of them. Two different simulation techniques are also used in two experiments to compare these estimators. In most cases, the results showed that the local bandwidth is the best for all the
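As a concrete illustration of one of the kernels named above, here is a global-bandwidth kernel density estimate with the Epanechnikov kernel on synthetic (uncensored) data; the sample, grid, and bandwidth are illustrative assumptions, and the study's local-bandwidth variant would let the bandwidth vary with the evaluation point.

```python
import numpy as np

def epanechnikov(u):
    """Epanechnikov kernel: 0.75 * (1 - u^2) on [-1, 1], zero elsewhere."""
    return np.where(np.abs(u) <= 1, 0.75 * (1 - u ** 2), 0.0)

def kernel_density(x_grid, sample, bandwidth):
    """Global-bandwidth kernel density estimate on x_grid."""
    u = (x_grid[:, None] - sample[None, :]) / bandwidth
    return epanechnikov(u).mean(axis=1) / bandwidth

rng = np.random.default_rng(2)
sample = rng.normal(0, 1, 500)
grid = np.linspace(-3, 3, 61)
density = kernel_density(grid, sample, bandwidth=0.5)
```

A hazard-function estimator for censored data replaces the plain average with one weighted by the at-risk set, but the kernel smoothing step is the same.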
Many bridges worldwide have failed because of severe scour. The safety of an existing bridge (after construction) therefore depends mainly on continuous monitoring of local scour at the substructure, while the safety of a bridge before construction depends mainly on proper estimation of local scour at the substructure. Local scour at bridge piers is usually estimated with the available formulae, almost all of which were derived from laboratory data, so it is essential to test their performance against field data. In this study, the performance of selected bridge scour estimation formulae was validated and sta
Data hiding is the process of encoding extra information in an image by making small modifications to its pixels. To be practical, the hidden data must be perceptually invisible yet robust to common signal processing operations. This paper introduces a scheme for hiding a signature image that can be as large as 25% of the host image data and hence can be used both for digital watermarking and for image/data hiding. The proposed algorithm applies an orthogonal discrete wavelet transform with two zero moments and improved time localization, the discrete slantlet transform, to both the host and the signature image. A scaling factor in the frequency domain controls the quality of the watermarked images. Experimental results of signature image
Steganography is a technique for hiding a secret message within a different multimedia carrier so that the secret message cannot be detected. The goals of steganography techniques include improvements in imperceptibility, hiding capacity, security, and robustness. Despite the numerous secure methodologies that have been introduced, attempts to make these techniques more secure and robust are ongoing. This paper introduces a color image steganographic method based on a secret map, namely the 3-D cat map. The proposed method embeds data using a secure chaotic steganography structure, ensuring better security. Rather than using the complete image for data hiding, the selection of
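The role of the cat map here is to scatter the embedding positions chaotically instead of writing secret bits into consecutive pixels. The sketch below uses the classic 2-D Arnold cat map as a stand-in for the paper's 3-D variant; the image size, start position, and map parameters are illustrative assumptions.

```python
def arnold_cat(coords, n, a=1, b=1):
    """One iteration of the 2-D Arnold cat map on pixel coordinates of an
    n x n image (the paper's 3-D cat map is the 3-D analogue)."""
    x, y = coords
    return (x + a * y) % n, (b * x + (a * b + 1) * y) % n

# Chaotically scatter embedding positions over an 8 x 8 image (illustrative).
n = 8
position = (1, 1)
path = []
for _ in range(5):
    position = arnold_cat(position, n)
    path.append(position)
print(path)  # [(2, 3), (5, 0), (5, 5), (2, 7), (1, 0)]
```

Because the map is a bijection on the pixel grid (its matrix has determinant 1), a receiver who knows the secret parameters can regenerate the same sequence of positions and extract the hidden bits.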
The research aims to estimate missing values of the response (dependent) variable, which represents the main trait under study, using the Coons analysis-of-covariance method in a type of multifactor experimental design called the split-block design (SBED), so as to increase the accuracy of the analysis results and of the statistical tests based on this type of design. The theoretical part covers the split-block design and its statistical analysis, the analysis of variance for an SBED experiment, and the use of the Coons covariance method in two ways to estimate the missing value. In the practical part, a field experiment on a wheat crop was implemented in
Problems in the Translation of Spanish Phraseology into Arabic in Literary Texts (A Comparative Study from a Translatological Perspective)
Abstract
One of the most common problems facing the translator is the identification of phraseological units and the subsequent search for their correspondences. The importance of phraseological competence in a foreign language is widely recognized by many authors (Howarth, Corpas Pastor, and Pamies Bertran, to name a few).
We must not be afraid to recognize that mastery of phraseology is the highest level of command of any language. The objective of the present study is to clarify the differences in Spanish UFS to Arabi