The analysis of the hyperlink structure of the web has led to significant improvements in web information retrieval. This survey evaluates and analyzes relevant research publications on link analysis in web information retrieval, comparing them along several dimensions: the year of the research, the aims of the article, the algorithms used to carry out the study, and the findings obtained after applying those algorithms. The review reveals that PageRank, Weighted PageRank, and Weighted Page Content Rank are the algorithms most widely employed by researchers to analyze hyperlinks in web information retrieval. Finally, the paper compares and discusses these previous studies.
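As a point of reference for the algorithms named above, the following is a minimal sketch of the basic PageRank power iteration over a toy link graph; the damping factor, tolerance, and example graph are illustrative assumptions and are not taken from the surveyed papers.

```python
# Minimal PageRank power-iteration sketch (illustrative only).
# The damping factor, convergence threshold, and toy graph are assumed values.

def pagerank(links, damping=0.85, tol=1e-8, max_iter=100):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(max_iter):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            else:  # dangling page: spread its rank uniformly
                for target in pages:
                    new_rank[target] += damping * rank[page] / n
        if sum(abs(new_rank[p] - rank[p]) for p in pages) < tol:
            return new_rank
        rank = new_rank
    return rank

# Hypothetical hyperlink graph
graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
print(pagerank(graph))
```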
The present study is an attempt to shed light on the nature of US policy regarding the Middle East region as portrayed by the Al-Sabah, Al-Mashriq, and Tariq Al-Shaab newspapers over a three-month period, from 1 July to 30 September 2013.
In writing this study, the researcher set a number of goals. These goals include, but are in no way limited to, the nature of the US image as carried by the three papers above, the nature of the topics they tackled, and which Arab countries received more extensive coverage than others.
A qualitative research approach is adopted for the study. This approach has allowed the researcher to arrive at definite answers to the questions raised.
This paper presents a novel idea, as it investigates the rescue effect of the prey together with a fluctuation effect for the first time, yielding a modified predator-prey model in non-autonomous form. An approximation method is then used to convert the non-autonomous model into an autonomous one, simplifying the mathematical analysis and the study of the dynamical behavior. Some theoretical properties of the proposed autonomous model, such as boundedness, stability, and the Kolmogorov conditions, are studied. The paper's analytical results demonstrate that the dynamic behavior is globally stable and that the rescue effect improves the likelihood of coexistence compared with the case of no rescue effect. Furthermore, numerical simulations are carried out to support the analytical results.
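For context on the terminology: the Kolmogorov conditions mentioned above concern predator-prey systems written in the general form below; the specific per-capita growth functions used in the paper, including how the rescue and fluctuation terms enter them, are not reproduced here.

```latex
% General Kolmogorov predator-prey form; f and g are the per-capita growth
% rates of prey x and predator y. Two of the classical sign conditions:
\frac{dx}{dt} = x\, f(x,y), \qquad \frac{dy}{dt} = y\, g(x,y),
\qquad \frac{\partial f}{\partial y} < 0, \qquad \frac{\partial g}{\partial x} > 0 .
```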
String matching is seen as one of the essential problems in computer science, and a variety of computer applications provide a string matching service to their end users. The remarkable growth in the amount of data created and stored by modern computational devices drives researchers to seek ever more powerful methods for coping with this problem. In this research, the Quick Search string matching algorithm is adapted to run in a multi-core environment using OpenMP directives, which can be employed to reduce the overall execution time of the program. English text, protein, and DNA data types are used to examine the effect of parallelizing the Quick Search string matching algorithm on multi-core systems.
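For reference, a serial sketch of the Quick Search (Sunday) algorithm is given below; the paper parallelizes the scan over the text with OpenMP in a compiled language, which is not shown here, and the sample pattern and text are made up.

```python
# Quick Search (Sunday) string matching sketch; serial version only.
# The bad-character table shifts by the character just past the window.

def quick_search(pattern, text):
    m, n = len(pattern), len(text)
    if m == 0 or m > n:
        return []
    # Distance to shift when the character after the window is c
    # (default m + 1 if c does not occur in the pattern).
    shift = {c: m - i for i, c in enumerate(pattern)}
    occurrences = []
    j = 0
    while j <= n - m:
        if text[j:j + m] == pattern:
            occurrences.append(j)
        if j + m >= n:
            break
        j += shift.get(text[j + m], m + 1)
    return occurrences

print(quick_search("GATA", "CAGATAAGATA"))  # hypothetical DNA snippet -> [2, 7]
```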
The widespread use of the Internet all over the world, in addition to the growing number of users exchanging important information over it, highlights the need for new methods to protect this information from corruption or modification by intruders. This paper suggests a new method that ensures that the texts of a given document cannot be modified by intruders. The method consists of a mixture of three steps. The first step borrows some concepts of the "Quran" security system to detect certain types of change in a given text: a key for each paragraph is extracted from the group of letters in that paragraph that occur at multiples of a given prime number. This step alone, however, cannot detect every type of change.
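A minimal sketch of the key-extraction step described above, under the assumption that "occur at multiples of a given prime number" refers to letters at positions in the paragraph that are multiples of that prime; the choice of prime and the use of a SHA-256 digest as the key are illustrative choices, not taken from the paper.

```python
import hashlib

# Hypothetical sketch of the first step: collect the letters of a paragraph
# whose (1-based) positions are multiples of a chosen prime, then hash them
# into a per-paragraph key. The prime and the use of SHA-256 are assumptions.

def paragraph_key(paragraph, prime=7):
    letters = [ch for ch in paragraph if ch.isalpha()]
    selected = [letters[i - 1] for i in range(prime, len(letters) + 1, prime)]
    return hashlib.sha256("".join(selected).encode("utf-8")).hexdigest()

original = "This is a sample paragraph used to illustrate the idea."
print(paragraph_key(original))
# An edit that moves or changes any selected letter changes the key.
print(paragraph_key(original.replace("sample", "model")))
```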
A confluence of forces has brought journalism and journalism education to a precipice. The rise of fascism, the advance of digital technology, and the erosion of the economic foundation of news media are disrupting journalism and mass communication (JMC) around the world. Combined with the increasingly globalized nature of journalism and media, these forces are posing extraordinary challenges to and opportunities for journalism and media education. This essay outlines 10 core principles to guide and reinvigorate international JMC education. We offer a concluding principle for JMC education as a foundation for the general education of college students.
The conventional FCM algorithm does not fully utilize the spatial information in an image. In this research, we use an FCM algorithm that incorporates spatial information into the membership function for clustering. The spatial function is the summation of the membership functions in the neighborhood of each pixel under consideration. The advantages of the method are that it is less sensitive to noise than other techniques and that it yields more homogeneous regions than other methods, making it a powerful technique for noisy image segmentation.
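A common way to write the spatial function described above, following the usual spatial-FCM formulation in the literature (the neighborhood window NB(x_j) and the exponents p and q are implementation choices):

```latex
% Spatial function and spatially weighted membership for spatial FCM:
% u_{ij} is the membership of pixel x_j in cluster i, NB(x_j) a local window
% centred on x_j, and c the number of clusters.
h_{ij} = \sum_{k \in NB(x_j)} u_{ik},
\qquad
u'_{ij} = \frac{u_{ij}^{\,p}\, h_{ij}^{\,q}}{\sum_{l=1}^{c} u_{lj}^{\,p}\, h_{lj}^{\,q}} .
```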
In information security, fingerprint verification is one of the most common recent approaches for verifying human identity through a distinctive pattern. The verification process works by comparing a pair of fingerprint templates and measuring the similarity between them. Several research studies have used different techniques for the matching process, such as fuzzy vault and image filtering approaches. Yet these approaches still suffer from imprecise articulation of the biometrics' patterns of interest. Deep learning architectures such as the Convolutional Neural Network (CNN) have been used extensively for image processing and object detection tasks and have shown outstanding performance compared with traditional approaches.
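As a purely illustrative sketch of how a CNN can score fingerprint similarity (the layer sizes, input resolution, and cosine-similarity scoring below are assumptions, not the architecture used in the paper):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical Siamese-style CNN: embed two fingerprint images and score
# their similarity with cosine distance. All sizes are illustrative.

class FingerprintEmbedder(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Linear(64, 128)

    def forward(self, x):
        x = self.features(x).flatten(1)
        return F.normalize(self.fc(x), dim=1)

def match_score(model, img_a, img_b):
    with torch.no_grad():
        return F.cosine_similarity(model(img_a), model(img_b)).item()

model = FingerprintEmbedder().eval()
a, b = torch.rand(1, 1, 128, 128), torch.rand(1, 1, 128, 128)
print(match_score(model, a, b))  # after training, high score => same finger
```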
Steganography is defined as hiding confidential information in some other chosen medium without leaving any clear evidence of changing the medium's features. Most traditional hiding methods embed the message directly in the cover medium (text, image, audio, or video). Some hiding techniques leave a negative effect on the cover image, so the change in the carrier medium can sometimes be detected by humans and machines. The purpose of the suggested hiding method is to make this change undetectable. The current research focuses on using a complex method, based on a spiral search, to prevent the detection of hidden information by humans and machines. The Structural Similarity Index (SSIM) measure is used to assess the accuracy and quality of the resulting image.
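For reference, the standard SSIM definition referred to above (C_1 and C_2 are the usual small stabilizing constants; the comparison window is an implementation detail):

```latex
% SSIM between a cover image x and a stego image y, computed from local
% means, variances, and covariance; values close to 1 mean the hiding
% left the cover visually intact.
\mathrm{SSIM}(x,y) =
\frac{(2\mu_x\mu_y + C_1)(2\sigma_{xy} + C_2)}
     {(\mu_x^2 + \mu_y^2 + C_1)(\sigma_x^2 + \sigma_y^2 + C_2)} .
```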
This paper is devoted to comparing the performance of non-Bayesian estimators, represented by the maximum likelihood estimator of the scale parameter and the reliability function of the inverse Rayleigh distribution, with Bayesian estimators obtained under two types of loss function, namely the linear-exponential (LINEX) loss function and the entropy loss function, taking into consideration informative and non-informative priors. The performance of these estimators is assessed on the basis of the mean square error (MSE) criterion. Monte Carlo simulation experiments are conducted to obtain the required results.
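Under the parameterization of the inverse Rayleigh distribution most commonly used in this line of work (an assumption here, since the abstract does not state it), the quantities being estimated take the following form:

```latex
% Density, reliability function, and MLE of the scale parameter theta for a
% sample x_1, ..., x_n, plus the LINEX loss with shape parameter a != 0.
f(x;\theta) = \frac{2\theta}{x^{3}}\, e^{-\theta/x^{2}}, \quad x > 0,\ \theta > 0,
\qquad
R(t) = 1 - e^{-\theta/t^{2}},
\qquad
\hat{\theta}_{\mathrm{MLE}} = \frac{n}{\sum_{i=1}^{n} x_i^{-2}},
\qquad
L(\hat{\theta},\theta) = e^{a(\hat{\theta}-\theta)} - a(\hat{\theta}-\theta) - 1 .
```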