The amount of information on the web is growing rapidly and the number of web sites has become huge, so a set of web applications is used to help the user and give him information about these sites, especially e-business, news, and service-oriented sites.
Since a web site is built from a comparatively free-form description, it is difficult to perform an absolute evaluation, so web sites are rated from various viewpoints.
In this paper we propose a method for ranking and rating that uses a web service and JavaScript to contact a remote server and return some public information to the site user and other private information to the site owner.
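The abstract does not reproduce the implementation, so the following Python sketch only illustrates the general idea of a client querying a remote rating service for a site's public information; the service URL and the field names are assumptions, not the paper's actual API (which uses JavaScript on the client side).

# Illustrative sketch only: query a hypothetical rating web service and
# return the public fields to an ordinary visitor. The endpoint and the
# "rank"/"rating" field names are assumptions, not the paper's API.
import json
import urllib.parse
import urllib.request

def fetch_public_rating(site_url, service="https://example.org/rating-service"):
    query = urllib.parse.urlencode({"site": site_url})
    with urllib.request.urlopen(f"{service}?{query}") as resp:
        data = json.load(resp)
    # Private statistics would only be returned to the authenticated owner.
    return {"rank": data.get("rank"), "rating": data.get("rating")}

if __name__ == "__main__":
    print(fetch_public_rating("http://example.com"))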
Web application protection lies on two levels: the first is the responsibility of the server management, and the second is the responsibility of the programmer of the site (the latter is the scope of this research). This research suggests developing a secure web application site based on a three-tier architecture (client, server, and database). The security of this system is described as follows: multilevel access by authorization is used, which means that access to pages is allowed depending on the authorized level; passwords are encrypted using Message Digest 5 (MD5) with a salt; Secure Socket Layer (SSL) protocol authentication is used; PHP code is written according to a set of rules that hide the source code to ensure that it cannot be stolen; and input is verified before it is …
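The paper implements its password scheme in PHP; as a minimal sketch of the same salted-MD5 idea, the check looks roughly like the Python below. The salt length and the "salt$digest" storage format are assumptions, not the paper's exact scheme.

# Minimal sketch of salted-MD5 password storage as described in the abstract.
# The salt length and the "salt$digest" storage format are assumptions;
# the paper's own implementation is written in PHP.
import hashlib
import secrets

def hash_password(password: str) -> str:
    salt = secrets.token_hex(8)                      # random per-user salt
    digest = hashlib.md5((salt + password).encode()).hexdigest()
    return f"{salt}${digest}"                        # store salt with digest

def verify_password(password: str, stored: str) -> bool:
    salt, digest = stored.split("$")
    return hashlib.md5((salt + password).encode()).hexdigest() == digest

record = hash_password("s3cret")
assert verify_password("s3cret", record)

Salting resists precomputed rainbow-table attacks, although MD5 itself is generally considered weak for password hashing today.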
The analysis of the hyperlink structure of the web has led to significant improvements in web information retrieval. This survey evaluates and analyzes relevant research publications on link analysis in web information retrieval that use diverse methods. The publications are compared on several factors: the research year, the aims of the research article, the algorithms used to complete the study, and the findings obtained after applying the algorithms. The findings revealed that PageRank, Weighted PageRank, and Weighted Page Content Rank are extensively employed by academics to properly analyze hyperlinks in web information retrieval. Finally, this paper analyzes the previous studies.
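For reference, the basic PageRank iteration that these algorithms extend can be sketched as follows; the damping factor of 0.85 and the three-page toy graph are illustrative choices, not taken from the surveyed studies.

# Basic PageRank sketch: iterate r(p) = (1-d)/N + d * sum over pages q
# linking to p of r(q)/outdegree(q). Damping factor and toy graph are
# illustrative only.
def pagerank(links, d=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        rank = {
            p: (1 - d) / n
               + d * sum(rank[q] / len(links[q]) for q in pages if p in links[q])
            for p in pages
        }
    return rank

graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}   # toy three-page web
print(pagerank(graph))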
The last two decades have seen a marked increase in illegal activities on the Dark Web. Rapid evolution and the use of sophisticated protocols make it difficult for security agencies to identify and investigate these activities by conventional methods. Moreover, tracing criminals and terrorists poses a great challenge, keeping in mind that cybercrimes are no less serious than real-life crimes. At the same time, computer security societies and law enforcement pay a great deal of attention to detecting and monitoring illegal sites on the Dark Web. Retrieval of relevant information is not an easy task because of the vastness and ever-changing nature of the Dark Web; as a result, web crawlers play a vital role in achieving this task. …
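The abstract is cut off before the crawler itself is described, so the breadth-first fetch-and-extract loop below is only a generic illustration of what such a crawler does; a real Dark Web crawler would additionally route its requests through Tor and apply filtering and politeness policies.

# Generic breadth-first crawler sketch (illustrative only): fetch a page,
# extract links, and queue unseen ones. A real Dark Web crawler would route
# requests through Tor and add filtering, politeness, and error handling.
import re
from collections import deque
from urllib.parse import urljoin
import urllib.request

def crawl(seed, max_pages=20):
    seen, queue, pages = {seed}, deque([seed]), {}
    while queue and len(pages) < max_pages:
        url = queue.popleft()
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                html = resp.read().decode("utf-8", errors="replace")
        except OSError:
            continue                              # skip unreachable pages
        pages[url] = html
        for href in re.findall(r'href="([^"]+)"', html):
            link = urljoin(url, href)
            if link.startswith("http") and link not in seen:
                seen.add(link)
                queue.append(link)
    return pages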
If we go beyond the technical aspects of Web 2.0 and focus specifically on its interactive characteristics, we may say it represents not only a fundamental shift in the structure of press institutions and their practices but also a shift in the relationship that previously existed between the press and the audience. Web 2.0 has enabled newspapers to renovate their representations and practices of the profession and has opened new horizons, both in terms of readership and of advertising revenues. In parallel, it has also empowered the user to transcend the passivity he had always been confined to and to become a more active participant in the creation and generation of media content, even though this practice is somewhat …
The Web service security challenge is to understand and assess the risk involved in securing a web-based service today, based on our existing security technology, and at the same time track emerging standards and understand how they will be used to offset the risk in new web services. Any security model must illustrate how data can flow through an application and network topology to meet the requirements defined by the business without exposing the data to undue risk. In this paper we propose …
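The proposal itself is cut off above, so the following is only a generic illustration of message-level protection, one common way of letting data flow across application and network boundaries without exposing it to undue risk: the sender attaches an integrity tag that any receiving service can verify. The pre-shared key and field names are assumptions.

# Generic illustration (not the truncated paper's proposal): message-level
# integrity protection with an HMAC, so tampering by intermediaries is
# detectable at the receiving service. The pre-shared key is an assumption.
import hashlib
import hmac
import json

SHARED_KEY = b"demo-key"          # assumed pre-shared key between services

def protect(payload: dict) -> dict:
    body = json.dumps(payload, sort_keys=True)
    tag = hmac.new(SHARED_KEY, body.encode(), hashlib.sha256).hexdigest()
    return {"body": body, "hmac": tag}

def verify(message: dict) -> bool:
    expected = hmac.new(SHARED_KEY, message["body"].encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, message["hmac"])

msg = protect({"account": "12345", "amount": 100})
assert verify(msg)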
Web testing is a very important method for users and developers because it gives the ability to detect errors in applications and to check their quality in delivering services to users; performance, user interface, security, and other types of testing may be applied to a web application. This paper focuses on a major branch of performance testing called load testing. Load testing depends on two important elements, request time and response time; from these elements, it can be decided whether the performance time of a web application is good or not. In the experimental results, the load testing was applied to the website (http://ihcoedu.uobaghdad.edu.iq), covering the main home page and all the science departments' pages. …
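A minimal sketch of the request-time/response-time measurement that load testing relies on could look like the following; the concurrency level is an arbitrary illustration, not the configuration reported in the paper.

# Minimal load-testing sketch: send concurrent requests to a page and
# record each response time. The concurrency level is illustrative, not
# the paper's experimental configuration.
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

def timed_request(url):
    start = time.perf_counter()              # request time
    with urllib.request.urlopen(url, timeout=30) as resp:
        resp.read()
    return time.perf_counter() - start       # response time in seconds

def load_test(url, concurrent_users=10):
    with ThreadPoolExecutor(max_workers=concurrent_users) as pool:
        times = list(pool.map(timed_request, [url] * concurrent_users))
    return min(times), sum(times) / len(times), max(times)

print(load_test("http://ihcoedu.uobaghdad.edu.iq"))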
Background: Data on SARS-CoV-2 from developing countries are not entirely accurate, which demands incorporating digital epidemiology data on the pandemic.
Objectives: To reconcile non-Bayesian models and artificial intelligence connected with digital and classical (non-digital) epidemiological data on the SARS-CoV-2 pandemic in Iraq.
Results: Baghdad and Sulaymaniyah represented statistical outliers in connection with daily cases and recoveries, and daily deaths, respectively. Multivariate tests and neural networks detected a predictor effect of deaths, recoveries, and daily cases on web searches for two search terms, "كورونا" (Arabic for "Corona") and "Coronavirus" (Pillai's Trace value …
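The multivariate test mentioned above (Pillai's Trace) can be obtained along the following lines; the column names and data file are hypothetical placeholders, not the study's actual dataset.

# Sketch of a MANOVA yielding Pillai's trace for the effect of daily cases,
# recoveries, and deaths on search interest in the two terms. Column names
# and the CSV file are hypothetical placeholders, not the study's data.
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

df = pd.read_csv("iraq_covid_daily.csv")    # hypothetical daily records
model = MANOVA.from_formula(
    "search_arabic + search_english ~ daily_cases + recoveries + deaths",
    data=df,
)
print(model.mv_test())                       # report includes Pillai's trace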