Web application protection operates on two levels: the first is the responsibility of server management, and the second is the responsibility of the site's programmer (the scope of this research). This research proposes developing a secure web application based on a three-tier architecture (client, server, and database). The security of the system is described as follows: multilevel access through authorization, meaning that access to a page depends on the user's authorization level; passwords encrypted using Message Digest Five (MD5) with a salt; authentication over the Secure Socket Layer (SSL) protocol; writing PHP code according to a set of rules that hide the source code so it cannot be stolen; validating input before it is sent to the database; and updating scripts periodically to close gaps in the site. The 2Checkout company (2CO), a trusted international electronic money-transfer service, is used to let customers pay in a secure manner.
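The salted-hash scheme described above can be sketched as follows. This is an illustrative Python sketch of the general technique, not the paper's PHP implementation; the function names are hypothetical, and a comment notes that MD5 is considered weak by modern standards even though the described design uses it.

```python
import hashlib
import os

def hash_password(password, salt=None):
    """Hash a password with MD5 plus a random per-user salt.

    Note: MD5 is considered cryptographically weak today; this only
    mirrors the MD5-with-salt design described in the abstract.
    """
    if salt is None:
        salt = os.urandom(16)  # random per-user salt, stored alongside the digest
    digest = hashlib.md5(salt + password.encode("utf-8")).hexdigest()
    return digest, salt

def verify_password(password, digest, salt):
    """Recompute the salted digest and compare against the stored value."""
    return hashlib.md5(salt + password.encode("utf-8")).hexdigest() == digest

# store both the digest and the salt for each user
digest, salt = hash_password("s3cret")
```

The salt ensures that two users with the same password get different digests, which defeats precomputed lookup tables.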
This paper studies generalized autoregressive conditional heteroscedasticity models in the presence of a seasonal component, for application to daily, high-frequency financial data characterized by seasonal conditional heteroscedasticity. It relies on multiplicative seasonal generalized autoregressive conditional heteroscedastic models, abbreviated SGARCH, which have proven effective at expressing the seasonal phenomenon, in contrast to the usual GARCH models. The research work studies daily data on the dinar's exchange rate against the dollar; the autocorrelation function was first used to detect seasonality, and then the model was diagnosed …
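For background, the conditional-variance equation of a standard GARCH(1,1) model, which the seasonal SGARCH variant extends with a multiplicative seasonal factor, takes the usual form (this is the textbook GARCH recursion, not the paper's exact seasonal specification):

```latex
\sigma_t^2 = \omega + \alpha\,\varepsilon_{t-1}^2 + \beta\,\sigma_{t-1}^2,
\qquad \omega > 0,\ \alpha \ge 0,\ \beta \ge 0,
```

where $\varepsilon_{t-1}$ is the previous return innovation and $\sigma_{t-1}^2$ the previous conditional variance; the SGARCH model augments this recursion to capture the within-week or within-day seasonal pattern in volatility.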
Many production companies suffer big losses because of high production costs and low profits, for several reasons: high raw-material prices, the absence of taxes on imported goods, and the deactivation of the consumer-protection, national-product, and customs laws. Most consumers therefore buy imported goods, which are characterized by modern specifications and low prices.
Production companies also face uncertainty in cost, production volume, sales, the availability of raw materials, and the number of workers, because these vary with the seasons of the year.
This research adopts a fuzzy linear programming model with fuzzy numbers …
In this paper, the maximum likelihood estimates for a parameter of the two-parameter Weibull distribution are studied, along with the White estimators and the Bain and Antle estimators, as well as the Bayes estimator for the scale parameter. Simulation procedures are used to obtain the estimators and to compare them using the mean squared error (MSE). An application is also carried out on data for 20 patients suffering from headache disease.
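Maximum likelihood estimation for a two-parameter Weibull can be sketched with standard tools. This is an illustrative Python sketch using simulated data and SciPy's built-in fitter, not the paper's estimators or patient data; the chosen true parameters (shape 1.5, scale 2.0) and sample size are assumptions for demonstration only.

```python
from scipy.stats import weibull_min

# simulate a sample from a Weibull with shape 1.5 and scale 2.0
data = weibull_min.rvs(1.5, scale=2.0, size=500, random_state=0)

# fit by maximum likelihood; floc=0 pins the location at zero,
# giving the two-parameter (shape, scale) Weibull
shape_hat, loc, scale_hat = weibull_min.fit(data, floc=0)

# MSE against the known true value is how simulated estimators are compared
mse_shape = (shape_hat - 1.5) ** 2
```

In a simulation study like the one described, this fit-and-score step would be repeated over many replications and the average squared error compared across estimation methods.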
To obtain good estimates with more accurate results, we must choose an appropriate method of estimation. Most of the equations in the classical methods are nonlinear, and finding analytical solutions to such equations is very difficult; some estimators are inefficient because of problems in solving these equations. In this paper, we estimate the survival function of censored data by using one of the most important artificial intelligence algorithms, the genetic algorithm, to obtain optimal estimates of the parameters of the two-parameter Weibull distribution. This leads to optimal estimates of the survival function. The genetic algorithm is employed within the method of moments, the least squares method, and the weighted least squares method.
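The idea of driving a genetic algorithm with a classical fitting criterion can be sketched as follows. This is a minimal illustrative sketch, not the paper's algorithm: it uses simulated uncensored data for simplicity, a least-squares distance between the Weibull survival function and the empirical survival curve as the fitness, and assumed GA settings (population 50, arithmetic crossover, Gaussian mutation).

```python
import numpy as np

rng = np.random.default_rng(1)

def weibull_survival(t, shape, scale):
    """Survival function S(t) = exp(-(t/scale)^shape) of a two-parameter Weibull."""
    return np.exp(-(t / scale) ** shape)

# simulated lifetimes from a Weibull with shape 2.0 and scale 3.0
t = np.sort(3.0 * rng.weibull(2.0, size=200))
empirical_s = 1.0 - (np.arange(1, len(t) + 1) - 0.5) / len(t)  # empirical survival

def fitness(params):
    """Least-squares distance between model and empirical survival curves."""
    shape, scale = params
    if shape <= 0 or scale <= 0:
        return np.inf  # reject infeasible parameter values
    return np.mean((weibull_survival(t, shape, scale) - empirical_s) ** 2)

# a minimal genetic algorithm: selection, crossover, mutation
pop = rng.uniform(0.1, 6.0, size=(50, 2))
for gen in range(100):
    scores = np.array([fitness(p) for p in pop])
    parents = pop[np.argsort(scores)[:25]]          # keep the fitter half
    mates = parents[rng.permutation(25)]
    children = 0.5 * (parents + mates)              # arithmetic crossover
    children += rng.normal(0.0, 0.1, children.shape)  # Gaussian mutation
    pop = np.vstack([parents, children])

best = pop[np.argmin([fitness(p) for p in pop])]    # (shape, scale) estimate
```

Because the fitness is a fitting criterion rather than a likelihood gradient, the same loop works unchanged when the criterion is swapped for a moment-matching or weighted least-squares objective.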
According to the European Union Water Framework Directive requirements, diatom metrics were used to assess the ecological status of surface waters in the Gaziantep central catchment (Turkey). A total of 42 diatom taxa were identified. A few environmental factors (especially lead, copper, orthophosphate, and chromium) played significant roles in the distribution of diatom assemblages among the sampling stations. The first two axes of the canonical correspondence analysis explained 91.6% of the species–environment correlations with 13.9% of the cumulative variance of species. The applied diatom indices (TIT – Trophic Index Turkey, TI – Trophic Index, and EPI-D – Eutrophication and/or Pollution Index-Diatom) showed different results …
This study attempts to compare the traditional Schwarzschild radius with the Schwarzschild radius equation including the photon's wavelength, suggested by Kanarev for black holes to correct the error in the calculation of the gravitational radius, so that the wavelengths of electromagnetic radiation enter into the calculation. Different wavelengths, from radio waves to gamma rays, are used for arbitrary black holes (ordinary and supermassive).
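The traditional gravitational radius referred to above is the classical Schwarzschild formula r_s = 2GM/c². The sketch below computes only this classical quantity; the wavelength-dependent correction attributed to Kanarev is specific to the paper and is not reproduced here.

```python
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # solar mass, kg

def schwarzschild_radius(mass_kg):
    """Classical gravitational radius r_s = 2 G M / c^2, in metres."""
    return 2.0 * G * mass_kg / C ** 2

r_sun = schwarzschild_radius(M_SUN)  # about 2.95 km for one solar mass
```

For a supermassive black hole of, say, a million solar masses, the radius scales linearly with mass, which is why the comparison in the study can span ordinary and supermassive objects with the same formula.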
This research deals with a shrinkage method for principal components similar to the one used in multiple regression, the Least Absolute Shrinkage and Selection Operator (LASSO). The goal is to build uncorrelated linear combinations from only a subset of the explanatory variables, which may suffer from a multicollinearity problem, instead of taking the whole number, say (K), of them. This shrinkage forces some coefficients to equal zero after constraining them by a tuning parameter, say (t), which balances the amounts of bias and variance on one side and does not exceed the acceptable percentage of explained variance of these components. This is shown by the MSE criterion in the regression case and by the percentage of explained variance …
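The mechanism of LASSO-style shrinkage applied to principal components can be sketched in NumPy. This is an illustrative sketch with simulated data and an assumed threshold value, not the paper's method: it exploits the fact that for orthonormal predictors (which normalized principal component scores are), the lasso solution is exactly the soft-thresholded OLS coefficient vector, so some component coefficients are forced to exactly zero.

```python
import numpy as np

rng = np.random.default_rng(2)
n, k = 200, 6
X = rng.normal(size=(n, k))
X[:, 5] = X[:, 0] + 0.01 * rng.normal(size=n)  # near-duplicate column: multicollinearity
y = X[:, 0] - 2.0 * X[:, 1] + rng.normal(size=n)

# principal components: uncorrelated linear combinations of the standardized X
Xs = (X - X.mean(0)) / X.std(0)
U, s, Vt = np.linalg.svd(Xs, full_matrices=False)
Zn = U                                   # orthonormal component scores

# OLS on the orthonormal components, then soft-thresholding: for an
# orthonormal design this coincides with the lasso at tuning parameter t
beta_ols = Zn.T @ (y - y.mean())
t = 2.0                                  # assumed tuning parameter for illustration
beta_lasso = np.sign(beta_ols) * np.maximum(np.abs(beta_ols) - t, 0.0)

n_zero = int(np.sum(beta_lasso == 0.0))  # components whose coefficients are forced to zero
```

Increasing t zeroes out more components (more bias, less variance), which is the trade-off the tuning parameter balances in the method described above.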
The purpose of this research is to determine the extent to which independent auditors can audit the requirements of e-commerce related to infrastructure requirements, legislation and regulations, tax laws, and finally human cadres. To achieve this, a questionnaire was designed for auditors. Several statistical methods, namely the arithmetic mean and standard deviation, were used through the Statistical Packages for Social Sciences (SPSS) program.
The research reached several results, the most important of which is that there are no obstacles preventing the auditor from auditing the application of the e-commerce requirements, including the respective infrastructure requirements, legislation and regulations, tax laws, and human cadres.
This paper presents the first-order, one-variable grey model GM(1,1), which is the basis of grey system theory. The research covers the properties of the grey model and a set of methods for estimating the parameters of GM(1,1): the least squares method (LS), the weighted least squares method (WLS), the total least squares method (TLS), and the gradient descent method (DS). These methods were compared on two criteria, the mean square error (MSE) and the mean absolute percentage error (MAPE). After comparison using simulation, the best method was applied to real data representing the consumption rates of two types of oil, heavy fuel oil (HFO) and diesel fuel (D.O), and several tests were applied to …
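The least squares estimation of GM(1,1) referred to above can be sketched as follows. This is an illustrative sketch of the standard GM(1,1) construction (accumulated series, background values, then ordinary least squares), run on an assumed exponential-growth series rather than the paper's fuel-consumption data.

```python
import numpy as np

def gm11_fit(x0):
    """Estimate GM(1,1) parameters (a, b) by ordinary least squares.

    Grey equation: x0(k) + a * z1(k) = b, where x1 is the accumulated
    series of x0 and z1 the mean of consecutive x1 values.
    """
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                 # accumulated generating operation (AGO)
    z1 = 0.5 * (x1[1:] + x1[:-1])      # background values
    B = np.column_stack([-z1, np.ones(len(z1))])
    (a, b), *_ = np.linalg.lstsq(B, x0[1:], rcond=None)
    return a, b

def gm11_predict(x0, steps):
    """Fitted values plus `steps` forecasts from the GM(1,1) time response."""
    a, b = gm11_fit(x0)
    n = len(x0) + steps
    x1_hat = (x0[0] - b / a) * np.exp(-a * np.arange(n)) + b / a
    return np.concatenate([[x0[0]], np.diff(x1_hat)])  # inverse AGO

# assumed demonstration series growing ~5% per step
x0 = 10.0 * 1.05 ** np.arange(8)
a, b = gm11_fit(x0)
x0_hat = gm11_predict(x0, steps=2)
```

A negative development coefficient a corresponds to a growing series, and in a comparison study like the one described, the same B and Y matrices would be reweighted (WLS), solved in the errors-in-variables sense (TLS), or minimized iteratively (gradient descent).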