In this paper, we propose using the extreme value distribution as the rate of occurrence (intensity function) of a non-homogeneous Poisson process, in order to improve the modelling of the process's rate of occurrence; the result is called the extreme value process. To estimate the parameters of this process, we use the maximum likelihood method, the method of moments, and a swarm-intelligence method, the Artificial Bee Colony (ABC) algorithm, to obtain the estimator that best represents the data. The results of the three methods are compared through a simulation of the model, and it is concluded that the ABC estimator outperforms the maximum likelihood and method-of-moments estimators in estimating the time rate of occurrence of the proposed extreme value process. The research also includes a real application dealing with the operating periods between two successive stoppages of the raw materials plant of the General Company for Northern Cement / Badush Cement Factories (new) during the period from 1/4/2018 to 31/1/2019, in order to estimate the time rate of factory stoppages.
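As an illustration of the proposed process (not the paper's estimation code), the sketch below simulates event times of a non-homogeneous Poisson process whose intensity is shaped by a Gumbel (extreme value) density, using the standard thinning method; the location, scale, and rate-scaling constants are assumed for the example.

```python
import math
import random

def gumbel_pdf(t, mu=5.0, beta=2.0):
    """Gumbel (extreme value) density; parameters assumed for illustration."""
    z = (t - mu) / beta
    return math.exp(-(z + math.exp(-z))) / beta

def simulate_nhpp(rate, t_max, lam_max, seed=0):
    """Simulate a non-homogeneous Poisson process by thinning:
    propose candidates at the constant rate lam_max, then accept
    each candidate at time t with probability rate(t) / lam_max."""
    rng = random.Random(seed)
    t, events = 0.0, []
    while True:
        t += rng.expovariate(lam_max)
        if t > t_max:
            return events
        if rng.random() <= rate(t) / lam_max:
            events.append(t)

# roughly 5 events expected in total, shaped like the Gumbel density
times = simulate_nhpp(lambda t: 5 * gumbel_pdf(t), t_max=10.0, lam_max=1.0)
```

The bound `lam_max` must dominate the intensity everywhere; here the Gumbel density peaks at 1/(e·beta) ≈ 0.184, so 5 times it stays below 1.0.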
Voucher documents have become a very important information carrier in daily life and are used in many applications. A certain class of people could exploit this trust and indulge in forgery or tampering for unlawful short- or long-term benefit. This poses a serious threat to a nation's economy and institutions. The aim of this paper is to recognize an original voucher document through its contents. Since forgery of a voucher document could have serious repercussions, including financial losses, the signature, logo, and stamp are used to determine whether a document is genuine, using multilevel texture analysis. The proposed method consists of several operations. First, detection and extraction of signature, logo and stamp images from original
The effect of thickness variation on some physical properties of hematite α-Fe2O3 thin films was investigated. Fe2O3 bulk in the form of a pellet was prepared by cold pressing Fe2O3 powder with subsequent sintering at 800 °C. Thin films of various thicknesses were deposited on glass substrates by the pulsed laser deposition technique. The film properties were characterized by XRD and FT-IR. The deposited iron oxide thin films showed a single hematite phase with a polycrystalline rhombohedral crystal structure. The thicknesses of the films were estimated by spectrometer to be 185-232 nm. Using the Debye-Scherrer formula, the average grain size of the samples was found to be 18-32 nm. Atomic force microscopy indicated that the films had
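For reference, the Debye-Scherrer relation used above to estimate the grain size D from XRD peak broadening takes the standard form

```latex
D = \frac{K\lambda}{\beta \cos\theta}
```

where K ≈ 0.9 is the shape factor, λ the X-ray wavelength, β the full width at half maximum of the diffraction peak (in radians), and θ the Bragg angle.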
In this paper, a preliminary-test shrinkage estimator is considered for estimating the shape parameter α of the Pareto distribution when the scale parameter is equal to the smallest loss and a prior estimate α0 of α is available as an initial value from past experience or comparable cases. The proposed estimator is shown to have a smaller mean squared error in a region around α0 when compared with the usual and existing estimators.
Cloud computing is a mass platform serving high volumes of data from many devices and technologies. Cloud tenants demand fast access to their data without disruption, so cloud providers struggle to ensure that every piece of data is secured and always accessible. Hence, an appropriate replication strategy capable of selecting essential data is required in cloud replication environments. This paper proposes a Crucial File Selection Strategy (CFSS) to address poor response time in a cloud replication environment. A cloud simulator called CloudSim is used to conduct the necessary experiments, and results are presented as evidence of the improvement in replication performance. The obtained an
The aim of this paper is to discuss several high-performance training algorithms that fall into two main categories. The first category uses heuristic techniques, developed from an analysis of the performance of the standard gradient descent algorithm. The second category of fast algorithms uses standard numerical optimization techniques such as quasi-Newton methods. A further aim is to address the drawbacks of these training algorithms and to propose an efficient training algorithm for FFNNs.
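As a concrete instance of the first (heuristic) category, the sketch below applies a momentum modification of standard gradient descent to a toy quadratic loss; the learning rate and momentum constant are assumed for illustration and this is not the paper's proposed algorithm.

```python
def momentum_step(w, grad, velocity, lr=0.1, mu=0.9):
    """One momentum update: accumulate a velocity term, then move
    the weights along it (a common heuristic speed-up of gradient descent)."""
    v_new = [mu * v - lr * g for v, g in zip(velocity, grad)]
    w_new = [wi + vi for wi, vi in zip(w, v_new)]
    return w_new, v_new

# minimize the toy loss f(w) = w0^2 + w1^2 (gradient 2w), from (3, -2)
w, v = [3.0, -2.0], [0.0, 0.0]
for _ in range(100):
    w, v = momentum_step(w, [2 * wi for wi in w], v)
```

The velocity term lets successive consistent gradients accumulate, which is the heuristic that accelerates convergence over plain gradient descent.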
Iraq is one of the Arab-region countries considered among the drier areas on Earth; although two main rivers (the Tigris and Euphrates) pass through it, it suffers from the same problem of drought, and only the regions near the rivers make use of their water for domestic, agricultural, and industrial purposes.
One of the usable solutions is to utilize groundwater (especially in the desert regions). Remote sensing and geographic information systems are rapid and cost-effective techniques that provide information on large and inaccessible areas within a short time span for assessing, monitoring, and managing groundwater resources. In this study, an adaptive algorithm based on Canny edge detection
As smartphones incorporate location data, there is growing concern about location privacy as smartphone technologies advance. Using a remote server, mobile applications are able to capture the current location coordinates at any time and store them. The client grants authorization to a third party, which can then access the location information on the server via a JSON Web Token (JWT). Privacy provides cover for clients, access control, and secure data storage. Encryption guarantees the security of the location data on the remote server using the Rivest-Shamir-Adleman (RSA) algorithm. This paper introduces two smartphone applications (token and location). The first application can provide location information
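The RSA step can be illustrated with a textbook sketch on a deliberately small key; the primes and the integer encoding of the coordinate are assumptions for the example, and a real deployment would use a vetted library with proper padding (e.g., OAEP).

```python
# textbook RSA on a toy key (illustration only -- far too small for real use)
p, q = 1009, 1013
n = p * q                      # public modulus
phi = (p - 1) * (q - 1)
e = 17                         # public exponent, coprime with phi
d = pow(e, -1, phi)            # private exponent (Python 3.8+ modular inverse)

def encrypt(m):
    """Encrypt an integer 0 <= m < n with the public key (e, n)."""
    return pow(m, e, n)

def decrypt(c):
    """Decrypt a ciphertext with the private key (d, n)."""
    return pow(c, d, n)

# a latitude reading scaled to an integer, e.g. 33.21 degrees -> 3321
ciphertext = encrypt(3321)
```

Only the ciphertext is stored on the remote server; the server cannot recover the coordinate without the private exponent.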
Enhancing the quality of image fusion was proposed using new algorithms for auto-focus image fusion. The first algorithm is based on the standard deviation to combine two images. The second algorithm concentrates on the contrast at edge points and uses the correlation method as the criterion for the quality of the resulting image. This algorithm considers three blocks of different sizes in a homogeneous region and moves each block 10 pixels within the same region. These blocks examine the statistical properties of the block and automatically decide the next step. The resulting combined image has better contrast
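The standard-deviation rule of the first algorithm can be sketched as follows: for each pair of co-located blocks, keep the block with the larger standard deviation, treating higher local variance as a proxy for sharper focus. The block size and flat-list image layout are assumptions of this sketch.

```python
from statistics import pstdev

def fuse_blocks(block_a, block_b):
    """Keep the block with the larger standard deviation,
    i.e. the one presumed to be in better focus."""
    return block_a if pstdev(block_a) >= pstdev(block_b) else block_b

def fuse_images(img_a, img_b, block=8):
    """Fuse two equal-size grayscale images stored as flat
    row-major pixel lists, one block of pixels at a time."""
    out = []
    for i in range(0, len(img_a), block):
        out.extend(fuse_blocks(img_a[i:i + block], img_b[i:i + block]))
    return out
```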
This work presents the simulation of a low-density parity-check (LDPC) coding scheme with a multiuser Multi-Carrier Code Division Multiple Access (MC-CDMA) system over additive white Gaussian noise (AWGN) and multipath fading channels. Iterative decoding was used in the simulation, since it gives maximum efficiency within ten iterations. The modulation schemes used are phase shift keying (BPSK, QPSK, and 16-PSK), along with orthogonal frequency division multiplexing (OFDM). Twelve pilot carriers were used in the estimator to compensate for channel effects. The channel model used is the Long Term Evolution (LTE) channel with Technical Specification TS 25.101 v2.10 and 5 MHz bandwidth, including the chan
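As a baseline for the kind of link simulation described (not the paper's coded MC-CDMA setup), the sketch below estimates the bit error rate of uncoded BPSK over an AWGN channel by Monte Carlo; the bit count and Eb/N0 values are assumptions.

```python
import math
import random

def bpsk_awgn_ber(snr_db, n_bits=20000, seed=2):
    """Monte-Carlo bit error rate of uncoded BPSK over AWGN.
    snr_db is Eb/N0 in dB; the noise std follows sigma^2 = 1/(2*Eb/N0)
    for unit-energy antipodal symbols."""
    rng = random.Random(seed)
    sigma = math.sqrt(1.0 / (2.0 * 10 ** (snr_db / 10)))
    errors = 0
    for _ in range(n_bits):
        bit = rng.randint(0, 1)
        rx = (1.0 if bit else -1.0) + rng.gauss(0.0, sigma)
        errors += (rx >= 0) != bool(bit)   # hard decision at threshold 0
    return errors / n_bits
```

Coded schemes such as LDPC are judged by how far below this uncoded curve they push the error rate at the same Eb/N0.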
In this paper, a national grid-connected photovoltaic (PV) system is proposed. It extracts the maximum power point (MPP) using a three-incremental-steps perturb and observe (TISP&O) maximum power point tracking (MPPT) method, which improves classic P&O by using three incremental duty-ratio steps (ΔD) instead of the single step of the conventional P&O MPPT method. The system's performance is thereby improved, with faster tracking and less power fluctuation around the MPP. A boost converter performs the MPPT and is then connected to a three-phase voltage source inverter (VSI), a type of inverter that needs a high, constant input voltage. A second-order low-pass (LC) filter is connected to the output of the VSI to reduce t
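One plausible reading of the three-step idea is sketched below: use a large duty-cycle perturbation when the observed power change is large (far from the MPP) and progressively smaller ones near it. The step sizes and thresholds are assumptions for illustration, not the paper's values.

```python
def tispo_step(duty, p_now, p_prev, d_prev, steps=(0.01, 0.005, 0.001)):
    """One perturb-and-observe update with three candidate step sizes.
    duty: current duty ratio; d_prev: previous duty change;
    p_now / p_prev: PV power after / before that change."""
    dp = p_now - p_prev
    if abs(dp) > 5.0:          # far from the MPP: large step
        delta = steps[0]
    elif abs(dp) > 1.0:        # approaching the MPP: medium step
        delta = steps[1]
    else:                      # near the MPP: fine step
        delta = steps[2]
    sign = 1 if d_prev >= 0 else -1
    if dp < 0:                 # power fell: reverse the perturbation
        sign = -sign
    return duty + sign * delta
```

Shrinking the step near the peak is what reduces the steady-state power oscillation around the MPP relative to single-step P&O.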