With the freedom offered by the Deep Web, people have the opportunity to express themselves freely and discreetly, and sadly, this is one of the reasons why people carry out illicit activities there. In this work, a novel dataset of active Dark Web domains, known as crawler-DB, is presented. To build crawler-DB, the Onion Routing network (Tor) was sampled, and a web crawler capable of following links was built. The link addresses gathered by the crawler are then classified automatically into five classes. The algorithm built in this study demonstrated good performance, achieving an accuracy of 85%. A popular text representation method was used with the proposed crawler-DB crossed by two different supervise
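The abstract is cut off before naming the text representation and classifiers. As an illustration only, a minimal TF-IDF sketch (one popular representation that would fit the description) over tokenised onion-page texts might look like this; the toy documents are invented for the example, not drawn from crawler-DB.

```python
import math
from collections import Counter

def tfidf(docs):
    """Compute TF-IDF weight vectors for a list of tokenised documents."""
    n = len(docs)
    # document frequency: number of documents containing each term
    df = Counter()
    for doc in docs:
        df.update(set(doc))
    vectors = []
    for doc in docs:
        tf = Counter(doc)
        total = len(doc)
        # term frequency scaled by inverse document frequency
        vectors.append({t: (c / total) * math.log(n / df[t])
                        for t, c in tf.items()})
    return vectors

# hypothetical tokenised page texts (three documents)
docs = [["market", "bitcoin", "escrow"],
        ["forum", "board", "forum"],
        ["market", "vendor", "escrow"]]
vecs = tfidf(docs)
```

Terms that appear in every document get weight zero, while terms unique to one document are weighted most heavily, which is why such vectors are a common input to supervised classifiers.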

In this work, satellite images of Razaza Lake and the surrounding district in Karbala province are classified for the years 1990, 1999, and 2014 using two software packages (MATLAB 7.12 and ERDAS Imagine 2014). Proposed unsupervised and supervised classification methods implemented in MATLAB were used: the mean-value method and Singular Value Decomposition, respectively. In ERDAS Imagine, an unsupervised method (K-Means) and a supervised method (Maximum Likelihood Classifier) were applied, in order to obtain the most accurate results, compare the results of each method, and calculate the changes that took place in 1999 and 2014 relative to 1990. The results of the classification indicated that
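The unsupervised K-Means step can be illustrated with a minimal one-dimensional sketch on grey-level pixel values; the pixel values and the three-class setup below are invented for the example, not taken from the Razaza Lake imagery or the ERDAS workflow.

```python
def kmeans_1d(pixels, k=3, iters=20):
    """Toy k-means on grey-level pixel values (one feature per pixel)."""
    # spread the initial centroids across the observed intensity range
    lo, hi = min(pixels), max(pixels)
    centroids = [lo + (hi - lo) * i / (k - 1) for i in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in pixels:
            # assign each pixel to its nearest centroid
            idx = min(range(k), key=lambda i: abs(p - centroids[i]))
            clusters[idx].append(p)
        # move each centroid to the mean of its cluster
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    labels = [min(range(k), key=lambda i: abs(p - centroids[i]))
              for p in pixels]
    return centroids, labels

# three well-separated grey levels (e.g. water, vegetation, bare soil)
pixels = [12, 15, 11, 120, 118, 125, 240, 236, 244]
centroids, labels = kmeans_1d(pixels, k=3)
```

Real multispectral classification works on vectors of band values per pixel rather than a single grey level, but the assignment/update loop is the same.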

The analysis and efficiency of phenol extraction from industrial water using different solvents were investigated. To our knowledge, the experimental information available in the literature for liquid-liquid equilibria of ternary mixtures containing the phenol-water pair is limited. Therefore, the purpose of the present investigation is to generate data for water-phenol with different solvents to aid the correlation of liquid-liquid equilibria, including phase diagrams, distribution coefficients of phenol, tie-line data, and selectivity of the solvents for the aqueous phenol system. The ternary equilibrium diagrams and tie-lines
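The distribution coefficient and selectivity mentioned above have simple per-tie-line definitions that can be sketched directly; the compositions below are hypothetical placeholders, not measurements from this work.

```python
def distribution_coefficient(x_extract, x_raffinate):
    """D = phenol fraction in the extract phase / phenol fraction in the raffinate."""
    return x_extract / x_raffinate

def selectivity(d_phenol, xw_extract, xw_raffinate):
    """Selectivity S = D_phenol / D_water for a single tie-line."""
    d_water = xw_extract / xw_raffinate
    return d_phenol / d_water

# hypothetical tie-line mass fractions (NOT data from this study)
D = distribution_coefficient(0.30, 0.02)   # phenol in extract vs raffinate
S = selectivity(D, 0.05, 0.95)             # water in extract vs raffinate
```

A selectivity well above 1 indicates the solvent preferentially pulls phenol rather than water out of the aqueous phase, which is the property being compared across solvents.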

A membrane manufacturing system was operated using the dry/wet phase inversion process. A sample of hollow fiber membrane was prepared using polyvinyl chloride (17 wt% PVC) as the membrane material and N,N-dimethylacetamide (DMAC) as the solvent in the first run; the second run used a DMAC/acetone mixture at a ratio of 3.4 w/w. Scanning electron microscopy (SEM) was used to examine the structure and dimensions of the prepared hollow fiber membranes. The ultrafiltration experiments were performed using a soluble polymeric solute, polyethylene glycol (PEG) of molecular weight 20000 Dalton, as an 800 ppm solution at 25 °C and 1 bar pressure. The experimental results show that pure water permeation increased from 25.7 to 32.2 L/m².h.bar by adding acetone

A microbial desalination cell (MDC) has been developed for removing water salinity, generating power, and treating wastewater. The MDC comprised three chambers (anode, central desalination, and cathode). The ability of a locally isolated bacterium, Bacillus spp., to produce electricity while desalinating water was tested. The results showed removal of salinity recorded at 4000 ppm at room temperature at a voltage of 0.6 volts, and a lower salinity of 200 ppm at room temperature at 0.2 volts. The results also highlight the need to reduce the treatment time: salinity decreased from 3500 ppm to 500 ppm by the eleventh day at a voltage of 0.5 volts, depending on the type of substrate.

Compaction curves are widely used in civil engineering, especially for road construction, embankments, etc. Obtaining the precise Optimum Moisture Content (OMC) that gives the maximum dry unit weight γd max is very important, since it allows the desired soil strength to be achieved in addition to meeting economic requirements.
In this paper, three peak functions were used to obtain the OMC and γd max through curve fitting of the values obtained from the Standard Proctor Test. A surface fitting was also used to model Ohio's compaction curves, which represent the very large variation of compacted soil types.
The results showed very good correlation between the values obtained from some publ
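The three peak functions used in the paper are not named in this excerpt. As one simple peak function, a least-squares quadratic fitted to Proctor points yields OMC and γd max in closed form from the vertex; the moisture/unit-weight points below are hypothetical, not the paper's data.

```python
def fit_parabola(w, gd):
    """Least-squares fit gd = a + b*w + c*w**2; returns (a, b, c)."""
    n = len(w)
    # normal equations for the quadratic model
    S = lambda k: sum(x ** k for x in w)
    T = lambda k: sum(y * x ** k for x, y in zip(w, gd))
    A = [[n,    S(1), S(2)],
         [S(1), S(2), S(3)],
         [S(2), S(3), S(4)]]
    rhs = [T(0), T(1), T(2)]
    # Gaussian elimination with partial pivoting
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]
        rhs[i], rhs[p] = rhs[p], rhs[i]
        for r in range(i + 1, 3):
            f = A[r][i] / A[i][i]
            A[r] = [u - f * v for u, v in zip(A[r], A[i])]
            rhs[r] -= f * rhs[i]
    coef = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):  # back substitution
        coef[i] = (rhs[i] - sum(A[i][j] * coef[j]
                                for j in range(i + 1, 3))) / A[i][i]
    return coef

# hypothetical Standard Proctor points: (moisture %, dry unit weight kN/m3)
w  = [8.0, 10.0, 12.0, 14.0, 16.0]
gd = [16.1, 17.0, 17.4, 17.2, 16.5]
a, b, c = fit_parabola(w, gd)
omc = -b / (2 * c)                   # moisture content at the peak
gd_max = a + b * omc + c * omc ** 2  # peak dry unit weight
```

A quadratic is only locally valid near the peak; asymmetric peak functions (e.g. Gaussian-type) generally track the wet and dry limbs of a compaction curve better, which is presumably why several candidates were compared.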

In the field of construction project management, time and cost are the most important factors to be considered in planning every project, and their relationship is complex. The total cost of each project is the sum of the direct and indirect costs. Direct cost commonly represents labor, materials, equipment, etc. Indirect cost generally represents overhead costs such as supervision, administration, consultants, and interest. Direct cost grows at an increasing rate as the project time is reduced from its originally planned time. However, indirect cost continues for the life of the project, and any reduction in project time means a reduction in indirect cost. Therefore, there is a trade-off between the time and cost for completing construc
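The trade-off described above can be sketched by summing direct and indirect cost over candidate durations and taking the minimum; the crashing schedule and daily indirect cost below are hypothetical numbers for illustration.

```python
def total_cost(direct, indirect_per_day):
    """Total cost per duration: direct cost plus time-proportional overhead."""
    return {t: c + indirect_per_day * t for t, c in direct.items()}

# hypothetical crashing schedule: duration (days) -> direct cost
# (direct cost rises as the schedule is compressed below 14 days)
direct = {10: 120_000, 11: 110_000, 12: 104_000, 13: 100_000, 14: 98_000}

costs = total_cost(direct, indirect_per_day=3_000)
optimum = min(costs, key=costs.get)  # duration with the lowest total cost
```

Here compressing from 14 to 13 days saves more overhead than the extra crashing cost, while compressing further does not, so the total-cost curve bottoms out between the two extremes.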

The problem of the high peak-to-average power ratio (PAPR) in OFDM signals is investigated, with a brief presentation of the various methods used to reduce the PAPR and special attention to the clipping method. An alternative clipping approach is presented, in which clipping is performed right after the IFFT stage, unlike conventional clipping that is performed at the power amplifier stage, which causes undesirable out-of-band spectral growth. In the proposed method, samples are clipped rather than the waveform, so spectral distortion is avoided. Coding is required to correct the errors introduced by the clipping, and the overall system is tested for two types of modulation: QPSK as a constant-amplitude modul
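The clip-after-IFFT idea can be sketched as follows. The 64-subcarrier QPSK signal, the 0.7 clipping ratio, and the random seed are illustrative assumptions, not parameters from the paper, and the error-correcting coding stage is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def papr_db(x):
    """Peak-to-average power ratio of a complex baseband signal, in dB."""
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

# map random bits to QPSK symbols on 64 subcarriers, then IFFT to time domain
n = 64
bits = rng.integers(0, 2, size=(n, 2))
qpsk = ((2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)
x = np.fft.ifft(qpsk)

# clip the time-domain samples right after the IFFT, preserving each
# sample's phase and limiting only its magnitude
mag = np.abs(x)
threshold = 0.7 * mag.max()
x_clipped = np.where(mag > threshold,
                     x * (threshold / np.maximum(mag, 1e-12)),
                     x)
```

Clipping the digital samples before any pulse shaping keeps the distortion inside the signal band, whereas hard limiting in the analog power amplifier spreads energy into adjacent bands.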

Survival analysis is the analysis of data in the form of times from a time origin until the occurrence of an end event. In medical research, the time origin is the date of registration of the individual or patient in a study, such as a clinical trial comparing two or more types of medicine, and the endpoint is the death of the patient or the individual's loss to follow-up. The data resulting from this process are called survival times. If the end event is not death, the resulting data are called time-to-event data. That is, survival analysis comprises the statistical steps and procedures for analyzing data in which the adopted variable is the time to an event. It could be d
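A standard way to analyze such censored survival times is the Kaplan-Meier estimator; the sketch below, with invented patient data, is a generic illustration rather than the procedure used in this work.

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve.
    times  : time to event or censoring for each subject
    events : 1 if the endpoint (e.g. death) was observed, 0 if censored
    Returns [(t, S(t))] at each distinct event time.
    """
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    s = 1.0
    curve = []
    i = 0
    while i < len(order):
        t = times[order[i]]
        deaths = n_t = 0
        # group all subjects tied at time t (censored ones leave after events)
        while i < len(order) and times[order[i]] == t:
            deaths += events[order[i]]
            n_t += 1
            i += 1
        if deaths:
            s *= 1 - deaths / at_risk   # multiply by conditional survival
            curve.append((t, s))
        at_risk -= n_t
    return curve

# toy data: six patients; events[i] == 0 marks a censored observation
times  = [2, 3, 3, 5, 8, 8]
events = [1, 1, 0, 1, 0, 1]
curve = kaplan_meier(times, events)
```

The estimator handles censored subjects by keeping them in the risk set up to their censoring time, which is exactly what distinguishes survival analysis from ordinary analysis of observed durations.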