In this work, surface roughness was measured with an optical technique (the laser speckle technique) using statistical properties of the speckle pattern, viewed from the perspective of computer image texture analysis. Four calibration relationships were used to cover a wide measurement range with the same laser speckle technique: the first is based on the intensity contrast of the speckle, the second on analysis of the speckle binary image, the third on the size of the speckle pattern spot, and the last on the energy feature of the gray-level co-occurrence matrices (GLCM) of the speckle pattern. Using these calibration relationships, the surface roughness of an object can be evaluated, within the ranges of the relations, from a single speckle pattern image taken from the surface.
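The fourth calibration relation rests on the GLCM energy feature (also called the angular second moment). As a minimal sketch of that texture measure, the following computes a normalized co-occurrence matrix for a horizontal pixel offset and its energy on a tiny toy image; the image values and quantization to four gray levels are illustrative, not the paper's data.

```python
def glcm(img, levels, dx=1, dy=0):
    """Normalized gray-level co-occurrence matrix for offset (dx, dy)."""
    h, w = len(img), len(img[0])
    P = [[0.0] * levels for _ in range(levels)]
    count = 0
    for y in range(h):
        for x in range(w):
            yy, xx = y + dy, x + dx
            if 0 <= yy < h and 0 <= xx < w:
                P[img[y][x]][img[yy][xx]] += 1
                count += 1
    return [[p / count for p in row] for row in P]

def energy(P):
    """Energy (angular second moment): sum of squared GLCM entries."""
    return sum(p * p for row in P for p in row)

# Toy 4-level image (values 0..3); a real speckle image would be a
# quantized camera frame of the illuminated surface.
img = [[0, 0, 1, 1],
       [0, 0, 1, 1],
       [0, 2, 2, 2],
       [2, 2, 3, 3]]
P = glcm(img, levels=4)
print(round(energy(P), 4))  # 0.1667
```

A more uniform texture concentrates the GLCM in few cells and raises the energy; finer, more random speckle spreads it out and lowers the energy, which is what makes the feature usable in a roughness calibration.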
The present work reports a direct experimental comparison of the catalytic hydrodesulfurization of thiophene over Co-Mo/Al2O3 in fixed- and fluidized-bed reactors under the same conditions. A pilot-plant-scale experimental rig was constructed in the laboratories of the Chemical Engineering Department, Baghdad University: a fixed-bed unit (2.54 cm diameter, 60 cm length) and a fluidized-bed unit (2.54 cm diameter, 40 cm length, with a separation zone 30 cm long and 12.7 cm in diameter). The variables studied in the two systems were reaction temperature (308–460 °C), liquid hourly space velocity (2–5 hr⁻¹), and catalyst particle size (0.075–0.5 mm). It was found in both operations that the conversion
A database is characterized as an arrangement of data, organized and distributed so that the user can access the stored data in a simple and convenient way. However, in the era of big data, traditional data-analytics methods may not be able to manage and process such large volumes of data. In order to develop an efficient way of handling big data, this work studies the use of the Map-Reduce technique to handle big data distributed on the cloud. This approach was evaluated using a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear improvement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r
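The Map-Reduce pattern the work runs on Hadoop can be illustrated in miniature: a mapper emits key–value pairs, a shuffle groups them by key, and a reducer aggregates each group. The toy job below counts records per EEG channel label; the record layout and channel names are hypothetical, and this stands in for, not reproduces, the authors' Hadoop job.

```python
from collections import defaultdict

def map_phase(records):
    # Mapper: emit (key, 1) for each record's channel label.
    for rec in records:
        yield rec["channel"], 1

def shuffle(pairs):
    # Group intermediate pairs by key, as Hadoop's shuffle/sort does.
    grouped = defaultdict(list)
    for k, v in pairs:
        grouped[k].append(v)
    return grouped.items()

def reduce_phase(grouped):
    # Reducer: aggregate (here, sum) the values for each key.
    for key, values in grouped:
        yield key, sum(values)

# Hypothetical EEG-like records; a real job would read HDFS splits
# in parallel across the cluster.
records = [{"channel": "Fp1"}, {"channel": "Fp2"},
           {"channel": "Fp1"}, {"channel": "Cz"}]
result = dict(reduce_phase(shuffle(map_phase(records))))
print(result)  # {'Fp1': 2, 'Fp2': 1, 'Cz': 1}
```

The speedup reported in the paper comes from Hadoop executing the map and reduce phases in parallel over data splits; the single-process sketch only shows the programming contract.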
This paper presents the ability to use a cheap adsorbent (corn leaf) for the removal of Malachite Green (MG) dye from aqueous solution. A batch mode was used to study several factors: dye concentration (50–150 ppm), adsorbent dosage (0.5–2.5 g/L), contact time (1–4 days), pH (2–10), and temperature (30–60 °C). The results indicated that the removal efficiency increases with increasing adsorbent dosage and contact time, while it is inversely proportional to increases in pH and temperature. The adsorbent corn leaves were characterized with an SEM device. The resulting adsorption data agreed with the Freundlich isotherm according to regression analysis, and the kinetics data followed the pseudo-first-order model.
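The Freundlich isotherm used in the regression analysis is qe = Kf·Ce^(1/n), usually fitted in its linearized form log qe = log Kf + (1/n) log Ce. A minimal sketch of that fit, on made-up equilibrium data rather than the paper's measurements:

```python
import math

def fit_freundlich(Ce, qe):
    """Least-squares fit of log qe = log Kf + (1/n) log Ce.
    Returns (Kf, n) of the Freundlich isotherm qe = Kf * Ce**(1/n)."""
    x = [math.log10(c) for c in Ce]
    y = [math.log10(q) for q in qe]
    m = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(v * v for v in x)
    sxy = sum(a * b for a, b in zip(x, y))
    slope = (m * sxy - sx * sy) / (m * sxx - sx * sx)   # = 1/n
    intercept = (sy - slope * sx) / m                   # = log10(Kf)
    return 10 ** intercept, 1.0 / slope

# Hypothetical equilibrium data: Ce in mg/L, qe in mg/g.
Ce = [5.0, 10.0, 20.0, 40.0]
qe = [3.2, 4.5, 6.4, 9.1]
Kf, n_f = fit_freundlich(Ce, qe)
```

For these toy points the fit gives n close to 2, i.e. a favorable adsorption in the usual 1 < n < 10 sense; the paper's actual constants would come from its own (Ce, qe) data.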
Project suspensions are among the most pressing problems confronting the construction sector, owing to the sector's complexity and the interdependence of its essential delay-risk sources. Machine learning provides a suitable set of techniques for attacking such complex systems. The study aimed to identify and develop a well-organized predictive data tool to examine and learn from delay sources, based on preceding data of construction projects, using decision trees and naïve Bayesian classification algorithms. An intensive review of the available data was conducted to explore the real reasons and causes of construction project delays. The results show that the postpo
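One of the two classifiers named, naïve Bayes, can be sketched in a few lines for categorical project attributes. The feature names (funding status, design changes, weather) and the tiny training set below are invented for illustration; they are not the study's dataset.

```python
from collections import Counter, defaultdict

def train_nb(rows, labels):
    """Categorical naive Bayes: class priors plus per-feature value counts."""
    prior = Counter(labels)
    cond = defaultdict(Counter)  # (feature_index, label) -> value counts
    for row, label in zip(rows, labels):
        for i, value in enumerate(row):
            cond[(i, label)][value] += 1
    return prior, cond

def predict_nb(prior, cond, row):
    """Pick the label maximizing P(label) * prod P(value|label), with
    Laplace smoothing so unseen feature values get nonzero probability."""
    total = sum(prior.values())
    best, best_p = None, -1.0
    for label, c in prior.items():
        p = c / total
        for i, value in enumerate(row):
            counts = cond[(i, label)]
            vocab = len({v for (j, _), ctr in cond.items() if j == i
                         for v in ctr})
            p *= (counts[value] + 1) / (sum(counts.values()) + vocab)
        if p > best_p:
            best, best_p = label, p
    return best

# Hypothetical project records: (funding, design_changes, weather).
rows = [("late", "many", "bad"), ("on_time", "few", "good"),
        ("late", "few", "bad"), ("on_time", "many", "good")]
labels = ["delayed", "on_schedule", "delayed", "on_schedule"]
prior, cond = train_nb(rows, labels)
print(predict_nb(prior, cond, ("late", "many", "bad")))  # prints "delayed"
```

The independence assumption (each delay factor contributes separately to the class probability) is exactly what makes the method tractable on many categorical delay causes.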
Variable selection is an essential and necessary task in the field of statistical modeling. Several studies have tried to develop and standardize the process of variable selection, but it is difficult to do so. The first question a researcher needs to ask is: what are the most significant variables that should be used to describe a given dataset's response? In this paper, a new method for variable selection using Gibbs sampler techniques has been developed. First, the model is defined, and the posterior distributions for all the parameters are derived. The new variable selection method is tested using four simulation datasets. The new approach is compared with some existing techniques: Ordinary Least Squares (OLS), Least Absolute Shrinkage
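The Gibbs sampler at the core of the method draws each parameter in turn from its full conditional given the current values of the others. The abstract does not give the paper's posterior, so as a generic illustration of the mechanism only, the sketch below samples a standard bivariate normal with correlation rho, whose full conditionals are known in closed form.

```python
import math
import random

def gibbs_bivariate_normal(rho, n_iter=20000, burn=2000, seed=0):
    """Gibbs sampler for a standard bivariate normal with correlation rho.
    Full conditionals: x|y ~ N(rho*y, 1-rho^2), y|x ~ N(rho*x, 1-rho^2)."""
    rng = random.Random(seed)
    sd = math.sqrt(1.0 - rho * rho)
    x = y = 0.0
    samples = []
    for t in range(n_iter):
        # Alternately refresh each coordinate from its full conditional.
        x = rng.gauss(rho * y, sd)
        y = rng.gauss(rho * x, sd)
        if t >= burn:  # discard burn-in draws
            samples.append((x, y))
    return samples

samples = gibbs_bivariate_normal(rho=0.8)
mean_x = sum(s[0] for s in samples) / len(samples)
```

In the paper's setting the same loop would cycle over regression coefficients (and any inclusion indicators) using the derived posterior conditionals instead of these Gaussian ones.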
Recently, the internet has enabled users to transmit digital media in the easiest manner. Despite this facility, the internet may expose the transferred media content to several threats concerning confidentiality, such as media authentication and integrity verification. For these reasons, data-hiding methods and cryptography are used to protect the contents of digital media. In this paper, an enhanced method of image steganography combined with visual cryptography is proposed. A secret logo (a binary image) of size 128×128 is encrypted by applying (2 out of 2 share) visual cryptography to it to generate two secret shares. During the embedding process, a cover red, green, and blue (RGB) image of size (512
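The (2 out of 2) visual cryptography step can be sketched with the classic 1×2 pixel-expansion scheme: each secret pixel becomes two subpixels, white pixels get the same random pattern in both shares, and black pixels get complementary patterns, so either share alone is uniformly random but overlaying both reveals the logo. This is a standard construction offered for illustration; the paper's exact share scheme may differ.

```python
import random

def make_shares(secret, seed=0):
    """(2 out of 2) visual cryptography with 1x2 pixel expansion.
    secret: 2-D list of 0 (white) / 1 (black)."""
    rng = random.Random(seed)
    s1, s2 = [], []
    for row in secret:
        r1, r2 = [], []
        for px in row:
            pat = rng.choice(([1, 0], [0, 1]))
            r1.extend(pat)
            # White: same pattern in both shares; black: complement.
            r2.extend(pat if px == 0 else [1 - b for b in pat])
        s1.append(r1)
        s2.append(r2)
    return s1, s2

def stack(s1, s2):
    """Overlaying transparencies acts like a pixel-wise OR."""
    return [[a | b for a, b in zip(r1, r2)] for r1, r2 in zip(s1, s2)]

# Tiny toy secret; the paper uses a 128x128 binary logo.
secret = [[0, 1],
          [1, 0]]
s1, s2 = make_shares(secret)
overlay = stack(s1, s2)
# Black secret pixels become fully black (both subpixels 1);
# white pixels stay half-black, giving the visual contrast.
```

Each share by itself carries no information about the logo, which is why the shares can then be embedded separately into the RGB cover image.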
In this work, an innovative photovoltaic evaporative cooling (PV/EC) hybrid system was constructed and experimentally investigated. The PV/EC hybrid system has the advantage of producing electrical energy and cooling the PV panel, besides providing cooled, humidified air. Two cooling techniques were utilized: backside evaporative cooling (case #1) and combined backside evaporative cooling with a front-side water spray technique (case #2). The water spraying on the front side of the PV panel is intermittent, depending on the PV panel temperature, to minimize water and power consumption. In addition, two pad thicknesses of 5 cm and 10 cm were investigated at three different water flow rates of 1, 2, and 3 lpm. In Case #1,
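Evaporative pads like those compared here (5 cm vs 10 cm) are conventionally rated by their saturation effectiveness: the fraction of the maximum possible dry-bulb temperature drop (down to the inlet wet-bulb temperature) that the pad actually achieves. A minimal sketch with hypothetical readings, not the paper's measurements:

```python
def pad_effectiveness(t_in, t_out, t_wb):
    """Evaporative-pad saturation effectiveness (standard definition):
    achieved dry-bulb drop over the maximum possible drop to wet-bulb."""
    return (t_in - t_out) / (t_in - t_wb)

# Hypothetical readings in deg C: inlet air 40, pad outlet 28,
# inlet wet-bulb 22.
eff = pad_effectiveness(40.0, 28.0, 22.0)
print(round(eff, 3))  # 0.667
```

A thicker pad or lower air velocity generally pushes the outlet temperature closer to the wet-bulb limit, raising this ratio, which is the usual basis for comparing the two pad thicknesses.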