In this paper, developing turbulent flow in a rectangular duct is investigated by obtaining numerical results for the velocity profiles in the duct using a two-dimensional large eddy simulation (LES) model with different Reynolds numbers, filter equations, and mesh sizes. Reynolds numbers range from 11,000 to 110,000 for velocities of 1 m/s to 50 m/s, with (56×56), (76×76), and (96×96) mesh sizes and different filter equations. The numerical results of the LES model are compared with the k-ε model and the analytic velocity distribution, and are validated against experimental data from other researchers. The LES model shows good agreement with the experimental data at high Reynolds numbers for all three mesh sizes, and the agreement improves near the duct wall. The percentage error of the LES model relative to the experimental data is less than 18% for the (56×56) mesh, less than 17% for the (76×76) mesh, and less than 16% for the (96×96) mesh. The LES model shows high stability, does not require an extra differential equation as the k-ε model does, and achieves a great saving in time and computer memory.
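As a rough illustration of the subgrid closure an LES solver of this kind typically relies on, the sketch below computes a Smagorinsky eddy viscosity on a 2-D mesh. The Smagorinsky constant, grid spacing, and velocity fields are hypothetical placeholders, not values from the paper, which does not specify its subgrid model.

```python
import numpy as np

def smagorinsky_viscosity(u, v, dx, dy, cs=0.17):
    """Subgrid-scale eddy viscosity nu_t = (Cs*Delta)^2 * |S| on a 2-D grid.

    u, v   : 2-D arrays of resolved velocity components
    dx, dy : grid spacings; the filter width Delta is tied to the cell size
    cs     : Smagorinsky constant (typical values 0.1-0.2; 0.17 is illustrative)
    """
    dudx = np.gradient(u, dx, axis=0)
    dudy = np.gradient(u, dy, axis=1)
    dvdx = np.gradient(v, dx, axis=0)
    dvdy = np.gradient(v, dy, axis=1)
    # Resolved strain-rate magnitude |S| = sqrt(2 S_ij S_ij)
    s_mag = np.sqrt(2.0 * (dudx**2 + dvdy**2) + (dudy + dvdx)**2)
    delta = np.sqrt(dx * dy)  # filter width from the mesh size
    return (cs * delta)**2 * s_mag

# Example on a 56x56 mesh (one of the mesh sizes used in the paper)
u = np.random.rand(56, 56); v = np.random.rand(56, 56)
nu_t = smagorinsky_viscosity(u, v, dx=0.01, dy=0.01)
```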
A flight simulation programme has been developed on a personal computer using Microsoft FORTRAN to simulate flight trajectories of a light aircraft using the six-degree-of-freedom equations of motion. The simulation has been made realistic by pre-programming the inputs to the control surfaces and the atmospheric gust encountered during the flight. The programme plays an important role in the evaluation and validation of the aircraft design process. A light aircraft (Cessna 182T) has been tested in free flight, gliding flight, and flight with gust. The results show the correct trends and indicate that the programme can be relied upon as a realistic flight test programme.
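For orientation, a minimal sketch of how the six-degree-of-freedom rigid-body equations of motion are typically integrated is shown below, written in Python rather than the paper's FORTRAN. The state layout, explicit-Euler step, and principal-axes inertia are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def six_dof_step(state, forces_b, moments_b, mass, inertia, dt):
    """One explicit-Euler step of rigid-body 6-DOF motion in body axes.

    state    : dict with body velocities (u, v, w), body rates (p, q, r),
               and Euler angles (phi, theta, psi)
    forces_b : (Fx, Fy, Fz) aerodynamic + thrust forces in body axes
    moments_b: (L, M, N) body-axis moments
    inertia  : (Ix, Iy, Iz) principal moments of inertia (cross terms ignored)
    """
    u, v, w = state["vel"]; p, q, r = state["rates"]
    phi, theta, psi = state["euler"]
    Fx, Fy, Fz = forces_b; L, M, N = moments_b
    Ix, Iy, Iz = inertia
    g = 9.81

    # Translational equations with gravity resolved into body axes
    du = r*v - q*w - g*np.sin(theta) + Fx/mass
    dv = p*w - r*u + g*np.cos(theta)*np.sin(phi) + Fy/mass
    dw = q*u - p*v + g*np.cos(theta)*np.cos(phi) + Fz/mass
    # Rotational equations (principal axes)
    dp = ((Iy - Iz)*q*r + L) / Ix
    dq = ((Iz - Ix)*p*r + M) / Iy
    dr = ((Ix - Iy)*p*q + N) / Iz
    # Euler-angle kinematics
    dphi = p + (q*np.sin(phi) + r*np.cos(phi))*np.tan(theta)
    dtheta = q*np.cos(phi) - r*np.sin(phi)
    dpsi = (q*np.sin(phi) + r*np.cos(phi)) / np.cos(theta)

    return {
        "vel": (u + du*dt, v + dv*dt, w + dw*dt),
        "rates": (p + dp*dt, q + dq*dt, r + dr*dt),
        "euler": (phi + dphi*dt, theta + dtheta*dt, psi + dpsi*dt),
    }
```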
The penalized least squares method is a popular method for dealing with high-dimensional data, where the number of explanatory variables is larger than the sample size. The penalized least squares method offers high prediction accuracy and performs estimation and variable selection at once. It yields a sparse model, that is, a model with few variables, which can be interpreted easily. However, penalized least squares is not robust: it is very sensitive to the presence of outlying observations. To deal with this problem, a robust loss function can be used to obtain a robust penalized least squares method and a robust penalized estimator …
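As a concrete illustration of this idea, the sketch below contrasts an ordinary L1-penalized fit with a robust variant that pairs a Huber loss with the same L1 penalty, using scikit-learn. The choice of Huber loss, the penalty weight, and the synthetic data are illustrative assumptions, since the abstract does not name the specific estimator used.

```python
import numpy as np
from sklearn.linear_model import Lasso, SGDRegressor

rng = np.random.default_rng(0)
n, p = 50, 200                       # high-dimensional: p > n
X = rng.standard_normal((n, p))
beta = np.zeros(p); beta[:5] = 2.0   # sparse true coefficients
y = X @ beta + 0.1 * rng.standard_normal(n)
y[::10] += 15.0                      # inject outlying observations

# Ordinary L1-penalized least squares (not robust to the outliers)
lasso = Lasso(alpha=0.1).fit(X, y)

# Robust variant: Huber loss + L1 penalty (hyperparameters are illustrative)
robust = SGDRegressor(loss="huber", epsilon=1.35, penalty="l1",
                      alpha=0.1, max_iter=5000, tol=1e-6).fit(X, y)

print("nonzero coefficients (lasso): ", np.sum(lasso.coef_ != 0))
print("nonzero coefficients (robust):", np.sum(robust.coef_ != 0))
```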
The Adaptive Optics (AO) technique has been developed to correct for atmospheric seeing. The purpose of this study is to use MATLAB to investigate the performance of an AO system with one of the most recent AO simulation tools, Object-Oriented Matlab Adaptive Optics (OOMAO). This was achieved by studying the variables that affect the quality of the image correction, such as the observation wavelength bands, atmospheric parameters, telescope parameters, deformable mirror parameters, wavefront sensor parameters, and noise parameters. The results present a detailed analysis of the factors that influence the image correction process, as well as the impact of the AO components on that process.
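For background, the closed-loop update that AO simulations of this kind implement can be sketched generically as below (in Python, whereas OOMAO itself is a MATLAB toolbox): sense the residual wavefront slopes, reconstruct, and accumulate deformable-mirror commands. The least-squares reconstructor and integrator gain are standard textbook choices used here as assumptions, not OOMAO's specific configuration.

```python
import numpy as np

def ao_closed_loop(turbulence, interaction_matrix, n_steps=100, gain=0.5):
    """Generic AO integrator loop.

    turbulence         : callable returning the open-loop wavefront slopes at step k
    interaction_matrix : calibrated map from DM commands to sensor slopes
    gain               : integrator gain (0.5 is a common illustrative value)
    """
    # Least-squares reconstructor via pseudo-inverse of the interaction matrix
    reconstructor = np.linalg.pinv(interaction_matrix)
    n_actuators = interaction_matrix.shape[1]
    commands = np.zeros(n_actuators)
    residuals = []
    for k in range(n_steps):
        # Residual slopes seen by the wavefront sensor after DM correction
        slopes = turbulence(k) - interaction_matrix @ commands
        commands += gain * (reconstructor @ slopes)  # integrator update
        residuals.append(np.linalg.norm(slopes))
    return commands, residuals
```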
Big data analysis has important applications in many areas, such as sensor networks and connected healthcare. The high volume and velocity of big data bring many challenges to data analysis. One possible solution is to summarize the data and provide a manageable data structure that holds a scalable summarization of the data for efficient and effective analysis. This research extends our previous work on developing an effective technique to create, organize, access, and maintain summarizations of big data, and develops algorithms for Bayes classification and entropy discretization of large data sets using the multi-resolution data summarization structure. Bayes classification and data discretization play essential roles in many learning algorithms, such as …
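As a pointer to what entropy discretization involves, the sketch below finds a single binary cut point that minimizes the class-weighted entropy of a numeric feature. The recursive MDLP-style stopping rule and the paper's multi-resolution summarization structure are omitted, and the data are illustrative.

```python
import numpy as np

def entropy(labels):
    """Shannon entropy of a label array."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def best_cut(x, y):
    """Binary entropy-based cut point for feature x against class labels y."""
    order = np.argsort(x)
    x, y = x[order], y[order]
    best_t, best_h = None, np.inf
    # Candidate cuts are midpoints between consecutive distinct values
    for i in range(1, len(x)):
        if x[i] == x[i - 1]:
            continue
        t = 0.5 * (x[i] + x[i - 1])
        left, right = y[:i], y[i:]
        h = (len(left) * entropy(left) + len(right) * entropy(right)) / len(y)
        if h < best_h:
            best_t, best_h = t, h
    return best_t, best_h

# Illustrative data: feature values and binary class labels
x = np.array([1.0, 2.0, 3.0, 10.0, 11.0, 12.0])
y = np.array([0, 0, 0, 1, 1, 1])
print(best_cut(x, y))  # cut near 6.5 separates the classes cleanly
```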
Spatial data observed on a group of areal units are common in scientific applications. The usual hierarchical approach for modeling this kind of dataset is to introduce a spatial random effect with an autoregressive prior. However, the usual Markov chain Monte Carlo scheme for this hierarchical framework requires the spatial effects to be sampled from their full conditional posteriors one by one, resulting in poor mixing. More importantly, it makes the model computationally inefficient for datasets with a large number of units. In this article, we propose a Bayesian approach that uses the spectral structure of the adjacency matrix to construct a low-rank expansion for modeling spatial dependence. We propose a pair of computationally efficient estimation …
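To make the low-rank idea concrete, the sketch below builds a spatial basis from the leading eigenvectors of an areal adjacency matrix, so the spatial effect for all units is expressed through a small number of coefficients. The rank, the toy adjacency, and the coefficients are illustrative assumptions, not the article's exact construction.

```python
import numpy as np

def low_rank_spatial_basis(adjacency, rank):
    """Leading eigenvectors of the adjacency matrix as a spatial basis.

    adjacency : symmetric (n x n) 0/1 neighborhood matrix of the areal units
    rank      : number of basis functions kept (rank << n)
    """
    eigvals, eigvecs = np.linalg.eigh(adjacency)  # ascending eigenvalues
    idx = np.argsort(eigvals)[::-1][:rank]        # keep the largest ones
    return eigvecs[:, idx]

# Toy example: 5 areal units on a line (chain-graph adjacency)
n = 5
A = np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)
Phi = low_rank_spatial_basis(A, rank=2)  # n x 2 basis
coef = np.array([1.0, -0.5])             # low-dimensional coefficients
spatial_effect = Phi @ coef              # spatial effect for all n units
```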
An analytical approach based on field data was used to determine the strength capacity of large-diameter bored piles. The deformations and settlements were also evaluated for both vertical and lateral loadings. The analytical predictions are compared with field data obtained from a prototype test pile used at the Tharthar–Tigris canal bridge and were found to be in acceptable agreement, with a deviation of 12%. Following ASTM standard D1143M-07e1 (2010), a test schedule of five loading cycles was proposed for vertical loads, along with a series of cyclic loads to simulate horizontal loading. The load test results and analytical data of 1.95 …
To accommodate utilities in buildings, openings of different sizes are provided in the webs of reinforced concrete deep beams, causing reductions in beam strength and stiffness. This paper aims to investigate, experimentally and numerically, the effectiveness of using carbon fiber reinforced polymer (CFRP) strips as an external strengthening technique for reinforced concrete continuous deep beams (RCCDBs) with large openings. The experimental work included testing three RCCDBs under five-point bending. A reference specimen was prepared without openings to explore the reductions in strength and stiffness after providing large openings. Openings were created symmetrically at the center of the spans of the other specimens …
Esterification is the most important reaction in biodiesel production. In this study, oleic acid was used as a suggested feedstock to study and simulate biodiesel production. Batch esterification of oleic acid was carried out at the following operating conditions: temperature from 40 to 70 °C, ethanol-to-oleic-acid molar ratio from 1:1 to 6:1, H2SO4 as the catalyst at 1 and 5 wt% of oleic acid, and reaction times up to 180 min. The optimum conditions for the esterification reaction were an ethanol/oleic acid molar ratio of 6:1, 5 wt% H2SO4 relative to oleic acid, 70 °C, and 90 min, giving an oleic acid conversion of 0.92. The activation energy for the suggested model was 26,625 J/mol for the forward reaction and 42,189 J/mol for the equilibrium constant. The obtained results …
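For reference, the sketch below evaluates how an Arrhenius-type rate constant scales with temperature using the reported activation energy of the forward reaction. The pre-exponential factor is a hypothetical placeholder, since the abstract does not report it; only the temperature ratio is meaningful here.

```python
import numpy as np

R = 8.314           # universal gas constant, J/(mol*K)
Ea_forward = 26625  # activation energy of the forward reaction, J/mol (from the paper)
A = 1.0             # pre-exponential factor (placeholder; not reported in the abstract)

def arrhenius(T_celsius, Ea, A=A):
    """Arrhenius rate constant k = A * exp(-Ea / (R*T))."""
    T = T_celsius + 273.15
    return A * np.exp(-Ea / (R * T))

# Relative speed-up of the forward reaction from 40 C to 70 C
ratio = arrhenius(70.0, Ea_forward) / arrhenius(40.0, Ea_forward)
print(f"k(70 C) / k(40 C) = {ratio:.2f}")  # roughly 2.4x faster
```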
Photoacoustic imaging is a unique method that combines the absorption contrast of light or radio-frequency waves with ultrasound resolution. When the deposition of this energy is sufficiently short, a thermoelastic expansion takes place whereby acoustic waves are generated. These waves can be recorded and stored to construct an image. This work presents an experimental procedure for two-dimensional laser photoacoustic imaging to detect a tumor embedded within normal tissue. The experimental work uses phantoms assembled from a fish heart or a blood sac (simulating a tumor) of 1-14 mm mean diameter embedded within chicken breast to simulate real tissue. An Nd:YAG laser of 1.064 μm and 532 nm wavelengths, 10 ns pulse duration, 4 …
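As background on how such recorded acoustic signals are commonly turned into an image, the sketch below implements a basic delay-and-sum reconstruction. The sensor geometry, speed of sound, and signal arrays are illustrative assumptions; the abstract does not state which reconstruction method the authors used.

```python
import numpy as np

def delay_and_sum(signals, sensor_xy, grid_xy, fs, c=1540.0):
    """Basic delay-and-sum photoacoustic image reconstruction.

    signals   : (n_sensors, n_samples) recorded acoustic time series
    sensor_xy : (n_sensors, 2) sensor positions in metres
    grid_xy   : (n_pixels, 2) image pixel positions in metres
    fs        : sampling frequency in Hz
    c         : speed of sound in soft tissue, m/s (1540 is a common assumption)
    """
    n_sensors, n_samples = signals.shape
    image = np.zeros(len(grid_xy))
    for s in range(n_sensors):
        # Time of flight from each pixel to this sensor -> sample index
        dist = np.linalg.norm(grid_xy - sensor_xy[s], axis=1)
        idx = np.round(dist / c * fs).astype(int)
        valid = idx < n_samples
        image[valid] += signals[s, idx[valid]]  # sum the delayed samples
    return image
```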