Pilots are trained using computerized flight simulators. A flight simulator is a training system in which pilots can acquire flying skills without needing to practice on a real airplane. Professional pilots use simulators to practice flying strategies under emergency or hazardous conditions, or to train on new aircraft types. In this study, a framework for flight simulation is presented and the layout of an implemented program is described. The calculations were based on a simple theoretical approach. The implementation used some of the utilities provided by ActiveX, DirectX, and OpenGL, written in Visual C++. The main design consideration was to build a simple flight simulation program that can run without requiring a high-specification computing environment.
The cyanobacterial neurotoxin
Many carbonate reservoirs around the world show a tilted original oil-water contact (OOWC), which requires special consideration in the selection of capillary pressure curves and an understanding of reservoir fluid distribution when initializing reservoir simulation models.
An analytical model for predicting the capillary pressure across the interface separating two immiscible fluids was derived from reservoir pressure transient analysis. The model reflected the full interaction between the reservoir and aquifer fluids and the rock properties measured under downhole reservoir conditions.
This model retained the natural coupling of the oil reservoir with the aquifer zone and treated the two as an explicit-region composite system.
In this paper, a new variable selection method is presented for selecting essential variables from large datasets. The new model is a modified version of the Elastic Net model. The modified Elastic Net variable selection model is summarized in an algorithm and applied to the Leukemia dataset, which has 3051 variables (genes) and 72 samples. In practice, a dataset of this size is difficult to work with directly. The modified model is compared with several standard variable selection methods and shows the best performance, achieving perfect classification. All the calculations for this paper were carried out in
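A minimal sketch of the kind of Elastic Net-based variable selection the abstract describes, assuming scikit-learn is available. The data here are synthetic (shaped like the described dataset: far more variables than samples), and the `alpha` and `l1_ratio` values are illustrative choices, not the paper's tuned parameters.

```python
import numpy as np
from sklearn.linear_model import ElasticNet

rng = np.random.default_rng(0)
n_samples, n_vars = 72, 3051              # shape mirroring the abstract
X = rng.standard_normal((n_samples, n_vars))
# Only the first 10 variables actually drive the response
y = X[:, :10] @ rng.standard_normal(10) + 0.1 * rng.standard_normal(n_samples)

# Elastic Net mixes L1 (sparsity) and L2 (grouping) penalties
model = ElasticNet(alpha=0.1, l1_ratio=0.5, max_iter=10000)
model.fit(X, y)

# Variables with nonzero coefficients are the "selected" ones
selected = np.flatnonzero(model.coef_)
print(f"selected {len(selected)} of {n_vars} variables")
```

The L1 component drives most coefficients to exactly zero, which is what makes the model usable as a variable selector rather than only a predictor.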
The end of the twentieth century witnessed a technological convergence between the aesthetic values of the visual arts and the objective representation of the image in fabric design, opening new insights and unconventional possibilities for atypical applications. Modern fabric designs increasingly employ the film still, incorporating scenes taken from cinema. This research is an analytical study of the elements of the film still, the rules governing its organization, and how these can function in fabric design. The problem is thus defined by the following question: What are the elements of the film still, and how does the functioning of the struct
… model is derived, and the methodology is given in detail. The model is constructed using measurement criteria, namely the Akaike and Bayesian information criteria. For the new time series model, a new algorithm has been generated. The forecasting process, one and two steps ahead, is discussed in detail. Some exploratory data analysis is given at the beginning. The best model is selected based on these criteria and compared with some naïve models. The modified model is applied to a monthly chemical sales dataset (January 1992 to December 2019) downloaded from the United States Census Bureau website (www.census.gov). Ultimately, the forecasted sales
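Criterion-based model selection as the abstract describes it can be sketched as follows: fit candidate models of increasing complexity and keep the one with the lowest AIC or BIC. This is an illustrative sketch using a synthetic autoregressive series and least-squares AR fits, not the paper's model or the census sales data.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 300
y = np.zeros(n)
for t in range(2, n):                      # synthetic AR(2) process
    y[t] = 0.6 * y[t - 1] - 0.3 * y[t - 2] + rng.standard_normal()

def ar_aic_bic(y, p):
    """Fit AR(p) by least squares; return (AIC, BIC)."""
    Y = y[p:]
    X = np.column_stack([y[p - k:len(y) - k] for k in range(1, p + 1)])
    beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
    resid = Y - X @ beta
    n_eff = len(Y)
    sigma2 = resid @ resid / n_eff
    # Gaussian log-likelihood at the least-squares fit
    ll = -0.5 * n_eff * (np.log(2 * np.pi * sigma2) + 1)
    k = p + 1                              # AR coefficients + variance
    return 2 * k - 2 * ll, k * np.log(n_eff) - 2 * ll

scores = {p: ar_aic_bic(y, p) for p in range(1, 6)}
best_aic = min(scores, key=lambda p: scores[p][0])
best_bic = min(scores, key=lambda p: scores[p][1])
print(best_aic, best_bic)
```

BIC penalizes extra parameters more heavily than AIC (`k * log(n)` versus `2k`), so on longer series it tends to pick the more parsimonious model.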
Facial recognition has been an active field of imaging science. With recent progress in computer vision, it is extensively applied in various areas, especially in law enforcement and security. The human face is a viable biometric that can be used effectively for both identification and verification. Thus far, regardless of the facial model and metrics employed, the main shortcoming is that a facial image is required against which the comparison is made. Therefore, closed-circuit televisions and a facial database are always needed in an operational system. Over the last few decades, unfortunately, we have seen the emergence of asymmetric warfare, where acts of terrorism are often committed in secluded areas with no
Objective: To establish a standardized method for cavity preparation on the palatal surface of rat maxillary molars and to introduce a standardized method for correct tooth alignment within the specimen during the wax embedding procedure, so that cavity position can be better detected in the examined slides. Materials and methods: Six male Wistar rats, aged 4-6 weeks, were used. The maxillary molars of three animals were sectioned in the frontal plane to determine the thickness of hard tissue on the palatal surface of the first molar, which was 250-300 µm. An end-cutting bur (cutting head diameter 0.2 mm) was suitable for preparing a dentinal cavity 70-80 µm deep. Cavity preparation was then performed using the same bur on the tooth surface
A multivariate, multisite hydrological data forecasting model was derived and verified using a case study. The philosophy is to use cross-variable correlations, cross-site correlations, and time-lag correlations simultaneously. The case study involves two variables, monthly rainfall and evaporation, at three sites: Sulaimania, Dokan, and Darbandikhan. The model form is similar to a first-order autoregressive model, but in matrix form. A matrix of the different relative correlations mentioned above and another of their relative residuals were derived and used as the model parameters. A mathematical filter was applied to both matrices to obtain their elements. The application of this model indicates i
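The matrix form of a first-order autoregressive model described above can be sketched as X_t = A X_{t-1} + B e_t, where the state vector stacks every variable at every site (2 variables × 3 sites = 6 elements), A carries the cross-variable, cross-site, and lag-one links, and B shapes the residuals. The matrices below are illustrative placeholders, not the fitted parameters of the study.

```python
import numpy as np

rng = np.random.default_rng(2)
n_vars, n_sites = 2, 3                     # rainfall & evaporation, 3 sites
dim = n_vars * n_sites

A = 0.1 * rng.standard_normal((dim, dim))  # lag-one cross-correlation links
B = np.eye(dim)                            # residual (noise) matrix

def step(x_prev):
    """One month ahead: lagged term plus random residual."""
    return A @ x_prev + B @ rng.standard_normal(dim)

# Simulate 10 years of monthly values from a zero initial state
x = np.zeros(dim)
series = [x := step(x) for _ in range(120)]
print(np.array(series).shape)
```

In the actual model, A and B would be estimated from the relative correlation and residual matrices via the filtering step the abstract mentions; the sketch only shows the simulation/forecast recursion.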
Extractive multi-document text summarization – summarization that removes redundant information from a document collection while preserving its salient sentences – has recently attracted considerable interest in automatic modeling. This paper proposes an extractive multi-document text summarization model based on a genetic algorithm (GA). First, the problem is modeled as a discrete optimization problem, and a specific fitness function is designed to cope effectively with the proposed model. Then, a binary-encoded representation, together with a heuristic mutation operator and a local repair operator, is proposed to characterize the adopted GA. Experiments are applied to ten topics from the Document Understanding Conference DUC2002 dataset
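The binary-encoded representation described above can be sketched as follows: each chromosome is a 0/1 vector over the sentences, and the fitness rewards coverage of the collection's terms while penalizing overlap between selected sentences. The scoring terms and the toy GA loop are assumptions for illustration, not the paper's exact fitness function or operators.

```python
import random

# Toy "documents": each sentence reduced to its set of terms
sentences = [
    {"ga", "selects", "sentences"},
    {"redundant", "sentences", "repeat", "information"},
    {"ga", "optimises", "a", "fitness"},
    {"summaries", "keep", "salient", "sentences"},
]

def fitness(chrom):
    """Coverage of distinct terms minus pairwise redundancy."""
    chosen = [s for bit, s in zip(chrom, sentences) if bit]
    if not chosen:
        return 0.0
    covered = set().union(*chosen)
    overlap = sum(len(a & b) for i, a in enumerate(chosen)
                  for b in chosen[i + 1:])
    return len(covered) - overlap

def mutate(chrom, rate=0.25):
    """Flip each bit with probability `rate`."""
    return [b ^ (random.random() < rate) for b in chrom]

random.seed(0)
pop = [[random.randint(0, 1) for _ in sentences] for _ in range(20)]
for _ in range(50):                        # keep elites, mutate copies
    pop.sort(key=fitness, reverse=True)
    pop = pop[:10] + [mutate(c) for c in pop[:10]]
best = max(pop, key=fitness)
print(best)
```

A full implementation would add crossover and the local repair operator (e.g. enforcing a summary-length constraint after mutation), which the sketch omits.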