A Digital Elevation Model (DEM) is one of the established techniques for relief representation. DEM construction is defined as modeling the Earth's surface from existing data. The DEM serves as a fundamental information layer that is widely used in GIS data structures. The main aim of this research is to present a methodology for assessing DEM generation methods. The DEM data were extracted from open-source data, e.g. Google Earth, and the tested data were compared with data produced by official institutions such as the General Directorate of Surveying. The study area is in southern Iraq (Al-Gharraf, Dhi Qar governorate). The DEM creation methods are kriging, inverse distance weighting (IDW), spline, and natural neighbor. This research used different software packages for processing and analysis, such as ArcGIS 10.2, TCX, and Civil 3D. A two-sample t-test was adopted to investigate the mean of the elevation differences between the compared datasets. The results showed that spline is the best method for building a DEM in this study area.
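Of the four interpolation methods named above, IDW is the simplest to illustrate: each unknown elevation is a weighted average of the known points, with weights inversely proportional to distance raised to a power. The sketch below is a minimal illustration, not the abstract's actual workflow; the sample coordinates, elevations, and power parameter are hypothetical.

```python
import numpy as np

def idw_interpolate(xy_known, z_known, xy_query, power=2.0):
    """Inverse distance weighting: each query elevation is a weighted
    average of known elevations, with weights 1 / d**power."""
    xy_known = np.asarray(xy_known, dtype=float)
    z_known = np.asarray(z_known, dtype=float)
    xy_query = np.asarray(xy_query, dtype=float)
    # Pairwise distances between every query point and every known point.
    d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
    z = np.empty(len(xy_query))
    for i, di in enumerate(d):
        hit = di < 1e-12
        if hit.any():
            # Query coincides with a known point: return its elevation.
            z[i] = z_known[hit][0]
        else:
            w = 1.0 / di**power
            z[i] = np.sum(w * z_known) / np.sum(w)
    return z

# Hypothetical sample points (x, y) with elevations in metres.
pts = [(0, 0), (0, 10), (10, 0), (10, 10)]
elev = [5.0, 6.0, 7.0, 8.0]
print(idw_interpolate(pts, elev, [(5, 5)]))  # centre point: all weights equal
```

At the centre of the square all four distances are equal, so the result reduces to the plain mean of the four elevations (6.5 m), which is a convenient sanity check for any IDW implementation.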
Background: Hyperfunction of the upper-lip muscles is considered the most common cause of excessive gingival display (EGD). The aim of this study was to demonstrate the effectiveness of botulinum toxin (BT) injection as a conservative treatment for EGD due to muscular hyperfunction and to compare the outcomes of two injection methods. Materials and methods: This study included 40 participants who were randomly assigned into two groups of 20 each. The first group received a 2.5 IU BT injection at one point per side (2-points group), while the second group received a total of 5 IU of BT at two points per side (4-points group). The outcome variables were the reduction in the central and lateral gingival display, expressed as the difference between
A mixture model is used to model data that come from more than one component. In recent years, it has become an effective tool for drawing inferences about the complex data we might come across in real life. Moreover, it can serve as a powerful confirmatory tool for classifying observations based on the similarities among them. In this paper, several mixture regression-based methods were conducted under the assumption that the data come from a finite number of components. These methods were compared according to their results in estimating the component parameters. Observation membership was also inferred and assessed for each method. The results showed that the flexible mixture model outperformed the
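Mixture models of the kind described above are typically fitted with the EM algorithm, which alternates between computing each observation's component membership (responsibilities) and re-estimating the component parameters. The following is a minimal 1-D Gaussian-mixture sketch for illustration only; it is not the paper's flexible mixture regression method, and the simulated data are hypothetical.

```python
import numpy as np

def em_gmm_1d(x, n_iter=200):
    """EM for a two-component 1-D Gaussian mixture (illustrative sketch)."""
    x = np.asarray(x, dtype=float)
    # Initialize the two means at the data extremes to keep them separated.
    mu = np.array([x.min(), x.max()])
    sigma = np.array([x.std(), x.std()])
    pi = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: responsibilities = posterior component memberships.
        dens = np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / (
            sigma * np.sqrt(2 * np.pi))
        r = pi * dens
        r /= r.sum(axis=1, keepdims=True)
        # M-step: update weights, means, and standard deviations.
        nk = r.sum(axis=0)
        pi = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
    return pi, mu, sigma, r

rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(0, 1, 300), rng.normal(6, 1, 300)])
pi, mu, sigma, resp = em_gmm_1d(data)
```

The responsibilities `resp` give the inferred observation memberships mentioned in the abstract: assigning each point to its highest-responsibility component recovers the two simulated clusters.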
A simple setup for a random number generator is proposed. The random number generation is based on shot-noise fluctuations in a p-i-n photodiode. These fluctuations, defined as shot noise, arise from a stationary random process whose statistical properties reflect the Poisson statistics associated with photon streams. Shot noise has its origin in the quantum nature of light and is related to vacuum fluctuations. Two photodiodes were used and their shot-noise fluctuations were subtracted; the difference was applied to a comparator to obtain the random sequence.
In this paper, reliable computational methods (RCMs) based on the monomial standard polynomials have been executed to solve the problem of Jeffery-Hamel flow (JHF). In addition, convenient basis functions, namely the Bernoulli, Euler, and Laguerre polynomials, have been used to enhance the reliability of the computational methods. Using such functions turns the problem into a solvable nonlinear algebraic system that Mathematica® 12 can solve. The JHF problem has been solved with the help of improved reliable computational methods (I-RCMs), and a review of the methods is given. Published results are also used for comparison. As further evidence of the accuracy and dependability of the proposed methods, the maximum error remainder
In this work, the notion is defined and some properties of this set are studied; two further set concepts are also defined. Many examples are cited to show that the converses of the propositions and remarks do not hold. In addition, a new application example of the nano concept was studied.
The exponential distribution is one of the most common distributions in studies and scientific research, with wide application in the fields of reliability, engineering, and survival analysis; therefore, researchers have carried out extensive studies on the characteristics of this distribution.
In this research, the survival function of the truncated exponential distribution is estimated using the maximum likelihood method, the first and second Bayes methods, the least squares method, and the jackknife method (which depends primarily on the maximum likelihood method). The estimators are then compared using simulation; to accomplish this task, samples of different sizes were adopted by the researcher
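For the maximum likelihood route mentioned above, the right-truncated exponential with pdf λe^(−λx)/(1 − e^(−λT)) on (0, T) has no closed-form MLE for λ, so the score equation is solved numerically. The sketch below is an illustration under assumed values (λ = 2, T = 1, rejection sampling), not the paper's simulation design.

```python
import math
import random

def mle_rate_truncated_exp(x, T):
    """Solve the score equation of the right-truncated exponential
    (pdf λ e^{-λx} / (1 - e^{-λT}) on (0, T)) for λ by bisection."""
    n, s = len(x), sum(x)
    def score(lam):
        # d/dλ of the log-likelihood n·ln λ − λΣx − n·ln(1 − e^{-λT}).
        return n / lam - s - n * T * math.exp(-lam * T) / (1 - math.exp(-lam * T))
    lo, hi = 1e-6, 100.0  # the score is positive near 0 and negative for large λ
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if score(mid) > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def survival(t, lam, T):
    """Survival function S(t) = P(X > t) of the truncated model."""
    return (math.exp(-lam * t) - math.exp(-lam * T)) / (1 - math.exp(-lam * T))

# Simulate exponential(λ=2) data truncated at T=1 by rejection.
random.seed(0)
T, lam_true = 1.0, 2.0
sample = []
while len(sample) < 5000:
    v = random.expovariate(lam_true)
    if v < T:
        sample.append(v)
lam_hat = mle_rate_truncated_exp(sample, T)
```

Plugging `lam_hat` into `survival` gives the estimated survival curve; by construction it satisfies S(0) = 1 and S(T) = 0, unlike the untruncated exponential's e^(−λt).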
The optimum balance values for different spherical aberration coefficients (third and fifth order, together with focal shift) were studied. The optical system includes different apertures (circle, ellipse, square, and triangle), characterized using the point spread function (PSF). Using the Maréchal method, the minimum value of the mean-square wavefront variance was found, so the maximum central intensity according to the Strehl criterion can be obtained.
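The link between the mean-square wavefront variance and the central intensity is the Maréchal approximation, which for small aberrations relates the Strehl ratio to the RMS wavefront error. The snippet below only illustrates that standard relation; it is not the paper's aperture-specific computation.

```python
import math

def strehl_marechal(rms_waves):
    """Maréchal approximation: S ≈ 1 − (2πσ)², with σ the RMS wavefront
    error in waves. Valid for small aberrations; the exponential form
    exp(−(2πσ)²) is often used to extend its range."""
    return 1.0 - (2 * math.pi * rms_waves) ** 2

# Classic diffraction-limited criterion: σ = λ/14 gives S ≈ 0.8.
print(round(strehl_marechal(1 / 14), 3))
```

Minimizing the wavefront variance, as done in the abstract via the Maréchal method, therefore directly maximizes the Strehl ratio and hence the central intensity of the PSF.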
An image retrieval system is a computer system for browsing, searching, and retrieving images from a large database of digital images. The objective of content-based image retrieval (CBIR) methods is essentially to extract, from large image databases, a specified number of images similar in visual and semantic content to a so-called query image. The researchers developed a new retrieval mechanism that is based mainly on two procedures. The first procedure extracts the statistical features of both the original and the traditional image using the histogram and statistical characteristics (mean, standard deviation). The second procedure relies on the T-
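The first procedure above, histogram-based mean and standard deviation features, can be sketched as follows. This is a generic first-order-statistics illustration, not the authors' exact feature extractor; the 8-bit range, bin count, Euclidean distance, and sample images are all assumptions.

```python
import numpy as np

def histogram_features(img, bins=256):
    """First-order statistical features from the grey-level histogram:
    mean and standard deviation of the intensity distribution."""
    img = np.asarray(img, dtype=float).ravel()
    hist, edges = np.histogram(img, bins=bins, range=(0, 256))
    p = hist / hist.sum()                       # normalized histogram
    centers = 0.5 * (edges[:-1] + edges[1:])    # bin mid-points
    mean = np.sum(centers * p)
    std = np.sqrt(np.sum((centers - mean) ** 2 * p))
    return mean, std

def feature_distance(f1, f2):
    """Euclidean distance between feature vectors; smaller = more similar."""
    return float(np.linalg.norm(np.asarray(f1) - np.asarray(f2)))

# Hypothetical 8-bit images: one full-range, one low-contrast.
rng = np.random.default_rng(0)
query = rng.integers(0, 256, (64, 64))
candidate = rng.integers(100, 156, (64, 64))
d = feature_distance(histogram_features(query), histogram_features(candidate))
```

Ranking database images by `feature_distance` to the query image is the basic retrieval step such features support: the low-contrast candidate scores a large distance here mainly because its intensity standard deviation is far smaller than the query's.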