The Digital Elevation Model (DEM) is one of the established techniques for relief representation. DEM construction is defined as the modelling of the earth's surface from existing data. DEMs serve as one of the fundamental information requirements commonly used in GIS data structures. The main aim of this research is to present a methodology for assessing DEM generation methods. The DEM data are extracted from open sources such as Google Earth, and the tested data are compared with data produced by official institutions such as the General Directorate of Surveying. The study area lies in southern Iraq (Al-Gharraf, Dhi Qar governorate). The DEM creation methods examined are kriging, inverse distance weighting (IDW), spline, and natural neighbor. Several software packages were used for processing and analysis, including ArcGIS 10.2, TCX, and Civil 3D. A two-sample t-test was adopted to investigate the mean elevation differences between the compared datasets. The results showed that spline is the most suitable method for building a DEM in this study area.
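As a minimal illustration of two of the steps named in this abstract, the sketch below interpolates scattered elevation samples with IDW and then compares the result against reference elevations with a two-sample t-test. The coordinates, elevations, and the power parameter are purely illustrative assumptions, not the study's data or exact workflow.

```python
# Hypothetical sketch: IDW interpolation of scattered elevation points onto
# grid nodes, followed by a two-sample t-test on the elevations.
# All sample values below are made up for illustration.
import numpy as np
from scipy import stats

def idw(xy_known, z_known, xy_query, power=2.0):
    """Inverse-distance-weighted elevation estimate at each query point."""
    z_est = np.empty(len(xy_query))
    for i, q in enumerate(xy_query):
        d = np.linalg.norm(xy_known - q, axis=1)
        if np.any(d < 1e-12):              # query coincides with a known point
            z_est[i] = z_known[np.argmin(d)]
            continue
        w = 1.0 / d**power
        z_est[i] = np.sum(w * z_known) / np.sum(w)
    return z_est

# Illustrative scattered elevation samples (x, y in metres; z in metres)
pts = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0], [5.0, 5.0]])
elev = np.array([3.2, 3.5, 3.1, 3.8, 3.4])

# Interpolate onto a few grid nodes
grid = np.array([[2.0, 2.0], [7.0, 3.0], [4.0, 8.0]])
dem_idw = idw(pts, elev, grid)

# Two-sample t-test: do the interpolated and reference elevations differ in mean?
reference = np.array([3.25, 3.55, 3.30])   # e.g. surveyed benchmark heights
t_stat, p_value = stats.ttest_ind(dem_idw, reference)
print(dem_idw, t_stat, p_value)
```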
In this work we present a technique to extract heart contours from noisy echocardiograph images. Our technique improves the image before applying contour detection, in order to reduce heavy noise and obtain better image quality. To achieve this, we combine several pre-processing techniques (filtering, morphological operations, and contrast adjustment) to address unclear edges and enhance the low contrast of echocardiograph images. After applying these techniques, traditional edge detection methods yield a legible delineation of the heart boundaries and valve movement.
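The following OpenCV sketch shows one plausible pre-processing chain of the kind described above: denoising, morphological smoothing, and contrast enhancement before a classical edge detector. The file name, kernel sizes, and thresholds are assumptions for illustration, not the paper's exact settings.

```python
# Hypothetical pre-processing pipeline before edge detection on an
# echocardiograph frame. Parameter values are illustrative only.
import cv2

img = cv2.imread("echo_frame.png", cv2.IMREAD_GRAYSCALE)

# 1. Reduce speckle-like noise with a median filter
den = cv2.medianBlur(img, 5)

# 2. Morphological closing to fill small gaps along the cavity boundary
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (7, 7))
closed = cv2.morphologyEx(den, cv2.MORPH_CLOSE, kernel)

# 3. Contrast adjustment (CLAHE) to lift the low-contrast myocardium
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
enhanced = clahe.apply(closed)

# 4. Traditional edge detection on the enhanced frame
edges = cv2.Canny(enhanced, 50, 150)
cv2.imwrite("echo_edges.png", edges)
```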
In this paper, a fast lossless image compression method is introduced for compressing medical images. It is based on splitting image blocks according to their nature and using polynomial approximation to decompose the image signal, followed by run-length coding of the residue part of the image, which represents the error caused by the polynomial approximation. Huffman coding is then applied as a final stage to encode the polynomial coefficients and the run-length codes. The test results indicate that the suggested method can achieve promising performance.
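A minimal sketch of the core idea follows: approximate a block with a low-order polynomial, keep the integer residue, and run-length encode it; a final entropy coder (e.g. Huffman) would then pack the coefficients and runs. This is not the paper's exact scheme, and the block, polynomial order, and helper names are assumptions.

```python
# Illustrative sketch: polynomial approximation of a 1-D block plus
# run-length coding of the residue (the approximation error).
import numpy as np

def encode_block(block, order=1):
    """Fit a polynomial to a 1-D block and return (coefficients, residue)."""
    x = np.arange(block.size)
    coeffs = np.polyfit(x, block.astype(float), order)
    approx = np.rint(np.polyval(coeffs, x)).astype(int)
    residue = block.astype(int) - approx      # error of the approximation
    return coeffs, residue

def run_length_encode(values):
    """Return [value, count] pairs for consecutive repeats."""
    runs = []
    for v in values:
        if runs and runs[-1][0] == v:
            runs[-1][1] += 1
        else:
            runs.append([v, 1])
    return runs

row = np.array([10, 11, 12, 12, 13, 14, 14, 15], dtype=np.uint8)
coeffs, residue = encode_block(row)
print(run_length_encode(residue.tolist()))
```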
Cost is the essence of any production process, as it is one of the requirements for the continuity of activities, for increasing the profitability of the economic unit, and for supporting its competitive position in the market. Therefore, there should be overall control to reduce cost without compromising product quality. To achieve this, management should have detailed, credible, and reliable cost information that can be measured, collected, and understood, in order to analyze the causes behind deviations and the obstacles management faces, and to identify the factors that trigger the emergence of these deviations and obstacles.
The research deals with a very important topic: social security viewed in the context of criminal protection of state security and the challenges it faces after a decisive change in the methods of war. The research also presents a different division of the generations of wars, limiting itself to four of them based on the change in strategic war objectives rather than merely the means of waging them, since these means are not suitable for describing the real changes in the patterns of wars and the goals they seek to achieve. The research stresses the importance of placing the concept of state security in its correct framework, as part of social security, so that the interest of the political system and the
This article aims to determine the time-dependent heat coefficient together with the temperature solution for a type of semi-linear time-fractional inverse source problem by applying a method based on a finite difference scheme and Tikhonov regularization. An unconditionally stable implicit finite difference scheme is used as the direct (forward) solver, while the inverse problem is reformulated as a nonlinear least-squares minimization and solved efficiently by the MATLAB routine lsqnonlin from the Optimization Toolbox. Since the problem is generally ill-posed, any error in the input data will produce a large error in the output data; therefore, the Tikhonov regularization technique is applied
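As a rough analogue of the minimization step described above, the sketch below solves a Tikhonov-regularized least-squares problem with scipy.optimize.least_squares (playing the role that lsqnonlin plays in the paper). The forward operator, data, and regularization parameter are made-up stand-ins, not the article's fractional-diffusion formulation.

```python
# Illustrative sketch only: Tikhonov-regularized least squares via a
# nonlinear least-squares solver. A, b, and lam are assumptions.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)
A = rng.normal(size=(20, 10))                 # stand-in for the forward solver
x_true = np.linspace(0.0, 1.0, 10)
b = A @ x_true + 1e-3 * rng.normal(size=20)   # noisy measurements

lam = 1e-2                                    # regularization parameter

def residuals(x):
    # Stack the data misfit with the Tikhonov penalty sqrt(lam) * x
    return np.concatenate([A @ x - b, np.sqrt(lam) * x])

sol = least_squares(residuals, x0=np.zeros(10))
print(sol.x)
```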
The purpose of this study is to assess the level of computer literacy of students at Al-Quds Open University, Tulkarm Branch. The study investigates the effects of the variables of gender, specialization, academic level, owning a personal computer, internet access, and owning an email account on the level of computer literacy. A computer literacy test was constructed by the researchers, consisting of (37) multiple-choice items with a reliability coefficient of (0.85). The study population consisted of (4100) students, while the sample contained (352) students. The study revealed that the level of computer literacy of the students at Al-Quds Open University, Tulkarm Branch was (66.2%), which is educationally acceptable. It also