Determining the geographical coordinates of sites with high accuracy and in a short time is very important in many applications, including air and sea navigation and geodetic surveys. Today, the Global Positioning System (GPS) plays an important role in performing this task. The datum used for GPS positioning is the World Geodetic System 1984 (WGS84). It consists of a three-dimensional Cartesian coordinate system and an associated ellipsoid, so that WGS84 positions can be described as latitude, longitude, and ellipsoid height (h) with respect to the center of mass of the Earth. This study develops a mathematical model for correcting geomatic measurements of ellipsoidal height (h) between two receivers of different accuracies (i.e. high and low). The results are examined using statistical analysis of the accuracy and reliability of the computed positions. The receivers used in this study were the Topcon HiPer-II and the Garmin eTrex Vista. The first receiver uses Global Navigation Satellite System (GNSS) signals on the L1 and L2 frequencies of both the GPS and GLONASS satellite navigation systems, while the second, the Garmin eTrex Vista, is the lower-accuracy handheld receiver.
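For context on the WGS84 datum described above, the minimal Python sketch below converts geodetic coordinates (latitude, longitude, ellipsoid height h) into Earth-centered Cartesian coordinates using the standard WGS84 ellipsoid constants. It only illustrates the coordinate system, not the correction model developed in the study, and the sample point is hypothetical.

import math

# WGS84 ellipsoid constants
A = 6378137.0                 # semi-major axis (m)
F = 1.0 / 298.257223563       # flattening
E2 = F * (2.0 - F)            # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, h):
    """Convert WGS84 geodetic coordinates (deg, deg, m) to ECEF Cartesian (m)."""
    lat = math.radians(lat_deg)
    lon = math.radians(lon_deg)
    n = A / math.sqrt(1.0 - E2 * math.sin(lat) ** 2)   # prime vertical radius
    x = (n + h) * math.cos(lat) * math.cos(lon)
    y = (n + h) * math.cos(lat) * math.sin(lon)
    z = (n * (1.0 - E2) + h) * math.sin(lat)
    return x, y, z

# Example: hypothetical point at 50 m ellipsoidal height
print(geodetic_to_ecef(33.3, 44.4, 50.0))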
These days, it is crucial to distinguish between different types of human behavior, and artificial intelligence techniques play a large part in that. The characteristics of the feedforward artificial neural network (FANN) algorithm and the genetic algorithm have been combined to create an important working mechanism that aids in this field. The proposed system can be used for essential tasks such as analysis, automation, control, recognition, and others. Crossover and mutation are the two primary mechanisms the genetic algorithm uses in the proposed system to replace the backpropagation process in the ANN. While the feedforward artificial neural network technique is focused on input processing, this should be based on the proce
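The abstract describes replacing backpropagation with genetic-algorithm crossover and mutation to train a feedforward network. The Python sketch below shows that general idea on a toy XOR task; the network size, selection scheme, and all parameter values are illustrative assumptions, not the proposed system.

import numpy as np

rng = np.random.default_rng(0)

def forward(weights, x):
    """Tiny feedforward net: 2 inputs -> 4 hidden (tanh) -> 1 output (sigmoid)."""
    w1 = weights[:8].reshape(2, 4)
    b1 = weights[8:12]
    w2 = weights[12:16].reshape(4, 1)
    b2 = weights[16]
    h = np.tanh(x @ w1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ w2 + b2)))

def fitness(weights, x, y):
    """Negative mean squared error; higher is better."""
    return -np.mean((forward(weights, x).ravel() - y) ** 2)

# XOR as a toy training set
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([0, 1, 1, 0], dtype=float)

pop = rng.normal(size=(50, 17))                      # population of weight vectors
for gen in range(300):
    scores = np.array([fitness(ind, X, Y) for ind in pop])
    parents = pop[np.argsort(scores)[-10:]]          # selection: keep the 10 fittest
    children = []
    while len(children) < len(pop):
        a, b = parents[rng.integers(10, size=2)]
        point = rng.integers(1, 17)                  # single-point crossover
        child = np.concatenate([a[:point], b[point:]])
        child += rng.normal(scale=0.1, size=17) * (rng.random(17) < 0.2)  # mutation
        children.append(child)
    pop = np.array(children)

best = pop[np.argmax([fitness(ind, X, Y) for ind in pop])]
print(forward(best, X).ravel().round(2))             # should approach [0, 1, 1, 0]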
In this research, a nonparametric technique is presented to estimate the time-varying coefficient functions for balanced longitudinal data, which are characterized by observations obtained from (n) independent subjects, each of which is measured repeatedly at a set of (m) specific time points. Although the measurements are independent across different subjects, they are usually correlated within each subject. The applied technique is the local linear kernel (LLPK) technique. To avoid the problems of dimensionality and heavy computation, the two-step method has been used to estimate the coefficient functions by means of the aforementioned technique. Since the two-
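As a hedged illustration of the local linear kernel (LLPK) idea mentioned above, the sketch below estimates a smooth mean curve from pooled longitudinal observations by weighted least squares at each evaluation point. The Gaussian kernel, bandwidth, and toy data are assumptions; the study's two-step estimation of time-varying coefficients is not reproduced here.

import numpy as np

def local_linear(t_grid, t, y, bandwidth):
    """Local linear kernel estimate of a smooth mean curve.

    t, y   : observed time points and responses (pooled over subjects)
    t_grid : points at which the curve is evaluated
    """
    fitted = np.empty_like(t_grid, dtype=float)
    for i, t0 in enumerate(t_grid):
        u = (t - t0) / bandwidth
        w = np.exp(-0.5 * u ** 2)                      # Gaussian kernel weights
        X = np.column_stack([np.ones_like(t), t - t0])
        W = np.diag(w)
        beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)   # weighted least squares
        fitted[i] = beta[0]                            # intercept = estimate at t0
    return fitted

# toy balanced longitudinal data: 5 subjects, 10 time points each
rng = np.random.default_rng(1)
times = np.tile(np.linspace(0, 1, 10), 5)
y = np.sin(2 * np.pi * times) + rng.normal(scale=0.2, size=times.size)
print(local_linear(np.linspace(0, 1, 5), times, y, bandwidth=0.15).round(2))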
This paper presents a fast and robust approach to English text encryption and decryption based on the Pascal matrix. The technique encrypts Arabic or English text, or both, and shows the result of applying the method to a plain text (the original message): the intelligible plain text is transformed into an unintelligible form in order to secure information from unauthorized access and information theft. An encryption scheme usually uses a pseudo-random encryption key generated by an algorithm; here, all of this is done using the Pascal matrix. Encryption and decryption are implemented in MATLAB as the programming language, with Notepad++ used to write the input text.
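The abstract does not spell out exactly how the Pascal matrix is applied, so the sketch below is only one plausible reading: character codes are multiplied by a lower-triangular Pascal matrix, and decryption uses its exact integer inverse. The helper names and the short "HELLO" block are illustrative assumptions; the original work is implemented in MATLAB.

import numpy as np
from math import comb

def pascal_lower(n):
    """Lower-triangular Pascal matrix: L[i, j] = C(i, j)."""
    return np.array([[comb(i, j) for j in range(n)] for i in range(n)], dtype=np.int64)

def pascal_lower_inv(n):
    """Exact integer inverse: Linv[i, j] = (-1)**(i - j) * C(i, j)."""
    return np.array([[(-1) ** (i - j) * comb(i, j) if i >= j else 0
                      for j in range(n)] for i in range(n)], dtype=np.int64)

def encrypt(text):
    """Encrypt a short text block; binomial growth limits practical block size."""
    codes = np.array([ord(c) for c in text], dtype=np.int64)
    return pascal_lower(len(codes)) @ codes          # unintelligible integer vector

def decrypt(cipher):
    codes = pascal_lower_inv(len(cipher)) @ cipher   # exact integer recovery
    return "".join(chr(int(c)) for c in codes)

cipher = encrypt("HELLO")
print(cipher)
print(decrypt(cipher))                               # "HELLO"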
An analytical approach based on field data was used to determine the strength capacity of large-diameter bored piles. The deformations and settlements were also evaluated for both vertical and lateral loading. The analytical predictions were compared to field data obtained from a prototype test pile used at the Tharthar–Tigris canal bridge and were found to be in acceptable agreement, with a deviation of 12%.
Following ASTM standard D1143M-07e1 (2010), a test schedule of five loading cycles was proposed for vertical loads, together with a series of cyclic loads to simulate horizontal loading. The load test results and analytical data of 1.95
Longitudinal data are becoming increasingly common, especially in the medical and economic fields, and various methods have been developed to analyze this type of data.
In this research, the focus was on grouping and analyzing these data, as cluster analysis plays an important role in identifying and grouping co-expressed sub-profiles over time, which are then fitted with the nonparametric cubic smoothing B-spline model. This model provides continuous first and second derivatives, resulting in a smoother curve with fewer abrupt changes in slope; it is also more flexible and can capture more complex patterns and fluctuations in the data.
The balanced longitudinal data profiles were compiled into subgroup
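As a minimal sketch of the cubic smoothing-spline step described above, the code below fits a cubic spline to the mean profile of one hypothetical cluster of subjects and evaluates its continuous first derivative. The synthetic profiles, the smoothing parameter, and the use of scipy's UnivariateSpline are assumptions, not the study's exact implementation.

import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(2)
t = np.linspace(0, 1, 20)                                   # common time grid (balanced design)
profiles = np.sin(2 * np.pi * t) + rng.normal(scale=0.3, size=(8, 20))   # 8 subjects

# mean curve of one (hypothetical) cluster, smoothed with a cubic spline
cluster_mean = profiles.mean(axis=0)
spline = UnivariateSpline(t, cluster_mean, k=3, s=0.5)      # k=3 -> cubic, s controls smoothness

print(spline(t).round(2))                                   # smoothed cluster profile
print(spline.derivative(1)(t).round(2))                     # continuous first derivative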
The paired-sample t-test is a classical test statistic used to test the difference between two means in paired data, but it is not robust against violation of the normality assumption. In this paper, some alternative robust tests are suggested by combining Jackknife resampling with the Wilcoxon signed-rank test for small sample sizes and with the Wilcoxon signed-rank test using the normal approximation for large sample sizes. Monte Carlo simulation experiments were employed to study the performance of these tests in terms of their type I error rates and power. All these tests were applied on different sa
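The abstract does not give the exact construction, so the following is only one plausible sketch of combining Jackknife resampling with the Wilcoxon signed-rank test: leave-one-out signed-rank statistics supply a jackknife variance that replaces the textbook variance in the normal approximation. The function names and the simulated paired sample are assumptions.

import numpy as np
from scipy.stats import rankdata, norm

def signed_rank_stat(d):
    """Wilcoxon signed-rank statistic W+ (zeros dropped, average ranks for ties)."""
    d = d[d != 0]
    ranks = rankdata(np.abs(d))
    return ranks[d > 0].sum()

def jackknife_wilcoxon(x, y):
    """Normal-approximation test of 'no shift' using a jackknife variance of W+."""
    d = np.asarray(x, float) - np.asarray(y, float)
    n = d.size
    loo = np.array([signed_rank_stat(np.delete(d, i)) for i in range(n)])   # leave-one-out stats
    var_jack = (n - 1) / n * np.sum((loo - loo.mean()) ** 2)                # jackknife variance
    w_full = signed_rank_stat(d)
    m = np.count_nonzero(d)
    mu0 = m * (m + 1) / 4.0                                                 # null mean of W+
    z = (w_full - mu0) / np.sqrt(var_jack)
    return z, 2 * norm.sf(abs(z))

rng = np.random.default_rng(3)
before = rng.normal(10.0, 2.0, 30)
after = before + rng.normal(0.5, 1.0, 30)        # small systematic shift between pairs
print(jackknife_wilcoxon(before, after))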
Machine learning offers significant advantages for many problems in the oil and gas industry, especially when it comes to resolving complex challenges in reservoir characterization. Permeability is one of the most difficult petrophysical parameters to predict using conventional logging techniques. Clarifications of the workflow methodology are presented alongside comprehensive models in this study. The purpose of this study is to provide a more robust technique for predicting permeability; previous studies on the Bazirgan field have attempted to do so, but their estimates have been vague, and the methods they used are outdated and do not adequately account for the actual reservoir behavior in the permeability computation. To
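The abstract does not name its final algorithm, so the sketch below only illustrates the general workflow of learning permeability from wireline-log features with a tree-ensemble regressor. The feature names (GR, RHOB, NPHI, DT), the synthetic data, and the choice of a random forest are all assumptions.

import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

# synthetic stand-in for well-log features (GR, RHOB, NPHI, DT) and core permeability
rng = np.random.default_rng(4)
n = 500
logs = rng.normal(size=(n, 4))
log_perm = 1.5 * logs[:, 2] - 0.8 * logs[:, 1] + rng.normal(scale=0.3, size=n)
perm = 10 ** log_perm                               # permeability in mD (roughly log-normal)

X_train, X_test, y_train, y_test = train_test_split(logs, np.log10(perm), random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print("R2 on held-out samples:", round(r2_score(y_test, model.predict(X_test)), 3))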
The investigative film is part of the documentary genre and is considered one of the important film types with high viewership, which gives it great importance in transmitting information and contributing to community awareness through the advertising function performed by the cinematic image. The researcher addresses the concept of advertising, its development through the stages of time, and its functions. The researcher then deals with the cinematic technical elements that contribute to the success of investigative advertising work, such as camera movements, music, sound effects, dialogue, etc., which enrich the investigative picture with realism and the flow of informat
Agent-based modeling is currently used extensively to analyze complex systems. It has supported such growth because it is able to convey distinct levels of interaction in a complex, detailed environment. Meanwhile, agent-based models tend to become progressively more complex, so powerful modeling and simulation techniques are needed to address this rise in complexity. In recent years, a number of platforms for developing agent-based models have been developed. In practice, most agent-based models use a discrete representation of the environment and present only one level of interaction, while two or three levels are rarely considered. The key issue is that modellers' work in these areas is not assisted by simulation plat
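To make the "levels of interaction" mentioned above concrete, the toy sketch below runs one simulation tick in which each agent interacts both with its environment (a resource cell) and with neighbouring agents. The world layout, rules, and class names are illustrative assumptions, not any particular platform's API.

import random

class Agent:
    def __init__(self, energy=5):
        self.energy = energy

    def step(self, cell_resources, neighbours):
        # level 1: agent-environment interaction (consume local resources)
        taken = min(cell_resources, 1)
        self.energy += taken
        # level 2: agent-agent interaction (share energy with a poorer neighbour)
        if neighbours:
            poorest = min(neighbours, key=lambda a: a.energy)
            if poorest.energy < self.energy - 2:
                poorest.energy += 1
                self.energy -= 1
        return cell_resources - taken

# one simulation tick on a toy 1-D world
random.seed(0)
world = [random.randint(0, 3) for _ in range(10)]          # resources per cell
agents = {i: Agent() for i in range(0, 10, 2)}             # agents at even cells
for pos, agent in agents.items():
    neighbours = [a for p, a in agents.items() if abs(p - pos) == 2]
    world[pos] = agent.step(world[pos], neighbours)
print([a.energy for a in agents.values()], world)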
Realizing and understanding semantic segmentation is a demanding task not only in computer vision but also in the earth sciences. Semantic segmentation decomposes compound architectures into individual elements; the most common objects in outdoor or indoor civil scenes must be classified and then enriched with the semantic meaning of each object. It is a method for labeling and clustering point clouds automatically. Classification of three-dimensional natural scenes requires a point cloud dataset as the input data representation, and many challenges arise when working with 3D data, such as the small number, resolution, and accuracy of three-dimensional datasets. Deep learning now is the po
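Since the passage points to deep learning for labeling point clouds, the sketch below trains a tiny per-point MLP (in the spirit of a shared-MLP, PointNet-style head, without the global features a real network would use) on a synthetic scene of ground, wall, and clutter points. The classes, architecture, and data are purely illustrative assumptions.

import torch
import torch.nn as nn

class PerPointMLP(nn.Module):
    """Assign a class score to every xyz point independently."""
    def __init__(self, num_classes=3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(3, 32), nn.ReLU(),
            nn.Linear(32, 32), nn.ReLU(),
            nn.Linear(32, num_classes),            # per-point class scores
        )

    def forward(self, points):                     # points: (N, 3) xyz
        return self.net(points)

# synthetic scene: ground plane (class 0), wall (class 1), scattered clutter (class 2)
torch.manual_seed(0)
ground = torch.rand(200, 3) * torch.tensor([10.0, 10.0, 0.1])
wall = torch.rand(200, 3) * torch.tensor([0.1, 10.0, 3.0])
clutter = torch.rand(100, 3) * torch.tensor([10.0, 10.0, 3.0])
points = torch.cat([ground, wall, clutter])
labels = torch.cat([torch.zeros(200), torch.ones(200), 2 * torch.ones(100)]).long()

model = PerPointMLP()
optim = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()
for _ in range(200):                               # short training loop on the toy scene
    optim.zero_grad()
    loss = loss_fn(model(points), labels)
    loss.backward()
    optim.step()
print("train accuracy:", (model(points).argmax(1) == labels).float().mean().item())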