Determining geographical coordinates with high accuracy and in a short time is important in many applications, including air and sea navigation and geodetic surveying. Today, the Global Positioning System (GPS) plays an important role in performing this task. The datum used for GPS positioning is the World Geodetic System 1984 (WGS84). It consists of a three-dimensional Cartesian coordinate system and an associated ellipsoid, so that WGS84 positions are described by latitude, longitude, and ellipsoid height (h) with respect to the center of mass of the Earth. This study develops a mathematical model for correcting geomatic measurements of ellipsoidal height (h) between two receivers of different accuracies (i.e., high and low). The results are examined using statistical analysis of the accuracy and reliability of the computed positions. The receivers used in this study were the Topcon HiPer-II and the Garmin eTrex Vista. The first tracks Global Navigation Satellite System (GNSS) signals on the L1 and L2 frequencies of both the GPS and GLONASS constellations, while the second is a low-cost handheld receiver.
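As a rough illustration of how such a correction can be derived and assessed (the paper's actual model is not specified here, and the paired station heights below are invented for the example), a simple least-squares fit between ellipsoidal heights observed by the two receivers, with the RMSE as the accuracy measure, might look like this in Python:

import numpy as np

# Hypothetical paired ellipsoidal heights (metres) at the same stations:
# h_ref from the geodetic-grade receiver, h_low from the handheld receiver.
h_ref = np.array([34.12, 35.87, 33.95, 36.40, 34.78])
h_low = np.array([35.01, 36.95, 34.60, 37.55, 35.70])

# Fit a simple linear correction h_ref ~ a*h_low + b by least squares.
a, b = np.polyfit(h_low, h_ref, deg=1)
h_corr = a * h_low + b

def rmse(x, y):
    """Root-mean-square error between two height series."""
    return float(np.sqrt(np.mean((x - y) ** 2)))

print(f"correction: h_corr = {a:.4f}*h_low + {b:.4f}")
print(f"RMSE before correction: {rmse(h_low, h_ref):.3f} m")
print(f"RMSE after  correction: {rmse(h_corr, h_ref):.3f} m")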
In this paper, a method for data encryption is proposed using two secret keys: the first is a matrix of XOR and NOT gates (the XN key), and the second is a binary matrix (the KEYB key). XN and KEYB are m×n matrices with m equal to n (i.e., square matrices). The paper also proposes a strategy for generating the secret keys (KEYBs) using the concept of Linear Feedback Shift Registers (LFSRs), depending on a secret starting point (a third secret key, the s-key). The proposed method is named X.K.N. X.K.N is a symmetric encryption scheme that treats the data as a set of blocks during preprocessing and then encrypts the binary data as a stream cipher.
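As a rough sketch of the key-generation idea (not the actual X.K.N specification: the seed, tap positions, matrix size, and key values below are invented for the example), the following Python code derives a binary key matrix from an LFSR seeded with a secret starting point and combines it with an XOR/NOT selector matrix:

import numpy as np

def lfsr_keystream(seed_bits, taps, n_bits):
    """Generate n_bits of keystream from a Fibonacci LFSR.
    seed_bits : list of 0/1 used as the secret starting point (s-key).
    taps      : positions (0-based) XORed together to form the feedback bit."""
    state = list(seed_bits)
    out = []
    for _ in range(n_bits):
        out.append(state[-1])                      # output bit
        fb = 0
        for t in taps:                             # feedback = XOR of tap bits
            fb ^= state[t]
        state = [fb] + state[:-1]                  # shift the register
    return np.array(out, dtype=np.uint8)

def encrypt(bits, keyb, xn_mask):
    """Combine a binary key matrix (KEYB) with an XOR/NOT selector (XN):
    where xn_mask == 1 the key bit is inverted (NOT) before XORing,
    where xn_mask == 0 it is used as-is (plain XOR)."""
    effective_key = np.where(xn_mask == 1, 1 - keyb, keyb)
    return bits ^ effective_key

# Toy 4x4 example with invented values.
seed = [1, 0, 1, 1, 0, 0, 1, 0]
keyb = lfsr_keystream(seed, taps=[0, 2], n_bits=16).reshape(4, 4)
xn   = np.random.randint(0, 2, size=(4, 4), dtype=np.uint8)   # XN key
data = np.random.randint(0, 2, size=(4, 4), dtype=np.uint8)   # plaintext bits

cipher = encrypt(data, keyb, xn)
assert np.array_equal(encrypt(cipher, keyb, xn), data)  # same call decrypts

Because XORing with a fixed effective key is an involution, applying the same routine to the ciphertext recovers the plaintext, which is what makes such a scheme symmetric.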
Big data analysis is essential for modern applications in areas such as healthcare, assistive technology, intelligent transportation, and environment and climate monitoring. Traditional data mining and machine learning algorithms do not scale well with data size. Mining and learning from big data require time- and memory-efficient techniques, albeit at the cost of a possible loss in accuracy. We have developed a data aggregation structure to summarize data with a large number of instances and data generated from multiple sources. Data are aggregated at multiple resolutions, and the resolution provides a trade-off between efficiency and accuracy. The structure is built once, updated incrementally, and serves as a common data input for multiple mining and learning tasks.
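A minimal sketch of the general idea, assuming equal-width binning and (count, sum) summaries per bin (the authors' actual structure and query types are not specified here):

import numpy as np

class MultiResolutionAggregate:
    """Toy multi-resolution summary of a one-dimensional numeric stream.
    Each resolution partitions the value range into equal-width bins and
    keeps only (count, sum) per bin, so memory stays fixed as data grow;
    coarser resolutions are cheaper to query, finer ones more accurate."""

    def __init__(self, lo, hi, resolutions=(8, 64, 512)):
        self.lo, self.hi = lo, hi
        self.levels = {r: np.zeros((r, 2)) for r in resolutions}  # [count, sum]

    def update(self, x):
        # Incrementally add one instance to every resolution.
        for r, bins in self.levels.items():
            i = min(int((x - self.lo) / (self.hi - self.lo) * r), r - 1)
            bins[i, 0] += 1
            bins[i, 1] += x

    def frac_below(self, t, resolution):
        # Approximate fraction of instances below threshold t using only
        # the bins that lie entirely below t; finer resolution, smaller error.
        r, bins = resolution, self.levels[resolution]
        width = (self.hi - self.lo) / r
        upper_edges = self.lo + width * np.arange(1, r + 1)
        below = bins[upper_edges <= t, 0].sum()
        return below / max(bins[:, 0].sum(), 1)

agg = MultiResolutionAggregate(lo=0.0, hi=100.0)
for x in np.random.uniform(0, 100, 10_000):
    agg.update(x)
print({r: round(agg.frac_below(37.5, r), 4) for r in agg.levels})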
In the presence of deep submicron noise, providing reliable and energy-efficient network-on-chip operation is becoming a challenging objective. In this study, the authors propose a hybrid automatic repeat request (HARQ)-based coding scheme that simultaneously reduces crosstalk-induced bus delay and provides multi-bit error protection while achieving high energy savings. This is achieved by calculating two-dimensional parities and duplicating all the bits, which provides single-error correction and six-error detection. The error correction reduces the performance degradation caused by retransmissions; combined with voltage swing reduction, enabled by the scheme's high error detection capability, this yields high energy savings.
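To make the coding idea concrete, here is a toy Python illustration of two-dimensional parity with bit duplication; it is not the authors' exact codec, and the data layout, dimensions, and decoding policy are assumptions made for the example:

import numpy as np

def encode_2d_parity_dup(data_bits, rows, cols):
    """Arrange data in a rows x cols matrix, append row and column
    parities, then duplicate every bit (illustrative encoder only)."""
    m = np.array(data_bits, dtype=np.uint8).reshape(rows, cols)
    row_par = m.sum(axis=1) % 2               # one parity bit per row
    col_par = m.sum(axis=0) % 2               # one parity bit per column
    codeword = np.concatenate([m.ravel(), row_par, col_par])
    return np.repeat(codeword, 2)             # duplicate all bits

def correct_single_error(received, rows, cols):
    """Locate and flip a single data-bit error via the 2-D parity check."""
    word = received[::2].copy()               # take the first copy of each bit
    m = word[:rows * cols].reshape(rows, cols)
    row_par = word[rows * cols: rows * cols + rows]
    col_par = word[rows * cols + rows:]
    bad_r = np.where(m.sum(axis=1) % 2 != row_par)[0]
    bad_c = np.where(m.sum(axis=0) % 2 != col_par)[0]
    if len(bad_r) == 1 and len(bad_c) == 1:   # single error: row/column intersection
        m[bad_r[0], bad_c[0]] ^= 1
    return m.ravel()

data = np.random.randint(0, 2, 16)
cw = encode_2d_parity_dup(data, 4, 4)
rx = cw.copy()
rx[2 * 5] ^= 1                                # flip one copy of a data bit
assert np.array_equal(correct_single_error(rx, 4, 4), data)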
Evaporation is one of the major components of the hydrological cycle in nature, so its accurate estimation is important for planning and managing irrigation practices and for assessing water availability and requirements. The aim of this study is to investigate the ability of a fuzzy inference system to estimate monthly pan evaporation from meteorological data. The study was carried out using 261 monthly measurements of temperature (T), relative humidity (RH), and wind speed (W) available from the Emara meteorological station, southern Iraq. Three different fuzzy models comprising various combinations of the monthly climatic variables (temperature, wind speed, and relative humidity) were developed.
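A minimal sketch of how climatic inputs can be mapped to pan evaporation with fuzzy rules, written here as a zero-order Sugeno (weighted-average) system; the membership ranges, rules, and output values are illustrative placeholders, not the calibrated models of the study:

def trimf(x, a, b, c):
    """Triangular membership degree of x for the fuzzy set (a, b, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def estimate_pan_evaporation(T, RH, W):
    """Tiny rule base mapping temperature (deg C), relative humidity (%)
    and wind speed (m/s) to monthly pan evaporation (mm); all numbers
    are placeholders for illustration."""
    hot   = trimf(T, 25, 40, 55)
    mild  = trimf(T, 10, 25, 40)
    dry   = trimf(RH, 0, 20, 50)
    humid = trimf(RH, 30, 70, 100)
    windy = trimf(W, 2, 6, 12)
    calm  = trimf(W, 0, 1, 3)

    # Each rule: (firing strength, crisp output it points to, mm/month).
    rules = [
        (min(hot, dry, windy), 450.0),   # hot, dry and windy -> very high
        (min(hot, dry),        350.0),
        (min(mild, humid),     150.0),
        (min(mild, calm),      100.0),
    ]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den > 0 else float("nan")

print(round(estimate_pan_evaporation(T=38.0, RH=25.0, W=5.0), 1), "mm/month")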
Regression analysis is one of the statistical methods used to describe and estimate the relationship between a random response variable (Y) and explanatory variables (X), classically under assumptions such as homogeneity of variance. In logistic regression, by contrast, the dependent variable is a binary response that takes two values (one when a specific event occurs and zero when it does not), such as injured/uninjured or married/unmarried. A large number of explanatory variables leads to the problem of multicollinearity, which makes the estimates inaccurate. The maximum likelihood method and the ridge method were therefore used to estimate a binary-response logistic regression model, adopting the Jackknife technique.
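A minimal sketch of ridge-penalised (L2) logistic regression combined with a jackknife (leave-one-out) resampling loop, using synthetic data; this illustrates the general approach only, not the authors' estimator:

import numpy as np
from sklearn.linear_model import LogisticRegression

# Illustrative data: X are explanatory variables (one pair nearly collinear),
# y is the binary response (1 = event occurred, 0 = it did not).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
X[:, 4] = X[:, 3] + 0.01 * rng.normal(size=100)     # near-collinear column
y = (X[:, 0] + X[:, 3] + rng.normal(size=100) > 0).astype(int)

# Ridge-penalised (L2) logistic regression counters multicollinearity.
model = LogisticRegression(penalty="l2", C=1.0, max_iter=1000)

# Jackknife (leave-one-out) estimate of the coefficients' variability.
n = len(y)
coefs = np.empty((n, X.shape[1]))
for i in range(n):
    keep = np.arange(n) != i
    coefs[i] = model.fit(X[keep], y[keep]).coef_[0]
jack_mean = coefs.mean(axis=0)
jack_se = np.sqrt((n - 1) / n * ((coefs - jack_mean) ** 2).sum(axis=0))
print("jackknife mean coefficients:", np.round(jack_mean, 3))
print("jackknife standard errors:  ", np.round(jack_se, 3))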
Construction contractors usually undertake multiple construction projects simultaneously. Such a situation involves sharing different types of resources, including money, equipment, and manpower, which can become a major challenge in many cases. In this study, the financial aspects of working on multiple projects at a time are addressed and investigated. The study considers dealing with financial shortages by proposing a multi-project scheduling optimization model that maximizes profit while minimizing the total project duration. A genetic algorithm and finance-based scheduling are used to produce feasible schedules that balance the financing of activities at any time.
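A highly simplified sketch of the idea behind finance-based scheduling with a genetic algorithm: each chromosome holds start delays for the activities, and the fitness rewards profit while penalising credit-limit violations and a long overall duration. The activity data, penalty weights, and operators below are illustrative assumptions, not the paper's model:

import random

# Illustrative activities: (duration, cost per period, payment on completion).
ACTS = [(3, 10, 40), (2, 15, 35), (4, 8, 45), (3, 12, 50), (2, 20, 55)]
CREDIT_LIMIT = 30          # maximum allowed negative cash balance
MAX_DELAY = 6              # each gene = start delay in periods

def fitness(delays):
    """Profit minus penalties for credit-limit violations and long makespan."""
    horizon = max(d + dur for d, (dur, _, _) in zip(delays, ACTS)) + 1
    cash = [0.0] * (horizon + 1)
    for d, (dur, cost, pay) in zip(delays, ACTS):
        for t in range(d, d + dur):
            cash[t] -= cost                    # outflow while activity runs
        cash[d + dur] += pay                   # inflow on completion
    balance, worst = 0.0, 0.0
    for c in cash:
        balance += c
        worst = min(worst, balance)            # most negative running balance
    violation = max(0.0, -worst - CREDIT_LIMIT)
    return balance - 10.0 * violation - 1.0 * horizon

def evolve(pop_size=40, generations=200, pm=0.2):
    pop = [[random.randrange(MAX_DELAY) for _ in ACTS] for _ in range(pop_size)]
    for _ in range(generations):
        nxt = []
        for _ in range(pop_size):
            a = max(random.sample(pop, 3), key=fitness)   # tournament selection
            b = max(random.sample(pop, 3), key=fitness)
            child = [random.choice(g) for g in zip(a, b)]  # uniform crossover
            if random.random() < pm:                       # mutation
                child[random.randrange(len(child))] = random.randrange(MAX_DELAY)
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

best = evolve()
print("best start delays:", best, "fitness:", round(fitness(best), 2))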
A new derivative was applied to the old Bouguer gravity map (surveyed in the 1940s and 1950s) for a regional study area covering the middle and south of Iraq. The gravity anomaly reflects density contrast variations; therefore, gravity inversion can be used to build the density and velocity model through layers at depths of 615 m, 1100 m, 1910 m, 2750 m, and 5290 m, the layer depths being determined from power spectrum analysis of the Bouguer gravity. The inversion integrates the gravity anomalies of each depth layer with well data at the same depths, so as to estimate and analyse the scatter of density and velocity with depth from the oil wells distributed across the regional area.
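For orientation, two standard relations commonly used when linking gravity anomalies, density, and velocity (the study does not state that these exact forms were applied) are the infinite Bouguer slab approximation and Gardner's empirical density-velocity relation:

\Delta g = 2\pi G \,\Delta\rho \, t      (gravity effect of a horizontal slab of thickness t and density contrast \Delta\rho, with G the gravitational constant)

\rho \approx 0.31 \, V^{0.25}            (bulk density \rho in g/cm^3 from P-wave velocity V in m/s)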