Portable devices such as smartphones, tablet PCs, and PDAs are a useful combination of hardware and software aimed at mobile workers. While they offer the ability to review documents, communicate via electronic mail, manage appointments, attend meetings, and so on, they usually lack a variety of essential security features. To address concerns over sensitive data, many individuals and organizations that are aware of the associated threats mitigate them by improving user authentication, encrypting content, protecting against malware, and deploying firewalls and intrusion prevention. However, no standards have yet been developed to determine whether such mobile data management systems adequately provide the fundamental security functions demanded by organizations, or whether those functions have been securely developed. Therefore, this paper proposes a security framework for mobile data that combines core security mechanisms to avoid these problems and protects sensitive information without the time and cost of deploying several new applications.
Construction projects have a special nature and are affected by many factors that expose them to multiple risks, owing to the length of the implementation period and the multiplicity of stages, from the decision stage through implementation to final delivery; this increases uncertainty and the likelihood of risk.
Risk analysis and management is one of the effective and productive methods used in managing construction projects, with the purpose of increasing the chances of completing the project successfully in terms of cost, time, and quality, and with the fewest possible problems.
The research aims, first, at effective planning for risk analysis and management.
The estimation of the regular regression model requires several assumptions to be satisfied, such as linearity. One problem arises when the regression curve is partitioned into two (or more) parts that are then joined by threshold point(s); this situation is regarded as a violation of the linearity of regression. The multiphase regression model has therefore received increasing attention as an alternative approach, one that describes the changing behavior of the phenomenon through threshold point estimation. The maximum likelihood estimator (MLE) has been used for both the model and the threshold point estimation. However, the MLE is not resistant to violations such as the existence of outliers or a heavy-tailed error distribution. The main goal of t
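The threshold-point estimation described above can be sketched in a few lines. Under Gaussian errors, maximising the likelihood of a two-phase linear model is equivalent to minimising the residual sum of squares over candidate thresholds, so a grid search over the sample points recovers the breakpoint. The data below are synthetic and illustrative only, not from the paper's study:

```python
import numpy as np

def fit_two_phase(x, y):
    """Grid-search the threshold (breakpoint) of a two-phase linear
    regression.  With Gaussian errors, minimising the total residual sum
    of squares over candidate thresholds is equivalent to the MLE."""
    best_sse, best_t = np.inf, None
    for t in x[2:-2]:                      # interior candidates only
        sse = 0.0
        for mask in (x <= t, x > t):
            A = np.vstack([np.ones(mask.sum()), x[mask]]).T
            coef, *_ = np.linalg.lstsq(A, y[mask], rcond=None)
            r = y[mask] - A @ coef
            sse += float(r @ r)
        if sse < best_sse:
            best_sse, best_t = sse, t
    return best_t

# synthetic two-phase data with a known threshold at x = 5
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 101)
y = np.where(x <= 5, 1.0 + 2.0 * x, 11.0 - 1.5 * (x - 5)) \
    + rng.normal(0.0, 0.1, x.size)
print(fit_two_phase(x, y))                 # estimate close to 5
```

As the abstract notes, this least-squares MLE is not robust: a single gross outlier can shift both the fitted lines and the estimated threshold.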
One of the costliest problems facing hydrocarbon production in unconsolidated sandstone reservoirs is the production of sand once hydrocarbon production starts. A model for predicting the onset of sanding is very important for deciding on future sand control, including whether and when sand control should be used. This research developed an easy-to-use computer program to determine the onset of sanding sites in the drainage area. The model is based on estimating the critical pressure drop at which sand begins to be produced. The outcomes are drawn as a function of free sand production against the critical flow rates for declining reservoir pressure. The results show that the pressure drawdown required to
This study focused on spectral clustering (SC) and three-constraint affinity matrix spectral clustering (3CAM-SC) to determine, simultaneously, the number of clusters and the cluster membership of the COST 2100 channel model (C2CM) multipath dataset. Various multipath clustering approaches solve only for the number of clusters, without taking cluster membership into consideration. The problem with giving only the number of clusters is that there is no assurance the membership of the multipath clusters is accurate even when the cluster count is correct. SC and 3CAM-SC aim to solve this problem by also determining the membership of the clusters. The cluster and the cluster count were then computed through the cluster-wise J
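The core idea behind obtaining both the cluster count and the membership from one affinity matrix can be illustrated on a toy example. For the graph Laplacian L = D - W, the number of (near-)zero eigenvalues equals the number of well-separated clusters, and points in the same cluster share the same row in the null-space eigenvector embedding. This is only a sketch of the SC principle with an idealised block affinity, not the paper's 3CAM-SC pipeline, which works on real multipath data:

```python
import numpy as np

def spectral_count_and_labels(W, tol=1e-8):
    """Toy spectral clustering on an affinity matrix W.
    Near-zero eigenvalues of the graph Laplacian L = D - W give the
    cluster count; identical rows of the corresponding eigenvector
    embedding give the membership.  Real data needs k-means on the
    embedding instead of exact row matching."""
    D = np.diag(W.sum(axis=1))
    L = D - W
    vals, vecs = np.linalg.eigh(L)
    k = int(np.sum(vals < tol * vals.max()))   # number of clusters
    emb = np.round(vecs[:, :k], 6)             # null-space embedding
    _, labels = np.unique(emb, axis=0, return_inverse=True)
    return k, labels

# two disconnected similarity blocks -> two clusters with known membership
W = np.zeros((6, 6))
W[:3, :3] = 1.0
W[3:, 3:] = 1.0
k, labels = spectral_count_and_labels(W)
print(k, labels)
```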
In this paper we present the theoretical foundation of forward error analysis of numerical algorithms under: approximations in built-in functions; rounding errors in arithmetic floating-point operations; and perturbations of data. The error analysis is based on the linearization method. The fundamental tools of forward error analysis are systems of linear absolute and relative a priori and a posteriori error equations, together with the associated condition numbers, which constitute optimal bounds on the possible cumulative round-off errors. The condition numbers enable simple, general, quantitative definitions of numerical stability. The theoretical results have been applied to Gaussian elimination and have proved to be a very effective means of both a priori
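The role of a condition number as a quantitative error bound can be shown concretely. For a linear system Ax = b, a relative perturbation of the data b is amplified in the solution by at most kappa(A) = ||A||·||A^-1||. The matrix and perturbation below are arbitrary examples chosen only to illustrate the bound:

```python
import numpy as np

# Forward-error illustration for A x = b: the relative solution error is
# bounded by the condition number times the relative data perturbation,
#   ||dx|| / ||x||  <=  kappa(A) * ||db|| / ||b||   (2-norm).
A = np.array([[1.00, 0.99],
              [0.99, 0.98]])               # nearly singular -> ill-conditioned
b = np.array([1.0, 1.0])
x = np.linalg.solve(A, b)

db = 1e-6 * np.array([1.0, -1.0])          # small data perturbation
x_pert = np.linalg.solve(A, b + db)

kappa = np.linalg.cond(A)
rel_err = np.linalg.norm(x_pert - x) / np.linalg.norm(x)
rel_pert = np.linalg.norm(db) / np.linalg.norm(b)
print(kappa, rel_err, rel_err <= kappa * rel_pert)
```

Because kappa is large here, even a tiny perturbation of b can produce a visibly large relative change in x, which is exactly the instability the condition-number bounds quantify.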
Integrating Renewable Energy (RE) into Distribution Power Networks (DPNs) is a choice for efficient and sustainable electricity. Controlling the power factor of these sources is one of the techniques employed to manage the power losses of the grid. Capacitor banks have been employed for several decades to compensate reactive ("phantom") power, improving voltage and reducing power losses. The voltage sag and the significant power losses in the Iraqi DPN make it a suitable case study for demonstrating the efficiency enhancement obtained by adjusting the RE power factor. Therefore, this paper studies a part of the Iraqi network in a windy and sunny region, the Badra-Zurbatya 11 kV feeder in the Wasit governorate. A substation of hybrid RE sources is connected to this
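The standard sizing rule behind capacitor-bank power factor correction is Qc = P(tan φ1 - tan φ2): the reactive power the bank must inject to move a load of real power P from power factor cos φ1 to cos φ2. The figures below are illustrative only and are not taken from the Badra-Zurbatya feeder study:

```python
import math

def capacitor_kvar(p_kw, pf_old, pf_new):
    """Reactive power (kvar) a capacitor bank must supply to raise the
    power factor of a load drawing p_kw of real power from pf_old to
    pf_new:  Qc = P * (tan(acos(pf_old)) - tan(acos(pf_new)))."""
    phi_old = math.acos(pf_old)
    phi_new = math.acos(pf_new)
    return p_kw * (math.tan(phi_old) - math.tan(phi_new))

# hypothetical 1 MW load corrected from 0.80 to 0.95 lagging
print(round(capacitor_kvar(1000.0, 0.80, 0.95), 1))
```

Reducing the reactive component in this way lowers the line current for the same delivered real power, which is the mechanism behind the loss reduction and voltage improvement the abstract describes.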
Objective: To evaluate the structure of the Acute Flaccid Paralysis (AFP) Surveillance System at the Al-Russafa Health Directorate in Baghdad City. Methodology: A descriptive evaluation study conducted to measure the efficiency of the AFP Surveillance System structure for the period from November 27th, 2014 to June 30th, 2015. The study adopted a non-probability multi-stage sampling approach: nineteen health facilities under surveillance were chosen, and interviews were conducted with a total of 50 health workers who are involved in the AFP Surveillance System. The data were gathered from the sample by using a questionnaire.
This study presents a theoretical and experimental investigation of the high-loading stumbling condition for a hip prosthesis. The model studied is the Charnley design. It was modeled with the finite element method using ANSYS software to examine the effect of changing the design parameters (head diameter, neck length, neck ratio, stem length) on the Charnley design for the stumbling case, treated as an impact load that reaches 8.7 times body weight over an impact duration of 0.005 s. An experimental rig was constructed to test the hip model; the rig consists of a wooden box with a smooth sliding shaft from which a load of 1 pound is dropped from three heights.
The strain produced by this impact is measured using a rosette strain gauge connected to a Wheatstone bridge.
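For reference, the usual quarter-bridge relation converts the measured bridge output voltage into strain: ε ≈ 4·Vout/(GF·Vex) for small outputs, where GF is the gauge factor. The gauge factor of 2.0 and the voltages below are typical assumed values, not the rig's actual parameters:

```python
def quarter_bridge_strain(v_out, v_ex, gauge_factor=2.0):
    """Approximate strain from a quarter Wheatstone bridge:
    epsilon ~= 4 * Vout / (GF * Vex), valid for small output voltages
    (bridge nonlinearity neglected).  GF = 2.0 is a typical foil-gauge
    value, assumed here for illustration."""
    return 4.0 * v_out / (gauge_factor * v_ex)

# 1 mV of bridge output on a 5 V excitation
print(quarter_bridge_strain(0.001, 5.0))   # 0.0004 strain = 400 microstrain
```

A rosette combines three such gauge readings at known angles to recover the full in-plane strain state; each arm is converted with the same bridge relation.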
In this paper a new method is proposed to perform N-Radon orthogonal frequency division multiplexing (OFDM), which is equivalent in spectral efficiency to 4-quadrature amplitude modulation (QAM), 16-QAM, 64-QAM, 256-QAM, etc. This unconventional method is proposed in order to reduce the constellation energy and increase spectral efficiency. The proposed method gives a significant improvement in bit error rate performance and keeps bandwidth efficiency and spectrum shape as good as conventional fast Fourier transform based OFDM. The new structure was tested and compared with conventional OFDM over additive white Gaussian noise, flat-fading, and multipath frequency-selective fading channels. Simulation tests were generated for different channels.
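The constellation-energy growth the method targets is easy to quantify for ordinary square QAM: with unit-spaced levels ±1, ±3, ..., the average symbol energy of M-QAM is 2(M-1)/3, so it grows almost linearly with M. The snippet verifies this closed form numerically; it illustrates the baseline cost, not the Radon-based mapping itself:

```python
import numpy as np

def square_qam_avg_energy(m):
    """Average symbol energy of a unit-spaced square M-QAM constellation
    (I and Q levels at +/-1, +/-3, ...).  Matches the closed form
    2*(M-1)/3, which grows quickly with M."""
    n = int(np.sqrt(m))
    levels = np.arange(-(n - 1), n, 2)      # e.g. [-3, -1, 1, 3] for 16-QAM
    i, q = np.meshgrid(levels, levels)
    return float(np.mean(i**2 + q**2))

print(square_qam_avg_energy(4),
      square_qam_avg_energy(16),
      square_qam_avg_energy(64))            # 2.0 10.0 42.0
```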
The dependable and efficient identification of Qin seal script characters is pivotal to the discovery, preservation, and inheritance of the distinctive cultural values embodied by these artifacts. This paper uses histogram of oriented gradients (HOG) image features and an SVM model to develop a character recognition model for identifying partial and blurred Qin seal script characters. The model achieves accurate recognition on a small, imbalanced dataset. Firstly, a dataset of Qin seal script image samples is established, and Gaussian filtering is employed to remove image noise. Subsequently, a gamma transformation adjusts the image brightness and enhances the contrast between font structures and image backgrounds. After a s
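The core step of the HOG features mentioned above is an orientation histogram of gradient magnitudes within a cell. The minimal sketch below shows only that step for a single cell; a full HOG pipeline (as used with the SVM) adds a grid of cells and block normalisation, and the 8x8 patch is a made-up example:

```python
import numpy as np

def hog_cell_histogram(cell, n_bins=9):
    """Minimal HOG building block: accumulate gradient magnitudes of one
    image cell into a histogram over unsigned gradient orientations
    (0-180 degrees), then L2-normalise the histogram."""
    gy, gx = np.gradient(cell.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180.0         # unsigned orientation
    bins = np.minimum((ang / (180.0 / n_bins)).astype(int), n_bins - 1)
    hist = np.bincount(bins.ravel(), weights=mag.ravel(), minlength=n_bins)
    return hist / (np.linalg.norm(hist) + 1e-9)

# a vertical edge: all gradient energy falls in the 0-degree bin
patch = np.zeros((8, 8))
patch[:, 4:] = 1.0
h = hog_cell_histogram(patch)
print(np.argmax(h))
```

Concatenating such histograms over all cells gives the fixed-length feature vector that the SVM classifies, which is what makes the approach workable on a small dataset.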