Data security is a central concern in the current era, in which data is transmitted over multiple channels from multiple sources. Data leakage and security loopholes are widespread, and there is a need to enforce higher levels of security, privacy, and integrity. Affected sectors include e-governance, social networking, e-commerce, transportation, logistics, professional communications, and many others. Work on security and integrity is prominent in both network-based scenarios and private environments. This manuscript presents the effective use of a security-based methodology implemented with blockchain programming using Solidity and related tools. Blockchain-based integration is nowadays used in e-services and military applications where strong security is required. Blockchain technology is prominent in high-performance approaches and algorithms and is applied in performance-aware patterns wherever security must be enforced. The work integrates usage patterns of blockchain technologies so that overall security and integrity can be improved through immutability and strong algorithms that enforce security measures.
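The immutability mentioned above comes from each block storing a cryptographic hash of its predecessor. The sketch below is a minimal, generic Python illustration of that hash-chaining idea; it is not the manuscript's Solidity implementation, and all names in it are hypothetical.

```python
import hashlib
import json

def block_hash(block):
    """Hash a block's contents (excluding its own hash) deterministically."""
    payload = json.dumps({k: v for k, v in block.items() if k != "hash"},
                         sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def add_block(chain, data):
    """Append a block linked to the previous block's hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    block = {"index": len(chain), "data": data, "prev_hash": prev_hash}
    block["hash"] = block_hash(block)
    chain.append(block)
    return chain

def is_valid(chain):
    """Tampering with any block breaks its own hash and the next link."""
    for i, block in enumerate(chain):
        if block["hash"] != block_hash(block):
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

chain = []
add_block(chain, "record A")
add_block(chain, "record B")
assert is_valid(chain)
chain[0]["data"] = "tampered"   # any mutation invalidates the chain
assert not is_valid(chain)
```

Because every block commits to the hash of the one before it, rewriting any record silently is detectable by re-verifying the chain.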
Information security refers to the protection of information and its components, guaranteeing their safety and confidentiality. An absence or lapse of information security, and the loss of its full benefit, may lead to a loss of confidence and become a burden on the company. We must therefore protect the company and its information from damage that could lead to failures in performance and losses for the company and its workers. Information security is thus considered one of the essential, controlling foundations for protecting individuals and companies from harm. To ensure the security and confidentiality of information, there are precise, proper, and trusted methods, such as firewalls, passwords, and ciphers.
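As a small illustration of the password protection mentioned above, the sketch below (a generic standard-library example, not taken from the text) shows salted password hashing, so that plain passwords are never stored:

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Derive a salted hash; store (salt, digest), never the plain password."""
    salt = salt if salt is not None else os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, digest):
    """Recompute the derivation and compare in constant time."""
    return hmac.compare_digest(hash_password(password, salt)[1], digest)

salt, digest = hash_password("s3cret")
assert verify_password("s3cret", salt, digest)
assert not verify_password("wrong", salt, digest)
```

The random salt ensures that identical passwords produce different digests, and the iterated derivation slows down brute-force guessing.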
The development that solar energy will undergo in the coming years requires a reliable estimation of the available solar energy resources. Several empirical models have been developed to calculate global solar radiation using various parameters such as extraterrestrial radiation, sunshine hours, albedo, maximum temperature, mean temperature, soil temperature, relative humidity, cloudiness, evaporation, total precipitable water, number of rainy days, altitude, and latitude. In the present work, i) the first part calculates solar radiation from the daily values of the hours of sunshine duration using the Angstrom model over Iraq for July 2017; ii) the second part maps the distribution of solar radiation.
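The Angstrom (Angstrom-Prescott) model named above relates global radiation to relative sunshine duration as H = H0 (a + b S/S0). A minimal sketch follows; the coefficients a = 0.25 and b = 0.50 are the commonly cited default values, not the ones fitted in this study, and the sample inputs are hypothetical.

```python
def angstrom_global_radiation(H0, S, S0, a=0.25, b=0.50):
    """Angstrom-Prescott model: H = H0 * (a + b * S / S0).

    H0: extraterrestrial radiation (e.g. MJ/m^2/day)
    S:  measured sunshine duration (hours)
    S0: maximum possible sunshine duration (hours)
    a, b: empirical regression coefficients (site-specific)
    """
    return H0 * (a + b * S / S0)

# Hypothetical daily values for illustration:
H = angstrom_global_radiation(H0=40.0, S=11.0, S0=13.0)
```

In practice a and b are obtained by regressing measured H/H0 on S/S0 for the location of interest, which is the calibration step such studies perform per station.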
Generally, statistical methods are used in various fields of science, especially in research, where statistical analysis is carried out by adopting several techniques according to the nature of the study and its objectives. One of these techniques is building statistical models, which is done through regression models. This technique is considered one of the most important statistical methods for studying the relationship between a dependent variable (also called the response variable) and the other variables, called covariates. This research describes the estimation of the partial linear regression model, as well as the estimation of values that are missing at random (MAR).
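A partial (partially) linear model combines a linear part and an unspecified smooth part, y = xβ + g(t) + ε. The sketch below is a generic Robinson-type illustration of estimating β — smooth out the nonparametric part, then apply least squares to the residuals — on synthetic data; it is not the estimator developed in this research, and all parameter values are illustrative.

```python
import numpy as np

def kernel_smooth(t, values, bandwidth=0.2):
    """Nadaraya-Watson estimate of E[values | t] at each observed t."""
    w = np.exp(-0.5 * ((t[:, None] - t[None, :]) / bandwidth) ** 2)
    return (w @ values) / w.sum(axis=1)

def partial_linear_fit(x, t, y, bandwidth=0.2):
    """Robinson-type estimator for y = x*beta + g(t) + noise:
    remove the smooth part by kernel smoothing, then fit beta by OLS."""
    x_res = x - kernel_smooth(t, x, bandwidth)
    y_res = y - kernel_smooth(t, y, bandwidth)
    beta = (x_res @ y_res) / (x_res @ x_res)
    g_hat = kernel_smooth(t, y - beta * x, bandwidth)  # recover g(t)
    return beta, g_hat

rng = np.random.default_rng(0)
t = rng.uniform(0, 1, 300)
x = rng.normal(size=300)
y = 2.0 * x + np.sin(2 * np.pi * t) + rng.normal(scale=0.1, size=300)
beta, g_hat = partial_linear_fit(x, t, y)   # beta should be near 2.0
```

Handling MAR responses would add an imputation or weighting step before this fit, which is where the paper's contribution lies.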
Precise forecasting of pore pressure is crucial for efficiently planning and drilling oil and gas wells: it reduces expenses, saves time, and prevents drilling complications. Since direct measurement of pore pressure in wellbores is costly and time-intensive, the ability to estimate it using empirical or machine learning models is beneficial. The present study aims to predict pore pressure using an artificial neural network. The building and testing of the artificial neural network are based on data from five oil fields and several formations. The model is built using a measured dataset consisting of 77 data points of pore pressure obtained from the modular formation dynamics tester.
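A feedforward network of the kind described maps well-log inputs to a pore-pressure estimate through one or more hidden layers trained by gradient descent. The sketch below is a from-scratch illustration on synthetic data; the input names, network size, and training settings are assumptions for illustration, not the study's architecture or its 77-point dataset.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical standardized inputs (e.g. depth, density, resistivity)
# and a synthetic pore-pressure-like target.
X = rng.normal(size=(77, 3))
y = 0.8 * X[:, 0] - 0.5 * X[:, 1] + 0.3 * X[:, 2]

# One hidden layer with tanh activation, trained by full-batch gradient descent.
W1 = rng.normal(scale=0.5, size=(3, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=8);      b2 = 0.0

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return h, h @ W2 + b2

lr = 0.05
for _ in range(2000):
    h, pred = forward(X)
    err = pred - y                        # gradient of 0.5*MSE w.r.t. pred
    gW2 = h.T @ err / len(X)
    gb2 = err.mean()
    dh = np.outer(err, W2) * (1 - h ** 2)  # backprop through tanh
    gW1 = X.T @ dh / len(X)
    gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

mse = float(np.mean((forward(X)[1] - y) ** 2))  # training error after fitting
```

With only 77 samples, studies like this typically hold out part of the data for testing and keep the network small to avoid overfitting, which is why a single modest hidden layer is shown here.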
In this research, a nonparametric technique is presented to estimate the time-varying coefficient functions for longitudinal balanced data, characterized by observations obtained from (n) independent subjects, each of which is measured repeatedly at a group of specific time points (m). Although the measurements are independent across different subjects, they are usually correlated within each subject. The applied technique is the local linear kernel (LLPK) technique. To avoid the problems of dimensionality and heavy computation, the two-step method is used to estimate the coefficient functions with the former technique.
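The local linear kernel idea can be illustrated for a single time-varying coefficient: at each time point, fit a locally weighted linear approximation of beta(t). This is a generic single-step sketch on synthetic balanced longitudinal data, not the paper's two-step estimator; the bandwidth and model are assumptions.

```python
import numpy as np

def local_linear_coef(t_obs, x, y, t_grid, bandwidth=0.15):
    """Local linear kernel estimate of beta(t) in y = beta(t) * x + noise:
    at each grid point t0, solve a weighted least-squares fit of y on
    (x, x*(t - t0)); the first coefficient estimates beta(t0)."""
    beta_hat = []
    for t0 in t_grid:
        u = t_obs - t0
        w = np.exp(-0.5 * (u / bandwidth) ** 2)   # Gaussian kernel weights
        Z = np.column_stack([x, x * u])           # local linear design
        WZ = Z * w[:, None]
        coef = np.linalg.solve(Z.T @ WZ, WZ.T @ y)
        beta_hat.append(coef[0])
    return np.array(beta_hat)

rng = np.random.default_rng(2)
n, m = 40, 10                       # n subjects, m repeated time points
t = np.tile(np.linspace(0, 1, m), n)
x = rng.normal(size=n * m)
y = np.sin(np.pi * t) * x + rng.normal(scale=0.1, size=n * m)
grid = np.linspace(0.1, 0.9, 9)
est = local_linear_coef(t, x, y, grid)   # should track sin(pi * t)
```

The two-step refinement in the paper addresses within-subject correlation, which this simple pooled fit ignores.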
In recent years, Global Navigation Satellite System (GNSS) technology has frequently been employed for monitoring Earth crust deformation and movement. Such applications necessitate high positional accuracy, which can be achieved by processing GPS/GNSS data with scientific software such as BERNESE, GAMIT, and GIPSY-OASIS. Nevertheless, these scientific software packages are sophisticated and have not been published as free open-source software. Therefore, this study was conducted to evaluate an alternative solution, GNSS online processing services, which offer this advantage free of charge. In this study, eight years of GNSS raw data for the TEHN station, located in Iran, were downloaded from the UNAVCO website.
The aesthetic content of data visualization is one of the contemporary areas through which data scientists and designers have been able to link data to humans. Yet even after successful attempts to model data visualization, it was not clear how aesthetic content contributed as an input to humanizing these models. The goal of the current research is therefore to use the descriptive analytical approach to identify the aesthetic content in data visualization, which the researchers interpret through pragmatic philosophy and Kantian philosophy, and to analyze a sample of data visualization models to reveal their aesthetic entry points and explain how to humanize them. The two researchers reached several conclusions.
Information security directly contributes to increasing the level of trust between government departments by providing assurance of the confidentiality, integrity, and availability of sensitive governmental information. Many threats, caused mainly by malicious acts, can shut down e-government services. Therefore, governments are urged to implement security in e-government projects.
Some modifications are proposed to the multi-layer security assessment model (the Sabri model) to make it more comprehensive and more convenient for the Iraqi government. The proposed model can be used as a tool to assess the level of security readiness of government departments and as a checklist for the required security measures.
An essential property of a food web model is its ability to show how energy moves through an ecosystem. It is also useful for understanding and elucidating the relationship between species diversity and their placement within the overall trophic dynamics. In this article, a food web ecological model with prey and two rival predators under fear and wind flow conditions is developed. The boundedness and positivity of the system's solution are established mathematically. The stability and existence constraints of the system's equilibria are examined. The persistence conditions of the proposed system are established. Additionally, bifurcation analysis of every potential equilibrium is carried out using the Sotomayor theorem.
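The excerpt does not give the model's equations, so the sketch below is only a generic illustration of how a prey/two-rival-predator system with fear and wind parameters might be simulated numerically; every functional form and parameter value here is an assumption, not the paper's model.

```python
import numpy as np

def derivatives(state, r=1.0, K=1.0, f=0.5, w=0.2,
                a1=0.8, a2=0.6, e1=0.5, e2=0.5, d1=0.2, d2=0.2):
    """Illustrative prey (x) / two-predator (y, z) system.
    Fear (f) suppresses prey growth; wind flow (w) attenuates predation.
    These forms are hypothetical stand-ins for the paper's equations."""
    x, y, z = state
    pred1 = a1 / (1 + w) * x * y
    pred2 = a2 / (1 + w) * x * z
    dx = r * x * (1 - x / K) / (1 + f * (y + z)) - pred1 - pred2
    dy = e1 * pred1 - d1 * y
    dz = e2 * pred2 - d2 * z
    return np.array([dx, dy, dz])

def integrate(state, dt=0.01, steps=5000):
    """Fourth-order Runge-Kutta time stepping."""
    traj = [np.array(state, dtype=float)]
    for _ in range(steps):
        s = traj[-1]
        k1 = derivatives(s)
        k2 = derivatives(s + 0.5 * dt * k1)
        k3 = derivatives(s + 0.5 * dt * k2)
        k4 = derivatives(s + dt * k3)
        traj.append(s + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4))
    return np.array(traj)

traj = integrate([0.5, 0.3, 0.3])
```

Simulations of this kind are how such papers typically confirm the analytically derived positivity, boundedness, and stability results numerically.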
A new distribution, the Epsilon Skew Gamma (ESΓ) distribution, first introduced by Abdulah [1], is applied to near-Gamma data. We first redefine the ESΓ distribution, its properties, and its characteristics, and then estimate its parameters using the maximum likelihood and moment estimators. Finally, we use these estimators to fit the data with the ESΓ distribution.
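The ESΓ estimators themselves are beyond this excerpt, but the method-of-moments step it mentions can be illustrated with the ordinary Gamma distribution, whose moment equations (mean = kθ, variance = kθ²) invert in closed form. This is a standard textbook result, shown on simulated data, not the paper's ESΓ derivation.

```python
import numpy as np

def gamma_moment_estimates(sample):
    """Moment estimators for Gamma(shape k, scale theta):
    mean = k * theta and variance = k * theta^2 give
    k_hat = mean^2 / var and theta_hat = var / mean."""
    m = sample.mean()
    v = sample.var()
    return m * m / v, v / m

rng = np.random.default_rng(3)
sample = rng.gamma(shape=2.0, scale=1.5, size=20000)
k_hat, theta_hat = gamma_moment_estimates(sample)  # near (2.0, 1.5)
```

Maximum likelihood estimates for the Gamma family have no closed form for the shape parameter and are found numerically; moment estimates like these often serve as starting values for that optimization.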