Attacks on data transferred over a network happen millions of times a day. To address this problem, a scheme is proposed to secure data in transit. The proposed scheme uses two techniques to guarantee secure transfer of a message: the message is first encrypted and then hidden in a video cover. The RC4 stream cipher is used for encryption to increase the message's confidentiality, and the least significant bit (LSB) embedding algorithm is improved to add a further layer of security. The improvement to the LSB method replaces the usual sequential selection of frames and pixels with random selection driven by two secret random keys. The hidden message therefore remains protected even if the stego-object is intercepted, because an attacker cannot know which frames and pixels hold each bit of the secret message, and so cannot successfully rebuild it. The results indicate that the proposed scheme performs well on the evaluation metric used for this purpose when compared to a large number of related previous methods.
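A minimal sketch of the two-layer idea, assuming 8-bit cover samples and a single integer seed standing in for the two secret random keys (the paper's exact key handling and frame selection are not reproduced here):

```python
import random

def rc4(key: bytes, data: bytes) -> bytes:
    """Standard RC4 stream cipher: KSA then PRGA, XORed with the data."""
    S = list(range(256))
    j = 0
    for i in range(256):                       # key-scheduling algorithm
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    out, i, j = bytearray(), 0, 0
    for byte in data:                          # pseudo-random generation
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(byte ^ S[(S[i] + S[j]) % 256])
    return bytes(out)

def embed(cover: bytearray, message: bytes, key: bytes, seed: int) -> bytearray:
    """Encrypt with RC4, then hide each bit in the LSB of a randomly
    chosen cover position; `seed` stands in for the two secret keys."""
    bits = [(b >> k) & 1 for b in rc4(key, message) for k in range(8)]
    positions = random.Random(seed).sample(range(len(cover)), len(bits))
    for pos, bit in zip(positions, bits):
        cover[pos] = (cover[pos] & 0xFE) | bit
    return cover

stego = embed(bytearray(range(256)) * 4, b"secret", b"cipher-key", seed=42)
```

Extraction reverses the steps: regenerate the same positions from the seed, collect the LSBs, and decrypt with RC4 under the same key.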
Ziegler and Nichols proposed the well-known Ziegler-Nichols method to tune the coefficients of a PID controller. The method is simple but yields fixed coefficient values, which leaves the PID controller poorly adaptable to variation in model parameters and changes in operating conditions. To achieve an adaptive controller, this paper proposes Neural Network (NN) self-tuning PID control, which combines a conventional PID controller with the learning capabilities of a neural network. The proportional, integral, and derivative gains (KP, KI, KD) are self-tuned on-line by the NN output, which is computed from the error between the desired output of the system under control and its actual output. The conventio…
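A minimal sketch of on-line gain self-tuning, substituting a simple delta-rule update for the paper's full neural network (the plant model, initial gains, and learning rate below are all assumed for illustration):

```python
def self_tuning_pid(plant, setpoint, steps=500, dt=0.01, lr=1e-4):
    kp, ki, kd = 1.0, 0.1, 0.01          # assumed initial gains
    integral, prev_err, y = 0.0, 0.0, 0.0
    for _ in range(steps):
        err = setpoint - y
        integral += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integral + kd * deriv   # conventional PID law
        # Delta-rule stand-in for the NN output: nudge each gain in
        # proportion to the error and that gain's own input term.
        kp += lr * err * err
        ki += lr * err * integral
        kd += lr * err * deriv
        y = plant(y, u, dt)              # advance the plant one step
        prev_err = err
    return kp, ki, kd

# Example first-order plant: dy/dt = (-y + u) / tau, with tau = 0.5
gains = self_tuning_pid(lambda y, u, dt: y + dt * (-y + u) / 0.5, setpoint=1.0)
print(gains)
```

In the paper the correction term comes from a trained NN rather than this fixed rule, but the structure of the on-line loop is the same.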
The research aims to identify the importance of using time-driven activity-based costing (TDABC) and its role in determining product costs more equitably, and hence its effect on resource-allocation policy, by reflecting the changes that continually occur in product specifications and, consequently, in the nature and type of operations. The research was conducted at the General Company for Textile Industries, Wasit (knitting socks factory), and rests on the main hypothesis that it is possible to calculate the cost of the activities that drive production through the time these activities take to run, and then to redistribute product cost…
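A small worked illustration of the TDABC calculation the hypothesis describes, with invented numbers (the factory's actual figures are not given in this excerpt):

```python
# TDABC assigns cost by the time each activity consumes, priced at the
# capacity cost rate = cost of capacity supplied / practical capacity.
cost_of_capacity = 840_000       # hypothetical quarterly department cost
practical_minutes = 700_000      # hypothetical practical capacity (minutes)
rate = cost_of_capacity / practical_minutes        # 1.2 per minute

# hypothetical unit times (minutes) for the activities one unit consumes
activity_minutes = {"knitting": 4.0, "sewing": 2.5, "packing": 0.5}
unit_cost = sum(activity_minutes.values()) * rate  # 7.0 min * 1.2 = 8.4
print(f"cost per unit: {unit_cost:.2f}")
```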
In recent years, data centre (DC) networks have greatly improved their data-exchange capabilities. Software-defined networking (SDN) is presented as an alternative to conventional networking by segregating the control plane from the data plane. SDN overcomes the limitations of traditional DC networks caused by the rapidly growing number of apps, websites, data storage needs, and so on. Software-defined networking data centres (SDN-DC) based on the OpenFlow (OF) protocol are used to achieve superior behaviour when executing traffic load-balancing (LB) jobs. The LB function divides the traffic-flow demands between the end devices to avoid link congestion. In short, SDN is proposed to manage more operative configur…
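A minimal sketch of one common LB policy such a controller might apply: route each new flow over the candidate path whose most-loaded link is least utilised (the topology, loads, and demand below are hypothetical):

```python
def pick_path(paths, link_load, demand):
    """Choose the path with the least-loaded bottleneck link, then
    account for the new flow's demand on that path's links."""
    best = min(paths, key=lambda p: max(link_load[l] for l in p))
    for link in best:
        link_load[link] += demand
    return best

link_load = {"s1-s2": 0.2, "s1-s3": 0.7, "s2-s4": 0.1, "s3-s4": 0.3}
paths = [("s1-s2", "s2-s4"), ("s1-s3", "s3-s4")]
print(pick_path(paths, link_load, demand=0.05))   # -> ('s1-s2', 's2-s4')
```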
The aim of this study is to estimate the parameters and the reliability function of the Kumaraswamy distribution with two positive parameters (a, b > 0), a continuous probability distribution that shares many characteristics with the beta distribution while offering additional advantages.
The shape of the distribution's density function and its most important characteristics are explained, and the two parameters (a, b) and the reliability function are estimated using the maximum likelihood (MLE) and Bayes methods. Simulation experiments are conducted to examine the behaviour of the estimation methods for different sample sizes, based on the mean squared error criterion; the results show that the Bayes method is better…
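For reference, the standard Kumaraswamy density, the reliability function estimated in the study, and the log-likelihood that the MLE maximises are:

```latex
% Kumaraswamy density on (0, 1) and its reliability function
f(x; a, b) = a\,b\,x^{a-1}\left(1 - x^{a}\right)^{b-1},
\qquad 0 < x < 1,\ a, b > 0
R(t) = 1 - F(t) = \left(1 - t^{a}\right)^{b}
% log-likelihood for a sample x_1, ..., x_n, maximised by the MLE
\ell(a, b) = n\ln a + n\ln b
  + (a - 1)\sum_{i=1}^{n}\ln x_i
  + (b - 1)\sum_{i=1}^{n}\ln\!\left(1 - x_i^{a}\right)
```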
In current information technology paradigms, cloud computing is the most essential kind of computing service. It satisfies the needs of high-volume customers, offers flexible computing capabilities for a range of applications such as database archiving and business analytics, and supplies the extra computing resources that provide financial value for cloud providers. The purpose of this investigation is to assess the viability of auditing data remotely within a cloud computing setting. The theory behind cloud computing and distributed storage systems is discussed, as well as the method of remote data auditing. This research addresses safeguarding the data that is outsourced and stored in cloud serv…
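A minimal sketch of the challenge-response pattern many remote-auditing schemes follow (the block layout, sample size, and the auditor keeping full local copies are simplifications; real schemes retain only small verification tags):

```python
import hashlib, hmac, secrets

def proof(blocks, indices, nonce):
    """Prover side: HMAC the challenged blocks under the auditor's nonce."""
    h = hmac.new(nonce, digestmod=hashlib.sha256)
    for i in indices:
        h.update(blocks[i])
    return h.digest()

def audit(local_blocks, remote_blocks, sample=3):
    """Auditor side: challenge random blocks and compare the proofs."""
    nonce = secrets.token_bytes(16)
    idx = secrets.SystemRandom().sample(range(len(local_blocks)), sample)
    return hmac.compare_digest(proof(local_blocks, idx, nonce),
                               proof(remote_blocks, idx, nonce))

blocks = [bytes([i]) * 64 for i in range(10)]   # hypothetical 64-byte blocks
print(audit(blocks, blocks))                    # True while the copy is intact
```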
Since the beginning of the last century, competition for water resources has intensified dramatically, especially between countries that have no agreements in place for the water resources they share. Such is the situation with the Euphrates River, which flows through three countries (Turkey, Syria, and Iraq) and represents the main water resource for them. The comprehensive hydrologic investigation needed to derive optimal operations therefore requires reliable forecasts. This study aims to analyse the data and create a forecasting model for data generation from the Turkish perspective, using the recorded inflow data of the Ataturk reservoir for the period Oct. 1961 - Sep. 2009. Based on 49 years of real inflow data…
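A minimal sketch of one common way such a forecasting model is built, fitting a lag-1 autoregressive model to the monthly inflow series (the study's actual model is not given in this excerpt, and the data below is synthetic):

```python
import numpy as np

rng = np.random.default_rng(0)
# synthetic stand-in for 49 years of monthly inflow records
inflow = 100 + 0.1 * np.cumsum(rng.normal(0, 5, 49 * 12))

x, y = inflow[:-1], inflow[1:]
phi, c = np.polyfit(x, y, 1)          # least squares AR(1): y_t = c + phi*y_{t-1}
forecast = c + phi * inflow[-1]       # one-step-ahead inflow forecast
print(f"phi = {phi:.3f}, next-month forecast = {forecast:.1f}")
```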
Visual analytics has become an important approach for discovering patterns in big data. Since visualization struggles with high-dimensional data, issues such as a concept hierarchy on each dimension add further difficulty and make visualization a prohibitive task. The data cube offers multi-perspective aggregated views of large data sets and has important applications in business and many other areas. It has high dimensionality, concept hierarchies, a vast number of cells, and special exploration operations such as roll-up, drill-down, slicing, and dicing. All of these issues make data cubes very difficult to explore visually. Most existing approaches visualize a data cube in 2D space and require preprocessing steps. In this paper, we propose a visu…
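A tiny illustration of the cube operations named above, using pandas group-bys as stand-ins for roll-up and slicing on a hypothetical three-dimensional cube:

```python
import pandas as pd

sales = pd.DataFrame({
    "year":    [2022, 2022, 2023, 2023],
    "region":  ["east", "west", "east", "west"],
    "product": ["A", "A", "B", "B"],
    "amount":  [10, 20, 30, 40],
})

# roll-up: aggregate away the product dimension
rollup = sales.groupby(["year", "region"])["amount"].sum()
# slice: fix one dimension at a single value
slice_2023 = sales[sales["year"] == 2023]
print(rollup, slice_2023, sep="\n")
```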
Permeability data is of major importance in all reservoir simulation studies. Its importance grows in mature oil and gas fields because of its sensitivity to the requirements of certain improved-recovery methods. However, the industry holds a huge stock of air permeability measurements against only a small number of liquid permeability values, owing to the relatively high cost of special core analysis.
The current study suggests a correlation to convert air permeability data, conventionally measured during laboratory core analysis, into liquid permeability. This correlation provides a feasible estimate in cases of loose and poorly consolidated formations, or in cas…
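The paper's own correlation is not reproduced in this excerpt; for context, the classical Klinkenberg gas-slippage relation that links air-measured and equivalent liquid permeability is:

```latex
% Klinkenberg relation (classical form): gas-measured permeability
% exceeds liquid permeability because of slippage at low pressure
k_g = k_L\left(1 + \frac{b}{\bar{p}}\right)
\quad\Longrightarrow\quad
k_L = \frac{k_g}{1 + b/\bar{p}}
% k_g: air (gas) permeability, k_L: equivalent liquid permeability,
% \bar{p}: mean flowing pressure, b: slippage factor (rock- and gas-dependent)
```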
The use of parametric models and their estimation methods requires many preliminary conditions to be met for those models to represent the population under study adequately, which has prompted researchers to search for more flexible models, namely nonparametric ones. Many researchers are interested in the study of the permanence (survival) function and its estimation methods, nonparametric methods being among them.
For the purpose of statistical inference about the parameters of the statistical distribution of lifetimes under censored data, the experimental section of this thesis compares nonparametric estimation methods for the permanence function, the existence…
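One standard nonparametric estimator of the permanence (survival) function under censoring, plausibly among the methods compared, is the Kaplan-Meier product-limit estimator:

```latex
% Kaplan-Meier product-limit estimator of the survival function
\hat{S}(t) = \prod_{i:\, t_i \le t}\left(1 - \frac{d_i}{n_i}\right)
% t_i: distinct observed event times, d_i: failures at t_i,
% n_i: units still at risk (uncensored and surviving) just before t_i
```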