Advances in Information and Communication Technology (ICT) over the previous decades have significantly changed the way people transmit and store their information over the Internet and networks. Consequently, one of the main challenges is keeping this information safe against attacks. Many researchers and institutions have recognized the importance and benefits of cryptography in achieving efficient and effective secure communication. This work adopts a novel technique for a secure data cryptosystem based on chaos theory. The proposed algorithm generates a 2-dimensional key matrix with the same dimensions as the original image, filled with random numbers obtained from the 1-dimensional logistic chaotic map for given control parameters; the fractional parts of these numbers are then converted, through a function, into a set of non-repeating numbers, which yields a vast number of unpredictable possibilities (the factorial of rows times columns). Double layers of row and column permutation are applied to the values for a specified number of stages. Then, XOR is performed between the key matrix and the original image, providing an effective means of encrypting any type of file (text, image, audio, video, etc.). The results show that the proposed encryption technique is very promising when tested on more than 500 image samples according to security measurements: the histograms of cipher images are much flatter than those of the original images, the average Mean Square Error is very high (10115.4), the Peak Signal-to-Noise Ratio is very low (8.17), the correlation is near zero, and the entropy is close to 8 (7.9975).
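The logistic-map keystream and XOR step described in this abstract can be sketched minimally as follows. The control parameters `r` and `x0` and the byte-extraction rule are illustrative assumptions, not the paper's exact construction (which also includes the permutation stages):

```python
import numpy as np

def logistic_keystream(n, r=3.99, x0=0.6137):
    """Generate n pseudo-random bytes from the 1-D logistic map
    x_{k+1} = r * x_k * (1 - x_k). Here r and x0 act as the secret key."""
    x = x0
    out = np.empty(n, dtype=np.uint8)
    for i in range(n):
        x = r * x * (1.0 - x)
        # Map the fractional chaotic value to a byte (illustrative rule).
        out[i] = int(x * 10**6) % 256
    return out

def xor_cipher(data: bytes, r=3.99, x0=0.6137) -> bytes:
    """XOR data with the chaotic keystream; applying it twice decrypts."""
    ks = logistic_keystream(len(data), r, x0)
    return bytes(np.frombuffer(data, dtype=np.uint8) ^ ks)
```

Because XOR is an involution, running `xor_cipher` on a ciphertext with the same parameters recovers the plaintext, which is why the same key matrix serves for both encryption and decryption.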
Smart cities have recently undergone a fundamental evolution that has greatly increased their potential. Indeed, recent advances in the Internet of Things (IoT) have created new opportunities by solving a number of critical issues, enabling innovations for smart cities as well as the creation and computerization of cutting-edge services and applications for the many city partners. To further the development of smart cities toward effective sharing and connection, this study explores information innovation in smart cities in light of the Internet of Things (IoT) and cloud computing (CC). IoT data is first collected in the context of smart cities, and the gathered data is made uniform. The Internet of Things, …
Classification of imbalanced data is an important issue. Many algorithms have been developed for classification, such as Back Propagation (BP) neural networks, decision trees, Bayesian networks, etc., and have been used repeatedly in many fields. These algorithms suffer from the problem of imbalanced data, where some classes have many more instances than others. Imbalanced data results in poor performance and bias toward one class at the expense of the others. In this paper, we propose three techniques based on the Over-Sampling (O.S.) technique for processing an imbalanced dataset, redistributing it and converting it into a balanced dataset. These techniques are (Improved Synthetic Minority Over-Sampling Technique (Improved SMOTE), Border…
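The core SMOTE idea that these techniques build on can be sketched as follows: each synthetic minority sample is an interpolation between a minority point and one of its k nearest minority neighbours. This is the standard SMOTE interpolation, not the authors' "Improved SMOTE" variant:

```python
import numpy as np

def smote_like(minority: np.ndarray, n_new: int, k: int = 5, seed: int = 0):
    """SMOTE-style oversampling sketch: synthesise n_new points by
    interpolating between random minority points and their k nearest
    minority neighbours."""
    rng = np.random.default_rng(seed)
    n = len(minority)
    synthetic = []
    for _ in range(n_new):
        i = rng.integers(n)
        # Euclidean distances from point i to every minority point.
        d = np.linalg.norm(minority - minority[i], axis=1)
        neighbours = np.argsort(d)[1:k + 1]   # skip the point itself
        j = rng.choice(neighbours)
        gap = rng.random()                    # interpolation factor in [0, 1)
        synthetic.append(minority[i] + gap * (minority[j] - minority[i]))
    return np.vstack(synthetic)
```

Appending the synthetic rows to the minority class rebalances the dataset without duplicating existing points verbatim, which is what distinguishes SMOTE from plain random oversampling.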
Modeling data acquisition systems (DASs) can support the vehicle industry in the development and design of sophisticated driver assistance systems. Modeling DASs on the basis of multiple criteria is considered a multicriteria decision-making (MCDM) problem. Although literature reviews have provided models for DASs, the issue of imprecise, unclear, and ambiguous information remains unresolved. Compared with existing MCDM methods, the robustness of the fuzzy decision by opinion score method II (FDOSM II) and fuzzy weighted with zero inconsistency II (FWZIC II) has been demonstrated for modeling DASs. However, these methods are implemented in an intuitionistic fuzzy set environment that restricts the ability of experts to provide mem…
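For readers unfamiliar with MCDM, the baseline that methods such as FDOSM II and FWZIC II improve upon can be illustrated with simple additive weighting: normalise each criterion column, then score each alternative by a weighted sum. This is a generic textbook baseline, not the fuzzy methods discussed in the abstract:

```python
import numpy as np

def weighted_sum_rank(matrix, weights, benefit):
    """Generic weighted-sum MCDM baseline (not FDOSM II / FWZIC II).
    matrix: (alternatives x criteria); benefit[j] is True when larger
    values of criterion j are better, False for cost criteria."""
    m = np.asarray(matrix, dtype=float)
    norm = np.empty_like(m)
    for j in range(m.shape[1]):
        col = m[:, j]
        # Benefit criteria: divide by max; cost criteria: min over value.
        norm[:, j] = col / col.max() if benefit[j] else col.min() / col
    scores = norm @ np.asarray(weights, dtype=float)
    return scores, np.argsort(-scores)   # scores and ranking (best first)
```

Fuzzy extensions such as FDOSM II replace the crisp scores and weights here with fuzzy membership structures so experts can express uncertainty instead of exact numbers.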
Abstract
The research compared two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the Downhill Simplex algorithm. Two data cases were considered: the first assumed the original (uncontaminated) data, while the second assumed data contamination. Simulation experiments were conducted for different sample sizes, initial parameter values, and levels of contamination. The Downhill Simplex algorithm was found to be the best method for estimating the parameters, the probability function, and the reliability function of the compound distribution for both natural and contaminated data.
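The Downhill Simplex (Nelder-Mead) approach to likelihood estimation can be sketched as minimising a negative log-likelihood without derivatives. Since the four-parameter compound density is not reproduced in the abstract, this sketch uses a plain two-parameter Weibull likelihood as a stand-in; the sample sizes and true parameters are illustrative:

```python
import numpy as np
from scipy.optimize import minimize

def weibull_nll(params, x):
    """Negative log-likelihood of a two-parameter Weibull(k, lam).
    Stands in for the four-parameter compound density, which the
    abstract does not reproduce."""
    k, lam = params
    if k <= 0 or lam <= 0:
        return np.inf                 # keep the simplex in the valid region
    z = x / lam
    return -np.sum(np.log(k) - np.log(lam) + (k - 1) * np.log(z) - z**k)

rng = np.random.default_rng(42)
sample = rng.weibull(2.0, size=2000) * 1.5   # true k = 2.0, lam = 1.5

# Downhill simplex (Nelder-Mead) needs only function values, no gradients,
# which is why it tolerates awkward likelihood surfaces well.
res = minimize(weibull_nll, x0=[1.0, 1.0], args=(sample,),
               method="Nelder-Mead")
k_hat, lam_hat = res.x
```

Returning `np.inf` outside the parameter space is a common trick for constraining the simplex, since Nelder-Mead itself is an unconstrained method.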
Measuring the efficiency of postgraduate and undergraduate programs is one of the essential elements of the educational process. In this study, the colleges of Baghdad University and data for the academic year (2011-2012) were chosen to measure the relative efficiencies of postgraduate and undergraduate programs in terms of their inputs and outputs. A relevant method for analyzing this data is Data Envelopment Analysis (DEA). The effect of academic staff on the numbers of enrolled and alumni students in the postgraduate and undergraduate programs is the main focus of the study.
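The DEA computation behind such a study can be sketched with the classic input-oriented CCR model in multiplier form: for each decision-making unit (here, a college), a small linear program finds the output/input weights that maximise its efficiency score. The two-unit toy data below is illustrative, not the study's dataset:

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_efficiency(X, Y):
    """Input-oriented CCR DEA, multiplier form. For each unit o:
    maximise u.y_o  subject to  v.x_o = 1  and  u.y_j - v.x_j <= 0 for all j.
    X: (n_units, n_inputs), Y: (n_units, n_outputs). Returns scores in (0, 1]."""
    n, m = X.shape
    s = Y.shape[1]
    A_ub = np.hstack([Y, -X])            # one constraint row per unit
    eff = []
    for o in range(n):
        c = np.concatenate([-Y[o], np.zeros(m)])          # linprog minimises
        A_eq = np.concatenate([np.zeros(s), X[o]])[None, :]
        res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n),
                      A_eq=A_eq, b_eq=[1.0], bounds=(0, None))
        eff.append(-res.fun)             # undo the sign flip
    return np.array(eff)
```

A score of 1.0 marks a unit on the efficient frontier; scores below 1.0 quantify how much a unit could shrink its inputs while keeping its outputs.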
FG Mohammed, HM Al-Dabbas, Iraqi Journal of Science, 2018.
Business organizations have faced many challenges in recent times, the most important of which is information technology, because it is widely spread and easy to use. Its use has led to an increase in the amount of data that business organizations deal with in an unprecedented manner. The amount of data available through the internet is a problem that many parties seek to find solutions for: why is it available there in such a huge amount, randomly? Many forecasts have suggested that in 2017 the number of devices connected to the internet would be an estimated three times the population of the Earth, and in 2015 more than one and a half billion gigabytes of data was transferred every minute globally. Thus, the so-called data mining emerged as a…
The research concentrates on a modern variable in organizations, namely Six Sigma. The field study covers two Iraqi industrial organizations: the State Company of …………… and the State Company of ……
The research problem is framed by a set of questions and hypotheses. The data was collected by a questionnaire covering 5 dimensions and (10) critical success factors.
The sample contains (42) employees who work in those organizations. The research points out many conclusions, chief among them that there are significant differences between the two organizations. The research concludes with a number of important recommendations that serve its objectives.
Within the framework of big data, energy issues are highly significant. Despite the significance of energy, theoretical studies focusing primarily on the issue of energy within big data analytics in relation to computational intelligent algorithms are scarce. The purpose of this study is to explore the theoretical aspects of energy issues in big data analytics in relation to computational intelligent algorithms, since this is critical to exploring the empirical aspects of big data. In this chapter, we present a theoretical study of energy issues related to applications of computational intelligent algorithms in big data analytics. This work highlights that big data analytics using computational intelligent algorithms generates a very high amo…