Within the framework of big data, energy issues are highly significant. Despite this significance, theoretical studies focusing primarily on energy issues in big data analytics in relation to computational intelligent algorithms are scarce. The purpose of this study is to explore the theoretical aspects of energy issues in big data analytics in relation to computational intelligent algorithms, since this is critical for exploring the empirical aspects of big data. In this chapter, we present a theoretical study of energy issues related to applications of computational intelligent algorithms in big data analytics. This work highlights that big data analytics using computational intelligent algorithms consumes a very large amount of energy, especially during the training phase. The transmission of big data between service providers, users, and data centres emits carbon dioxide as a result of high power consumption. This chapter proposes a theoretical framework for big data analytics using computational intelligent algorithms that has the potential to reduce energy consumption and enhance performance. We suggest that researchers focus more attention on the issue of energy in big data analytics in relation to computational intelligent algorithms before it becomes a widespread and urgent problem.
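Since the chapter argues at the level of a framework rather than measurement, a back-of-the-envelope sketch may help make the energy claim concrete. The following Python snippet estimates the energy and carbon footprint of a training run; every number in it (device power, PUE, grid carbon intensity) is an illustrative assumption, not a figure from this study:

```python
# Back-of-the-envelope estimate of training energy and CO2 footprint.
# All numbers below are illustrative assumptions, not measurements
# from the chapter.

def training_footprint(num_devices: int,
                       device_power_w: float,
                       hours: float,
                       pue: float = 1.5,
                       grid_kgco2_per_kwh: float = 0.4):
    """Return (energy_kwh, co2_kg) for a training run.

    pue: data-centre Power Usage Effectiveness (overhead multiplier).
    grid_kgco2_per_kwh: assumed carbon intensity of the electricity grid.
    """
    energy_kwh = num_devices * device_power_w * hours * pue / 1000.0
    co2_kg = energy_kwh * grid_kgco2_per_kwh
    return energy_kwh, co2_kg

# Example: 8 accelerators drawing 300 W each for two weeks of training.
energy, co2 = training_footprint(8, 300.0, 24 * 14)
print(f"{energy:.0f} kWh, {co2:.0f} kg CO2")
```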
Modern systems based on hash functions are more suitable than conventional systems; however, the complex algorithms used to generate invertible functions are highly time-consuming. Using genetic algorithms (GAs) enhances the key strength, ultimately making the entire algorithm sufficiently strong. First, key generation is performed using solutions of the n-queens problem obtained by a genetic algorithm, together with a random number generator and the application of GA operators. Finally, the data is encrypted using the Modified Reverse Encryption Algorithm (MREA). It was noticed that the …
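A minimal sketch of the key-generation stage described above may be useful: a genetic algorithm searches for an n-queens solution, and the resulting board permutation is mixed with fresh randomness to seed a key. The population size, rates, and hashing step are illustrative assumptions, and the MREA encryption stage itself is not reproduced:

```python
# Sketch of the key-generation stage only: a GA searches for an
# n-queens solution, whose board permutation then seeds a key.
# Parameters and the key-derivation step are illustrative assumptions.
import hashlib
import os
import random

N = 8  # board size; one queen per column, gene = row index

def conflicts(board):
    """Count attacking queen pairs (same row or same diagonal)."""
    return sum(
        board[i] == board[j] or abs(board[i] - board[j]) == j - i
        for i in range(N) for j in range(i + 1, N)
    )

def ga_n_queens(pop_size=100, generations=1000, mutation_rate=0.2):
    pop = [[random.randrange(N) for _ in range(N)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=conflicts)
        if conflicts(pop[0]) == 0:
            return pop[0]
        # Elitist selection + one-point crossover + mutation.
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N)
            child = a[:cut] + b[cut:]
            if random.random() < mutation_rate:
                child[random.randrange(N)] = random.randrange(N)
            children.append(child)
        pop = parents + children
    return min(pop, key=conflicts)

solution = ga_n_queens()
# Mix the solution with fresh OS randomness to derive a 256-bit key.
seed = bytes(solution) + os.urandom(16)
key = hashlib.sha256(seed).digest()
print("board:", solution, "key:", key.hex())
```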
Density functional theory calculations are employed to investigate the impact of the edifenphos molecule on the reactivity and electronic sensitivity of a pure calcium oxide (CaO) nanocluster. Edifenphos adsorbs strongly on the CaO nanocluster through the sulfur head of the adsorbate, with an adsorption energy of around −84.40 kcal/mol. The adsorption of edifenphos decreases the Eg of CaO from 4.67 to 3.56 eV and increases the electrical conductance. Moreover, the work function of the CaO nanocluster is significantly affected, which changes the field-emission electron current. Finally, the recovery time is calculated to be around 99 ms at ambient temperature …
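For context, sensor studies of this kind typically estimate the recovery time from conventional transition-state theory. A hedged sketch of that textbook relation (the attempt frequency below is a typical assumed value, not one taken from this abstract):

```latex
% Conventional transition-state estimate of the desorption (recovery) time:
\tau \;=\; \nu_0^{-1}\,\exp\!\left(\frac{E_{\mathrm{des}}}{k_B T}\right),
\qquad \nu_0 \sim 10^{12}\ \mathrm{s}^{-1}\ \text{(typical attempt frequency)}
```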
Abstract:
Research Topic: Ruling on the sale of big data
Objectives: to state what big data is, its importance, its sources, and its ruling.
Methodology: inductive, comparative, and critical.
One of the most important results: big data is valuable property that may not be unlawfully appropriated, and it is permissible to sell big data as long as it does not contain data of users who have not consented to its sale.
Recommendation: follow-up studies addressing the rulings on this issue.
Subject Terms
Ruling, Sale, Big Data, Jurists' Opinions
The Tor (The Onion Routing) network was designed to enable users to browse the Internet anonymously. It is known for its anonymity and privacy protection against the many agents who wish to observe users' locations or track their browsing habits. This anonymity stems from the encryption and decryption of Tor traffic: the client's traffic is encrypted and decrypted before the sending and receiving process, which leads to delay and even interruption in data flow. The exchange of cryptographic keys between network devices plays a pivotal and critical role in facilitating secure communication and ensuring the integrity of cryptographic procedures. This essential process is time-consuming, which causes delays …
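To make the layered encryption concrete, here is a minimal sketch of onion wrapping and peeling. It uses symmetric Fernet keys purely for illustration; real Tor establishes per-hop keys via circuit handshakes and uses per-hop stream ciphers, so nothing below reflects Tor's actual wire format:

```python
# Minimal sketch of onion-layered encryption in the spirit of Tor:
# the client wraps a message in one encryption layer per relay, and
# each relay peels exactly one layer.
from cryptography.fernet import Fernet

# One pre-shared symmetric key per relay (entry, middle, exit) --
# an illustrative simplification of Tor's circuit key agreement.
relay_keys = [Fernet.generate_key() for _ in range(3)]

def onion_wrap(message: bytes, keys) -> bytes:
    # Encrypt for the exit relay first and the entry relay last,
    # so the entry relay removes the outermost layer.
    for key in reversed(keys):
        message = Fernet(key).encrypt(message)
    return message

def relay_peel(cell: bytes, key: bytes) -> bytes:
    return Fernet(key).decrypt(cell)

cell = onion_wrap(b"GET / HTTP/1.1", relay_keys)
for key in relay_keys:          # entry -> middle -> exit
    cell = relay_peel(cell, key)
print(cell)                     # b'GET / HTTP/1.1'
```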
In this research, a simple experiment in the field of agriculture was studied in terms of the effect of out-of-control noise arising from several causes, including the effect of environmental conditions on the observations of agricultural experiments. Discrete wavelet transforms were used, specifically the Coiflet transform of order 1 to 2 and the Daubechies transform of order 2 to 3, based on two transform levels, (J-4) and (J-5), applying the hard, soft, and non-negative thresholding rules, and comparing the wavelet transformation methods using real data from an experiment of 26 observations. The application was carried out through a program in the MATLAB language. The researcher concluded that …
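A minimal sketch of the denoising pipeline described above, using the PyWavelets library in Python; the wavelet names, decomposition level, and universal threshold are illustrative choices rather than the paper's exact settings:

```python
# Sketch of wavelet-thresholding denoising with PyWavelets.
# Signal, noise level, and threshold rule are illustrative assumptions.
import numpy as np
import pywt

rng = np.random.default_rng(0)
n = 26                                   # experiment size from the abstract
clean = np.sin(np.linspace(0, 3 * np.pi, n))
noisy = clean + 0.3 * rng.standard_normal(n)

def denoise(signal, wavelet="coif1", mode="soft"):
    level = pywt.dwt_max_level(len(signal), pywt.Wavelet(wavelet).dec_len)
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # Universal threshold estimated from the finest detail coefficients.
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thr = sigma * np.sqrt(2 * np.log(len(signal)))
    coeffs = [coeffs[0]] + [
        pywt.threshold(c, thr, mode=mode) for c in coeffs[1:]
    ]
    return pywt.waverec(coeffs, wavelet)[: len(signal)]

for wavelet in ("coif1", "coif2", "db2", "db3"):
    for mode in ("hard", "soft", "garrote"):  # 'garrote' = non-negative garrote
        err = np.mean((denoise(noisy, wavelet, mode=mode) - clean) ** 2)
        print(f"{wavelet:5s} {mode:8s} MSE={err:.4f}")
```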
This paper proposes a new encryption method. It combines two cipher algorithms, DES and AES, to generate hybrid keys. This combination strengthens the proposed W-method by generating highly randomized keys. The reliability of any encryption technique rests on two points. The first is key generation: our approach merges 64 bits of DES with 64 bits of AES to produce 128 bits as a root key for the remaining 15 keys. This complexity raises the level of the ciphering process. Moreover, the operation shifts only one bit to the right. The second is the nature of the encryption process: it includes two keys and mixes one round of DES with one round of AES to reduce the execution time. The W-method deals with …
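A minimal sketch of the key-schedule idea described above: 64 bits from a DES key joined with 64 bits from an AES key form a 128-bit root key, and each of the 15 derived keys is a one-bit right rotation of its predecessor. The W-method's round-level mixing of DES and AES is not reproduced here:

```python
# Sketch of the hybrid key schedule only; the source keys are drawn
# from the OS RNG as a stand-in for actual DES/AES key material.
import os

def make_root_key(des_key64: bytes, aes_part64: bytes) -> bytes:
    assert len(des_key64) == 8 and len(aes_part64) == 8
    return des_key64 + aes_part64          # 128-bit root key

def rotate_right_1(key: bytes) -> bytes:
    """Rotate the whole key right by one bit."""
    n = int.from_bytes(key, "big")
    bits = len(key) * 8
    n = ((n >> 1) | ((n & 1) << (bits - 1))) & ((1 << bits) - 1)
    return n.to_bytes(len(key), "big")

root = make_root_key(os.urandom(8), os.urandom(8))
round_keys = [root]
for _ in range(15):                        # the 15 derived keys
    round_keys.append(rotate_right_1(round_keys[-1]))

for i, k in enumerate(round_keys):
    print(f"K{i:02d} = {k.hex()}")
```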
Theoretical spectroscopic studies of beryllium oxide have been carried out. Potential energy curves for the ground state X¹Σ⁺ and the excited states A¹Π and B¹Σ⁺ were obtained using two functions, the Morse and Varshni potentials, and compared with experimental results. The potentials of this molecule are in agreement with the experimental results. The Fortrat parabolas corresponding to the … branches were determined in the range 1 < J < 20 for the (0-0) band. It was found that for the electronic transition A¹Π–X¹Σ⁺ the band head lies in the … branch of the Fortrat parabola.
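For reference, a hedged sketch of the standard textbook forms of the two potential functions and of the Fortrat relation used to locate band heads (the symbols are the usual spectroscopic constants, not fitted values from this study):

```latex
% Morse potential:
V_{\mathrm{Morse}}(r) = D_e\left(1 - e^{-\beta (r - r_e)}\right)^2
% Varshni potential:
V_{\mathrm{Varshni}}(r) = D_e\left(1 - \frac{r_e}{r}\,
    e^{-\beta\,(r^2 - r_e^2)}\right)^2
% Fortrat parabola (P branch: m = -J, R branch: m = J + 1):
\nu(m) = \nu_0 + (B' + B'')\,m + (B' - B'')\,m^2,
\qquad m_{\mathrm{head}} = -\frac{B' + B''}{2\,(B' - B'')}
```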
This paper presents a comparative study of two learning algorithms for a nonlinear PID neural trajectory-tracking controller for a mobile robot following a pre-defined path. As simple and fast tuning techniques, genetic and particle swarm optimization algorithms are used to tune the nonlinear PID neural controller's parameters and find the best velocity control actions for the right and left wheels of the real mobile robot. A Polywog wavelet activation function is used in the structure of the nonlinear PID neural controller. Simulation results (MATLAB) and experimental work (LabVIEW) show that the proposed nonlinear PID controller with the PSO learning algorithm is more effective and robust than the genetic learning algorithm; this …
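A minimal PSO sketch in the spirit of the tuning described above, applied to a plain PID controller on a toy first-order plant; the plant model, swarm settings, and cost function are illustrative assumptions, and the paper's nonlinear PID neural controller with Polywog wavelet activation is not reproduced:

```python
# PSO tuning of PID gains (Kp, Ki, Kd) on a toy plant y' = -y + u.
import numpy as np

rng = np.random.default_rng(1)

def tracking_cost(gains, dt=0.01, steps=500):
    """Integrated absolute error of a PID loop on a unit step reference."""
    kp, ki, kd = gains
    y, integral, prev_err, cost = 0.0, 0.0, 0.0, 0.0
    for _ in range(steps):
        err = 1.0 - y
        integral += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integral + kd * deriv
        y += dt * (-y + u)                 # Euler step of the plant
        prev_err = err
        cost += abs(err) * dt
    return cost if np.isfinite(cost) else 1e9   # penalize divergence

n_particles, dims, iters = 20, 3, 60
pos = rng.uniform(0.0, 10.0, (n_particles, dims))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_cost = np.array([tracking_cost(p) for p in pos])
gbest = pbest[pbest_cost.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dims))
    # Standard velocity update: inertia + cognitive + social terms.
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0.0, 10.0)
    cost = np.array([tracking_cost(p) for p in pos])
    improved = cost < pbest_cost
    pbest[improved], pbest_cost[improved] = pos[improved], cost[improved]
    gbest = pbest[pbest_cost.argmin()].copy()

print("tuned (Kp, Ki, Kd):", gbest, "cost:", pbest_cost.min())
```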
Background: The pharmaceutical “marketplace” is booming with a plethora of over-the-counter products known as supplements, including regulated and unregulated chemicals that can be purchased from pharmacies, the black market, or via the internet, including the infamous deep web and the darknet. Some supplements can boost specific physiological functions, including physical endurance, sexual performance, and musculoskeletal-articular rejuvenation.
Case Report: We report a case of acute metabolic disturbances that materialized following the ingestion of CH-alpha. An otherwise healthy 35-year-old male from Iraq presented with bilateral periorbital and dependent leg edema. Biochemical p…
The influx of data in bioinformatics is primarily in the form of DNA, RNA, and protein sequences. This places a significant burden on scientists and computers. Some genomics studies depend on clustering techniques to group similarly expressed genes into one cluster. Clustering is a type of unsupervised learning that can be used to divide unlabeled data into clusters. The k-means and fuzzy c-means (FCM) algorithms are examples of algorithms that can be used for clustering. Clustering is thus a common approach that divides an input space into several homogeneous zones and can be achieved using a variety of algorithms. This study used three models to cluster a brain tumor dataset. The first model uses FCM, which …
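A minimal fuzzy c-means sketch on synthetic 2-D data (the brain tumor dataset itself is not available here); the cluster count, fuzzifier, and tolerance are illustrative assumptions:

```python
# Fuzzy c-means (FCM) from scratch with NumPy on synthetic blobs.
import numpy as np

rng = np.random.default_rng(42)
# Three synthetic blobs standing in for grouped expression features.
X = np.vstack([rng.normal(c, 0.5, (50, 2)) for c in ((0, 0), (4, 0), (2, 3))])

def fuzzy_c_means(X, c=3, m=2.0, iters=100, tol=1e-5):
    n = len(X)
    U = rng.dirichlet(np.ones(c), size=n)        # random membership matrix
    for _ in range(iters):
        Um = U ** m
        # Cluster centers: membership-weighted means.
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        dist = np.fmax(dist, 1e-10)              # avoid division by zero
        # Standard FCM membership update.
        inv = dist ** (-2.0 / (m - 1.0))
        U_new = inv / inv.sum(axis=1, keepdims=True)
        if np.abs(U_new - U).max() < tol:
            U = U_new
            break
        U = U_new
    return centers, U

centers, U = fuzzy_c_means(X)
labels = U.argmax(axis=1)                        # hard assignment for inspection
print("centers:\n", centers)
print("cluster sizes:", np.bincount(labels))
```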