Cryptography algorithms play a critical role in protecting information technology against the various attacks witnessed in the digital era, and many studies and algorithms have been proposed to address the security requirements of information systems. Traditional cryptography algorithms are characterized by high computational complexity. Lightweight algorithms, by contrast, address most of the security issues that arise when traditional cryptography is applied in constrained devices, and symmetric ciphers in particular are widely used to secure data communication on such devices. In this study, we propose a hybrid algorithm based on two ciphers, PRESENT and Salsa20. A 2D logistic map from chaos theory is also applied to generate pseudo-random keys, adding complexity to the proposed cipher. The goal is to enhance the complexity of the existing PRESENT algorithm while keeping the computational cost minimal. The proposed algorithm proved to work efficiently with fast execution time, and the generated key sequences passed the randomness tests of the NIST suite.
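For illustration only (the paper's exact map formulation and parameters are not reproduced here), the sketch below iterates a commonly cited coupled 2D logistic map and quantizes its orbit into key bytes; the coupling form, the parameter r = 1.19, and the wrap-around safeguard are assumptions of this example, not the authors' implementation.

```python
# Illustrative sketch only: pseudo-random key bytes from a coupled 2D logistic map.
# The map form, r = 1.19, and the wrap-around safeguard are assumptions of this example.

def logistic2d_keystream(x0: float, y0: float, r: float = 1.19,
                         n_bytes: int = 16, burn_in: int = 1000) -> bytes:
    """Iterate the 2D logistic map and quantize the orbit into key bytes."""
    x, y = x0, y0

    def step(x, y):
        x = r * (3.0 * y + 1.0) * x * (1.0 - x)
        y = r * (3.0 * x + 1.0) * y * (1.0 - y)
        # Safeguard (an assumption, not part of the canonical map): fold the state
        # back into [0, 1) in case an iterate escapes the unit interval.
        return x % 1.0, y % 1.0

    for _ in range(burn_in):          # discard the transient before sampling
        x, y = step(x, y)
    out = bytearray()
    while len(out) < n_bytes:
        x, y = step(x, y)
        out.append((int(x * 2**32) ^ int(y * 2**32)) & 0xFF)  # one byte per iteration
    return bytes(out)

if __name__ == "__main__":
    key = logistic2d_keystream(0.314159, 0.271828)   # 128-bit key candidate
    print(key.hex())
```

In a hybrid design of this kind, such a byte stream could, for example, seed the PRESENT key schedule or the Salsa20 key, with the NIST suite applied to the raw stream; how the paper actually injects the chaotic keys is not detailed in the abstract.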
The ability of the human brain to communicate with its environment has become a reality through the use of Brain-Computer Interface (BCI)-based mechanisms. Electroencephalography (EEG) has gained popularity as a non-invasive way of interfacing with the brain. Traditionally, EEG devices were used in clinical settings to detect various brain diseases; as technology has advanced, however, companies such as Emotiv and NeuroSky have developed low-cost, easily portable, EEG-based consumer-grade devices that can be used in application domains such as gaming and education. This article discusses the areas in which EEG has been applied and how it has proven beneficial for people with severe motor disorders, for rehabilitation, and as a form of communication …
The mutagen (Almtafr) was used to study the response of the isolates, which gave a positive response to the standard mutagen; this lowered the number of Rifampicin-resistant cells at the suitable temperature to below that of TJ, and was similarly reflected in the frequency of induced mutations.
To expedite the learning process, a group of algorithms known as parallel machine learning algorithms can be executed simultaneously on several computers or processors. As data grows in both size and complexity, and as businesses seek efficient ways to mine that data for insights, algorithms like these will become increasingly crucial. Data parallelism, model parallelism, and hybrid techniques are just some of the methods described in this article for speeding up machine learning algorithms. We also cover the benefits and challenges associated with parallel machine learning, such as data splitting, communication, and scalability. We compare how well various methods perform on a variety of machine learning tasks and datasets, and we talk about …
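As a minimal sketch of the data-parallel strategy mentioned above, and not of any specific system from the article, the example below shards a dataset across worker processes, fits an independent linear model per shard, and averages the learned parameters; the choice of scikit-learn's SGDRegressor and Python's multiprocessing pool is an assumption made purely for illustration.

```python
# Hypothetical data-parallel training sketch: shard the data, fit one model
# per shard in a separate process, then average the learned coefficients.
import numpy as np
from multiprocessing import Pool
from sklearn.linear_model import SGDRegressor

def fit_shard(shard):
    X, y = shard
    model = SGDRegressor(max_iter=1000, tol=1e-3, random_state=0)
    model.fit(X, y)
    return model.coef_, model.intercept_

def data_parallel_fit(X, y, n_workers=4):
    # Data parallelism: each worker sees only its own shard of the rows.
    shards = list(zip(np.array_split(X, n_workers), np.array_split(y, n_workers)))
    with Pool(n_workers) as pool:
        results = pool.map(fit_shard, shards)
    # One-shot parameter averaging across the per-shard models.
    coefs = np.mean([c for c, _ in results], axis=0)
    intercept = np.mean([b for _, b in results], axis=0)
    return coefs, intercept

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    X = rng.normal(size=(10_000, 5))
    y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + rng.normal(scale=0.1, size=10_000)
    coefs, intercept = data_parallel_fit(X, y)
    print("averaged coefficients:", np.round(coefs, 2))
```

Parameter averaging is only the simplest data-parallel scheme; the communication and scalability trade-offs discussed above become more pronounced with synchronous gradient exchange or model parallelism.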
By optimizing the efficiency of a modular simulation model of the PV module structure with a genetic algorithm under several weather conditions, as part of identifying the ideal design of a Near Zero Energy Household (NZEH), an optimal life-cycle cost can be achieved. The optimum design is selected from combinations of NZEH design variables, namely building positioning, window-to-wall ratio, and glazing category, which help maximize the energy produced by the photovoltaic panels. Comprehensive simulation techniques and modeling are utilized for the solar module's I-V characteristics and P-V output power, both built on the well-known five-parameter model. In addition, the efficiency of the PV panel is determined by the genetic algorithm …
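For context, the five-parameter (single-diode) model referenced above relates module current and voltage through an implicit equation, I = Iph − I0[exp((V + I·Rs)/a) − 1] − (V + I·Rs)/Rsh. The sketch below solves it numerically with placeholder parameter values that are not those of the module studied.

```python
# Hedged sketch of the five-parameter single-diode PV model:
#   I = Iph - I0*(exp((V + I*Rs)/a) - 1) - (V + I*Rs)/Rsh
# Parameter values are illustrative placeholders, not the module in the study.
import numpy as np
from scipy.optimize import brentq

def iv_curve(Iph=8.2, I0=1e-9, Rs=0.35, Rsh=300.0, a=1.8, v_max=40.0, points=200):
    """Solve the implicit diode equation for I at each terminal voltage V."""
    def residual(I, V):
        return Iph - I0 * (np.exp((V + I * Rs) / a) - 1.0) - (V + I * Rs) / Rsh - I

    voltages = np.linspace(0.0, v_max, points)
    currents = []
    for V in voltages:
        # The current is bracketed between a small negative value and Iph plus a margin.
        currents.append(brentq(residual, -1.0, Iph + 1.0, args=(V,)))
    return voltages, np.array(currents)

if __name__ == "__main__":
    V, I = iv_curve()
    P = V * I
    k = int(np.argmax(P))
    print(f"Maximum power point: P={P[k]:.1f} W at V={V[k]:.1f} V, I={I[k]:.2f} A")
```

A genetic algorithm would typically search over the five parameters (or over the NZEH design variables) while a curve such as this one supplies the fitness evaluation.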
Combining different treatment strategies, successively or simultaneously, has become a recommended approach for achieving high purification standards for treated discharged water. The current work focused on combining electrocoagulation, ion-exchange, and ultrasonication treatment approaches for the simultaneous removal of copper, nickel, and zinc ions from water. The removal of the three studied ions was significantly enhanced by increasing the current density (4–10 mA/cm²) and NaCl concentration (0.5–1.5 g/L) at the natural solution pH. The simultaneous removal of these metal ions at 4 mA/cm² and 1 g NaCl/L was greatly improved by introducing 1 g/L of mordenite zeolite as an ion exchanger. A remarkable removal of heavy metals was reported …
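As a back-of-the-envelope companion to the operating range quoted above, Faraday's law gives the theoretical coagulant dose released by the sacrificial anode during electrocoagulation; the electrode material (aluminium), electrode area, treatment time, and solution volume in this sketch are assumptions, not values from the study.

```python
# Hedged sketch: theoretical coagulant dose from Faraday's law for an
# electrocoagulation cell. Electrode material (aluminium) and cell geometry
# are assumptions for illustration; they are not given in the abstract above.
F = 96_485.0          # Faraday constant, C/mol

def coagulant_dose_mg_per_L(current_density_mA_cm2, electrode_area_cm2,
                            time_min, volume_L, molar_mass_g_mol=26.98, z=3):
    """m = I * t * M / (z * F), expressed per litre of treated solution."""
    current_A = current_density_mA_cm2 * 1e-3 * electrode_area_cm2
    charge_C = current_A * time_min * 60.0
    mass_g = charge_C * molar_mass_g_mol / (z * F)
    return mass_g * 1000.0 / volume_L

if __name__ == "__main__":
    for j in (4, 10):  # the current-density range reported above, mA/cm^2
        dose = coagulant_dose_mg_per_L(j, electrode_area_cm2=50, time_min=60, volume_L=1.0)
        print(f"{j} mA/cm^2 -> ~{dose:.0f} mg Al/L after 60 min (theoretical)")
```

Higher current density therefore raises the theoretical coagulant loading roughly in proportion, which is consistent with the enhanced removal observed over the 4–10 mA/cm² range.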
The propagation of a laser beam in underdense deuterium plasma has been studied via computer simulation using the fluid model. An appropriate computer code, “HEATER”, has been modified and used for this purpose. The propagation is taken to be in a cylindrically symmetric medium. Different laser wavelengths (λ₁ = 10.6 μm, λ₂ = 1.06 μm, and λ₃ = 0.53 μm) with a Gaussian pulse shape and a 15 ns pulse width have been considered. Absorbed energy and laser flux have been calculated for different plasma and laser parameters. The absorbed laser energy showed a maximum for λ = 0.53 μm; this high absorptivity was attributed to the effect of the ponderomotive force.
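For context on the "underdense" regime, the critical electron density for each wavelength follows from the standard relation n_c ≈ 1.1 × 10²¹ / λ²[μm] cm⁻³; the quick check below is illustrative only and is not part of the HEATER code.

```python
# Illustrative check (not from the HEATER code): critical electron density
# for the three laser wavelengths, n_c[cm^-3] ~ 1.1e21 / (lambda[um])^2.
wavelengths_um = {
    "CO2 (10.6 um)": 10.6,
    "Nd:glass (1.06 um)": 1.06,
    "2nd harmonic (0.53 um)": 0.53,
}

for name, lam in wavelengths_um.items():
    n_c = 1.1e21 / lam**2   # cm^-3
    print(f"{name:22s} n_c = {n_c:.2e} cm^-3")
```

The shorter the wavelength, the higher the critical density, so the 0.53 μm beam remains underdense up to correspondingly denser plasma than the CO2 beam.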
In this research, covariance estimates were used to estimate the population mean in stratified random sampling with combined regression estimates. Combined regression estimates employing robust variance-covariance matrix estimates were compared with combined regression estimates employing the traditional variance-covariance matrix estimates when estimating the regression parameter, using two criteria: relative efficiency (RE) and mean squared error (MSE). We found that the robust estimates significantly improved the quality of the combined regression estimates by reducing the effect of outliers through the robust variance-covariance matrix estimates (MCD, MVE) when estimating the regression parameter. In addition, the results of the simulation study proved …
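As a hedged sketch of how a robust covariance estimate can feed the combined regression estimator of the population mean: the synthetic strata, weights, and the pooled within-stratum slope below are simplifying assumptions of this example (textbook combined estimators weight the within-stratum covariances), and MCD is taken from scikit-learn's MinCovDet rather than the study's software.

```python
# Hedged sketch: combined regression estimator of a population mean with the
# regression coefficient taken from a robust (MCD) covariance estimate.
# Strata sizes, weights and data below are synthetic placeholders.
import numpy as np
from sklearn.covariance import MinCovDet

rng = np.random.default_rng(7)
N_h = np.array([600, 400])                 # stratum population sizes
W_h = N_h / N_h.sum()                      # stratum weights
X_bar_pop = 50.0                           # known population mean of auxiliary x

means = ([48.0, 96.0], [53.0, 106.0])      # stratum means of (x, y); weighted x mean = 50
sizes = (60, 40)
samples = [rng.multivariate_normal(m, [[25, 20], [20, 36]], size=n)
           for m, n in zip(means, sizes)]
samples[0][:3] += 40                       # inject a few outliers into stratum 1

x_bar_st = sum(w * s[:, 0].mean() for w, s in zip(W_h, samples))
y_bar_st = sum(w * s[:, 1].mean() for w, s in zip(W_h, samples))

# Pool within-stratum deviations and estimate their covariance robustly (MCD);
# pooling is a simplification of the stratum-weighted combined slope.
centered = np.vstack([s - s.mean(axis=0) for s in samples])
mcd = MinCovDet(random_state=0).fit(centered)
b_robust = mcd.covariance_[0, 1] / mcd.covariance_[0, 0]   # s_xy / s_xx

# Combined regression estimate of the population mean of y.
y_bar_lrc = y_bar_st + b_robust * (X_bar_pop - x_bar_st)
print(f"robust slope = {b_robust:.3f}, combined regression estimate = {y_bar_lrc:.2f}")
```

Swapping MinCovDet for a classical covariance on the same contaminated data illustrates the outlier sensitivity that the RE and MSE comparisons above quantify.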