The evolution of cryptography has been crucial to preserving sensitive information in the digital age. From the early ciphers of ancient societies to modern methods, cryptography has developed alongside advances in computing. The growth of cyber threats and the spread of pervasive digital communication have highlighted the importance of selecting effective and robust cryptographic techniques. This article reviews a range of cryptographic algorithms, covering both symmetric-key and asymmetric-key cryptography, and evaluates them by security strength, complexity, and execution speed. The main findings demonstrate a growing reliance on elliptic curve cryptography, owing to its efficiency and small key sizes, while highlighting the need for research in post-quantum cryptography to address the threats arising from quantum computing. The comparative analysis offers a comprehensive view that combines classical cryptographic algorithms with up-to-date approaches such as chaos-based systems and post-quantum cryptography, ensuring that the study addresses the future of cryptographic security in the face of emerging challenges such as quantum computing.
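As a minimal illustration of the asymmetric-key idea the abstract contrasts with symmetric schemes, the sketch below runs textbook RSA with tiny primes. The primes, exponents, and message are the standard classroom values, not anything from the reviewed paper, and the scheme is deliberately insecure at this size; it only shows the public-encrypt/private-decrypt round trip.

```python
# Toy textbook RSA with tiny primes -- illustrative only, never secure.
p, q = 61, 53
n = p * q                 # public modulus, 3233
phi = (p - 1) * (q - 1)   # 3120
e = 17                    # public exponent, coprime with phi
d = pow(e, -1, phi)       # private exponent: modular inverse of e mod phi

msg = 42
cipher = pow(msg, e, n)   # encrypt with the public key (e, n)
plain = pow(cipher, d, n) # decrypt with the private key (d, n)
assert plain == msg       # the round trip recovers the message
```

The three-argument `pow` with a negative exponent (Python 3.8+) computes the modular inverse directly, which keeps the sketch free of a hand-written extended-Euclid routine.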
This research examines the relationship between relational leadership, as an independent variable, and organizational energy, as a dependent variable. Both are recent and important variables for the development of organizations. To explain the relationship and influence between them, a set of goals was formulated, including providing interested readers with scientific and theoretical information that clarifies the nature of the research variables, and showing the extent to which their effects are reflected in the research sample, so as to increase the interest of the studied organization and make its performance more appropriate in light of a cha…
This research analyzed murder-crime data in Iraq in its temporal and spatial dimensions, then focused on building a new model with an algorithm that combines the characteristics of time series and spatial series, so that the model can predict more accurately than others. We called it the Combined Regression (CR) model: it merges a time-series regression model with a spatial regression model into a single model that can analyze data in both its temporal and spatial dimensions. Several models were used for comparison with the combined model, namely Multiple Linear Regression (MLR), Decision Tree Regression (DTR), and Random Forest Reg…
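The paper's CR model is not specified in the clipped abstract, but the core idea of fitting one regression over both a temporal and a spatial predictor can be sketched with ordinary least squares via the normal equations. The predictor names, the synthetic data, and the two-feature design are assumptions made for illustration, not the paper's actual model.

```python
# Hedged sketch: one regression over a temporal predictor t and a
# spatial predictor s, fit by ordinary least squares (normal equations).

def solve(A, b):
    """Gauss-Jordan elimination for a small square linear system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(n):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * c for a, c in zip(M[r], M[col])]
    return [M[i][n] / M[i][i] for i in range(n)]

def fit_combined(t, s, y):
    """Fit y = b0 + b1*t + b2*s by solving (X'X) b = X'y."""
    X = [[1.0, ti, si] for ti, si in zip(t, s)]
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(3)] for i in range(3)]
    Xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(3)]
    return solve(XtX, Xty)

# Synthetic data generated exactly as y = 1 + 2*t + 3*s
t = [0, 1, 2, 3, 4, 5]
s = [1, 0, 2, 1, 3, 2]
y = [1 + 2 * ti + 3 * si for ti, si in zip(t, s)]
b0, b1, b2 = fit_combined(t, s, y)
```

Because the synthetic data is exactly linear, the fit recovers the generating coefficients, which makes the sketch easy to sanity-check.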
Applications of the Multilevel Converter (MLC) have increased because of the huge demand for clean power; these converters are especially compatible with renewable energy sources and are capable of high-voltage, high-power operation. A nine-level converter in three modes of implementation, the Diode-Clamped MLC (DC-MLC), the Capacitor-Clamped MLC (CC-MLC), and the Modular-Structured MLC (MS-MLC), is analyzed and simulated in this paper. Various Multicarrier Modulation Techniques (MMTs), level-shifted (LS) and phase-shifted (PS), are used for operating the proposed nine-level MLCs. The Matlab/Simulink environment is used for the simulation, extraction, and ana…
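The level-shifted (LS) multicarrier technique the abstract names can be sketched numerically: for a nine-level output, eight triangular carriers are stacked vertically across [-1, 1], and the instantaneous output level is the number of carriers the sinusoidal reference lies above. The carrier and reference frequencies below are illustrative choices, not values from the paper, and the paper's own simulations were done in Matlab/Simulink, not Python.

```python
import math

# Hedged sketch of level-shifted (LS) multicarrier modulation for a
# nine-level converter: eight stacked triangular carriers, output level
# = number of carriers the sine reference exceeds at each instant.

N_CARRIERS = 8  # nine output levels require eight carriers

def carrier(t, band, f_carrier=1000.0):
    """Triangle wave confined to its vertical band within [-1, 1]."""
    lo = -1.0 + band * (2.0 / N_CARRIERS)    # bottom of this band
    phase = (t * f_carrier) % 1.0
    tri = 4.0 * abs(phase - 0.5) - 1.0       # triangle spanning [-1, 1]
    return lo + (tri + 1.0) / N_CARRIERS     # squeezed into the band

def output_level(t, f_ref=50.0):
    """Integer level 0..8 selected by comparing reference to carriers."""
    ref = math.sin(2 * math.pi * f_ref * t)
    return sum(ref > carrier(t, b) for b in range(N_CARRIERS))

# One full 50 Hz cycle sampled at 20 kHz: the output steps through
# multiple discrete levels, approximating the sine in stairsteps.
levels = {output_level(i / 20000.0) for i in range(400)}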
The evolution of the Internet of Things (IoT) has connected billions of heterogeneous physical devices that improve the quality of human life by collecting data from their environment. These devices, however, produce huge volumes of data that require large storage and high computational capability, and cloud computing can be used to store such big data. The data of IoT devices is transferred using two types of protocols: Message Queuing Telemetry Transport (MQTT) and Hypertext Transfer Protocol (HTTP). This paper aims to build a high-performance, more reliable system through efficient use of resources; load balancing in cloud computing is therefore used to distribute the workload dynamically across nodes and avoid overloading any individual r…
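The abstract does not say which balancing policy the paper uses, so the sketch below illustrates two common baselines: round-robin and least-loaded assignment. Node names, task names, and the policies themselves are assumptions for illustration only.

```python
from itertools import cycle
from collections import Counter

# Hedged sketch: two simple load-balancing policies for distributing
# incoming work (e.g. IoT messages) across cloud nodes.

def round_robin(tasks, nodes):
    """Assign tasks cyclically across nodes, ignoring current load."""
    ring = cycle(nodes)
    return {task: next(ring) for task in tasks}

def least_loaded(tasks, nodes):
    """Send each task to the node with the fewest assigned tasks."""
    load = Counter({n: 0 for n in nodes})
    assignment = {}
    for task in tasks:
        node = min(nodes, key=lambda n: load[n])
        assignment[task] = node
        load[node] += 1
    return assignment

tasks = [f"msg{i}" for i in range(6)]
nodes = ["node-a", "node-b", "node-c"]
rr = round_robin(tasks, nodes)
ll = least_loaded(tasks, nodes)
```

With six tasks and three nodes, both policies end up perfectly balanced at two tasks per node; they diverge when tasks have unequal cost or nodes start with unequal load.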
Drought is a natural phenomenon in many arid, semi-arid, and even wet regions, which shows that no region worldwide is excluded from its occurrence. Extreme droughts have been driven by global warming and climate change, so it is essential to review the studies conducted on drought and draw on the recommendations made by their authors. Drought is classified as meteorological, agricultural, hydrological, or socio-economic. In addition, researchers describe the severity of drought using various indices, each requiring different input data; those used include the Joint Deficit Index (JDI), the Effective Drought Index (EDI), the Streamflow Drought Index (SDI), the Sta…
The occurrence of invasive candidiasis has increased over the previous few decades. Candida albicans is one of the most common species causing acquired fungal infections. It is an opportunistic fungal pathogen: the yeast is present in healthy individuals as a lifelong commensal and can reside harmlessly in the human body. In immunocompromised individuals, however, the fungus can invade tissues, producing superficial infections and, in severe cases, life-threatening systemic infections. This review emphasizes the virulence factors of C. albicans, including adhesion, invasion, Candida proteinases, phenotypic switching, and biofilm formation. I…
Most Internet of Things (IoT), cell-phone, and Radio Frequency Identification (RFID) applications need high speed in the execution and processing of data, achieved by reducing system energy consumption, latency, and processing time while raising throughput. This can weaken the security of such devices and leave them open to attack by malicious programs. Lightweight cryptographic algorithms are among the most suitable methods for securing these IoT applications. Cryptography obfuscates data, removes the ability to capture key information patterns, and ensures that all data transfers are safe, accurate, verified, legal, and undeniable. Fortunately, various lightweight encryption algorithms can be used to increase the defense against various at…
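Why lightweight ciphers suit constrained devices can be seen in their round structure: a few adds, rotates, and xors (ARX) per round. The sketch below is a toy ARX round in the style of such ciphers; it is not any real standardized cipher (there is no key schedule and far too few rounds) and must never be used for actual security. It only demonstrates that each round is cheap and exactly invertible.

```python
# Hedged sketch: a toy add-rotate-xor (ARX) round in the style of
# lightweight ciphers. NOT a real cipher -- illustration only.

MASK = 0xFFFFFFFF  # work on 32-bit words

def rol(v, r): return ((v << r) | (v >> (32 - r))) & MASK
def ror(v, r): return ((v >> r) | (v << (32 - r))) & MASK

def round_enc(x, y, k):
    """One forward round: one add, two rotates, two xors."""
    x = ((ror(x, 8) + y) & MASK) ^ k
    y = rol(y, 3) ^ x
    return x, y

def round_dec(x, y, k):
    """Exact inverse of round_enc, step by step in reverse."""
    y = ror(y ^ x, 3)
    x = rol(((x ^ k) - y) & MASK, 8)
    return x, y

def encrypt(block, keys):
    x, y = block
    for k in keys:
        x, y = round_enc(x, y, k)
    return x, y

def decrypt(block, keys):
    x, y = block
    for k in reversed(keys):
        x, y = round_dec(x, y, k)
    return x, y

keys = [0x01234567, 0x89ABCDEF, 0x0F1E2D3C]  # made-up round keys
ct = encrypt((0xDEADBEEF, 0xCAFEBABE), keys)
pt = decrypt(ct, keys)
```

Every operation here maps to a single cheap instruction on a microcontroller, which is precisely the property that makes ARX designs attractive for IoT hardware.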
Malicious software (malware) performs malicious functions that compromise a computer system's security. Many methods have been developed to improve the security of computer system resources, among them firewalls, encryption, and Intrusion Detection Systems (IDS). An IDS can detect a newly unrecognized attack attempt and raise an early alarm to inform the system about the suspicious intrusion. This paper proposes a hybrid IDS for detecting intrusions, especially malware, that considers both network-packet and host features. The hybrid IDS is designed using Data Mining (DM) classification methods for their ability to detect new, previously unseen intrusions accurately and automatically. It uses both anomaly and misuse dete…
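The anomaly-plus-misuse combination the abstract describes can be sketched in a few lines: misuse detection matches known-bad signatures, anomaly detection flags statistical deviation from a baseline. The signatures, features, and z-score threshold below are invented for illustration; the paper's actual detector uses data-mining classifiers, not these rules.

```python
# Hedged sketch: a hybrid IDS check combining misuse detection
# (known-bad signatures) with anomaly detection (deviation from a
# packet-size baseline). All values are illustrative.

SIGNATURES = {"DROP TABLE", "../../etc/passwd", "cmd.exe"}

def misuse_alert(payload):
    """Misuse detection: match the payload against known signatures."""
    return any(sig in payload for sig in SIGNATURES)

def anomaly_alert(packet_size, sizes_seen, z_threshold=3.0):
    """Anomaly detection: flag sizes far from the observed baseline."""
    mean = sum(sizes_seen) / len(sizes_seen)
    var = sum((s - mean) ** 2 for s in sizes_seen) / len(sizes_seen)
    std = var ** 0.5 or 1.0   # avoid dividing by zero on a flat baseline
    return abs(packet_size - mean) / std > z_threshold

def hybrid_ids(payload, packet_size, sizes_seen):
    if misuse_alert(payload):
        return "misuse"
    if anomaly_alert(packet_size, sizes_seen):
        return "anomaly"
    return "ok"

baseline = [500, 510, 495, 505, 498, 502]  # normal packet sizes (bytes)
```

The point of the hybrid design is complementary coverage: signatures catch known attacks with few false positives, while the anomaly path can flag previously unseen behavior.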
The expanding use of multi-processor supercomputers has had a significant impact on the speed and size of many problems. The adoption of the standard Message Passing Interface (MPI) protocol has enabled programmers to write portable and efficient code across a wide variety of parallel architectures. Sorting is one of the most common operations performed by a computer: because sorted data are easier to manipulate than randomly ordered data, many algorithms require sorted input. Sorting is of additional importance to parallel computing because of its close relation to routing data among processes, an essential part of many parallel algorithms. In this paper, sequential sorting algorithms and the parallel implementation of man…
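The divide-sort-merge pattern underlying most parallel sorts can be sketched without MPI: each "process" sorts its local chunk, then the sorted chunks are merged. In a real MPI program the chunk sorts run on separate ranks and the merge involves exchanging data between processes; here the parallelism is only simulated, and `n_procs` is an illustrative parameter, not anything from the paper.

```python
import heapq

# Hedged sketch of the divide-sort-merge pattern behind parallel
# sorting. Each "process" sorts its own chunk (the local sort an MPI
# rank would do), then the sorted chunks are k-way merged.

def parallel_sort(data, n_procs=4):
    chunk = -(-len(data) // n_procs)  # ceiling division
    chunks = [sorted(data[i:i + chunk]) for i in range(0, len(data), chunk)]
    return list(heapq.merge(*chunks))

data = [9, 1, 8, 2, 7, 3, 6, 4, 5, 0]
result = parallel_sort(data)
```

`heapq.merge` consumes the pre-sorted chunks lazily in O(n log k) for k chunks, which mirrors why the final merge/routing step, not the local sorts, dominates communication cost in the distributed setting.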
Summary
In this research, we examined factorial experiments and studied the significance of the main effects, the interactions of the factors, and their simple effects using the F test (ANOVA) to analyze the data of a factorial experiment. The analysis of variance requires several assumptions to hold; when one of these conditions is violated, we transform the data in order to satisfy the conditions of the analysis of variance. It has been noted, however, that such transformations do not always produce accurate results, so we resort to non-parametric tests and methods that serve as a solution or alternative to the parametric tests. These method…
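The F statistic at the center of the abstract can be computed by hand for the one-way case: the ratio of between-group to within-group mean squares. The sketch below uses made-up data; the paper itself concerns full factorial designs and their non-parametric alternatives, which add interaction terms beyond this minimal example.

```python
# Hedged sketch: the one-way ANOVA F statistic computed from scratch.
# F = (SS_between / (k - 1)) / (SS_within / (n - k))

def anova_f(groups):
    k = len(groups)                          # number of groups
    n = sum(len(g) for g in groups)          # total observations
    grand = sum(sum(g) for g in groups) / n  # grand mean
    ss_between = sum(
        len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups
    )
    ss_within = sum(
        sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups
    )
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Illustrative data: three groups with clearly separated means
groups = [[4, 5, 6], [7, 8, 9], [1, 2, 3]]
F = anova_f(groups)
```

With group means 5, 8, and 2 around a grand mean of 5, the between-group variation dwarfs the within-group variation and F comes out large, exactly the situation in which the null hypothesis of equal means would be rejected.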