The article describes a computation method for constructing the number of distinct arcs in a projective plane. In this method, a new approach is employed to compute the number of arcs and the number of distinct arcs, respectively. The approach is based on choosing the number of inequivalent classes of secant distributions, that is, the number of 4-secants, 3-secants, 2-secants, 1-secants, and 0-secants at each step. An arc of maximum size has been constructed by this method. The new method is a tool for dealing with the programming difficulties that can arise from the rapidly increasing number of arcs. It is essential to reduce the number of arcs established at each construction step, especially for large parameter values, which in turn reduces the running time of the calculation and the memory required for the calculation processes. The method's effectiveness is confirmed by the calculation results, in which a complete arc of the largest size is constructed. These results advance the computational strategy for investigating arcs of large size, directing more attention to the number of inequivalent classes of secants of arcs, which is an interesting aspect in its own right. Consequently, the method can be used to establish arcs for large parameter values.
In this paper, the botnet detection problem is formulated as a feature selection problem, and a genetic algorithm (GA) is used to search the entire feature space for the most significant combination of features. Furthermore, a Decision Tree (DT) classifier is used as the objective function, guiding the proposed GA toward combinations of features that correctly classify activities as normal traffic or botnet attacks. Two datasets, the UNSW-NB15 and the Canadian Institute for Cybersecurity Intrusion Detection System 2017 (CICIDS2017), are used as evaluation datasets. The results reveal that the proposed DT-aware GA can effectively find the relevant features from…
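The abstract does not give the GA's encoding or operators, so the following is a minimal sketch of GA-based feature selection: individuals are feature bitmasks, and the fitness function here is a stand-in for the DT classifier's accuracy (the feature count, the set of "informative" features, and the scoring weights are all illustrative assumptions, not the paper's values).

```python
import random

random.seed(0)

N_FEATURES = 12
INFORMATIVE = {0, 3, 7}  # hypothetical "truly relevant" features

def fitness(mask):
    """Stand-in for the DT objective: rewards masks covering the
    informative features, lightly penalises extra features. In the
    paper this would be the Decision Tree's classification accuracy
    on the selected feature subset."""
    hits = sum(mask[i] for i in INFORMATIVE)
    extras = sum(mask) - hits
    return hits - 0.1 * extras

def crossover(a, b):
    cut = random.randrange(1, N_FEATURES)     # single-point crossover
    return a[:cut] + b[cut:]

def mutate(mask, rate=0.05):
    return [bit ^ (random.random() < rate) for bit in mask]

def ga(pop_size=30, generations=60):
    pop = [[random.randint(0, 1) for _ in range(N_FEATURES)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]          # truncation selection
        children = [mutate(crossover(random.choice(elite),
                                     random.choice(elite)))
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    return max(pop, key=fitness)

best = ga()
selected = sorted(i for i, bit in enumerate(best) if bit)
print(selected)
```

Because the elite half is carried over unmutated, the best fitness never decreases across generations; swapping the stub fitness for a real DT accuracy (e.g. via cross-validation on UNSW-NB15) would reproduce the paper's setup.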
In recent years, with the rapid development of classification systems for digital content identification, automatic image classification has become one of the most challenging tasks in computer vision: it is far harder for a system to automatically understand and analyze images than it is for human vision. Some research papers have addressed low-level classification, but their output was restricted to basic image features, and such approaches fail to classify images accurately. To achieve the results expected in this field, this study proposes a deep learning approach.
Background. Body mass index (BMI) is a person's weight in kilograms divided by the square of their height in meters (kg/m²); in imperial units, it is weight in pounds divided by the square of height in inches, multiplied by 703. Obesity affects a wide spectrum of age groups, from the young to the elderly, and several eye diseases are related to obesity, such as diabetic retinopathy, floppy eyelid syndrome, retinal vein occlusion, stroke-related vision loss, age-related macular degeneration and, possibly, refractive errors. Refractive errors (RE) are optical imperfections related to the focusing ability of the eye and are the main cause of visual impairment, which may result in missed education and employment opportunities, lower productivity, and impaired quality of life. Aim. The study aimed to find an association between body…
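The two unit conventions above can be sketched directly (the sample weights and heights are illustrative, not study data):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """BMI in metric units: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

def bmi_imperial(weight_lb: float, height_in: float) -> float:
    """BMI from pounds and inches; 703 converts lb/in^2 to kg/m^2."""
    return 703 * weight_lb / height_in ** 2

print(round(bmi(70, 1.75), 1))          # 70 kg, 1.75 m
print(round(bmi_imperial(154, 69), 1))  # ~154 lb, 69 in (same person)
```

Both calls give essentially the same value, since the 703 factor absorbs the kg→lb and m→in conversions.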
Sensitive and important data have increased rapidly in recent decades with the tremendous growth of networking infrastructure and communications. Securing this data becomes necessary as its volume increases; to satisfy the security goals of integrity, confidentiality, and availability, different cipher techniques and methods are used. This paper presents a proposed hybrid text cryptography method that encrypts sensitive data using several encryption algorithms, such as Caesar, Vigenère, affine, and multiplicative ciphers. This hybrid text cryptography method aims to make the encryption process more secure and effective, and it depends on a circular queue. Using circular…
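The paper's exact key schedule and queue policy are not given in the abstract, so the following is only a minimal sketch of the idea of cycling classical ciphers through a circular queue: each letter is transformed by the cipher at the front of the queue, which then rotates (the keys — shift 3, multiplier 5, affine 7x+3 — are illustrative assumptions, and Vigenère is omitted for brevity).

```python
from collections import deque

M = 26  # alphabet size

def caesar(x, decrypt=False):           # shift by 3
    return (x - 3) % M if decrypt else (x + 3) % M

def multiplicative(x, decrypt=False):   # key 5; 21 is its inverse mod 26
    return (x * 21) % M if decrypt else (x * 5) % M

def affine(x, decrypt=False):           # y = 7x + 3; 15 is the inverse of 7 mod 26
    return (15 * (x - 3)) % M if decrypt else (7 * x + 3) % M

def transform(text, decrypt=False):
    queue = deque([caesar, multiplicative, affine])  # circular queue of ciphers
    out = []
    for ch in text.upper():
        if not ch.isalpha():
            out.append(ch)              # pass punctuation/spaces through
            continue
        cipher = queue[0]
        queue.rotate(-1)                # advance the circular queue
        x = ord(ch) - ord('A')
        out.append(chr(cipher(x, decrypt) + ord('A')))
    return ''.join(out)

ct = transform("HELLO WORLD")
print(ct)
assert transform(ct, decrypt=True) == "HELLO WORLD"
```

Decryption works because the queue advances identically over the same letter positions, so each letter meets the inverse of the cipher that encrypted it; the multiplicative and affine keys must be coprime with 26 for those inverses to exist.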
Currently, with the huge increase in modern communication and network applications, the speed of transformation and the storage of data in compact forms are pressing issues. An enormous number of images are stored and shared every moment, especially on social media; yet even with these marvelous applications, the limited size of transmitted data remains the main restriction. Essentially all of these applications use the well-known Joint Photographic Experts Group (JPEG) standard techniques; at the same time, universally accepted standard compression systems are urgently needed to play a key role in this immense revolution. This review is concerned with different…
Credit card fraud has become an increasing problem due to the growing reliance on electronic payment systems and technological advances that have improved fraud techniques. Numerous financial institutions are looking for the best ways to leverage technological advancements to provide better services to their end users, and researchers have used various protection methods to provide security and privacy for credit cards. It is therefore necessary to identify the challenges and the solutions proposed to address them. This review provides an overview of the most recent research on detecting fraudulent credit card transactions and protecting those transactions from tampering or improper use, which includes imbalanced classes, c…
In this study, experimental work examined the possibility of using waste aluminum as a coagulant to remove colloidal particles from oily wastewater by dissolving this waste in sodium hydroxide solution. The experiments were carried out on simulated oily wastewater prepared at different oil concentrations and hardness levels: (50, 250, 500, and 1000) ppm oil at (2000, 2500, 3000, and 3500) ppm CaCO3, respectively. The initial turbidity values were (203, 290, 770, and 1306) NTU, while the minimum turbidity values obtained from the experiments were (1.67, 1.95, 2.10, and 4.01) NTU at the best sodium aluminate dosages of (12, 20, 24, and 28) milliliters, for…
An excellent reputation earned by initiating and practicing sustainable business has additional benefits, among them fewer environmental incidents and improved operational efficiency, which can help firms raise productivity and lower operating costs. Moreover, with investors and the public alike ever more socially and environmentally conscious, such management of natural resources could have a significant effect on the market value and income of the practicing firms.
The above proposition is supported by the sustainable business practices literature, which continues to converse on and deliberate over the impact of efficient resource d…
In light of accelerating environmental degradation, the transition to a green economy is imperative for achieving sustainable development. This study provides a critical analysis of the international legal and institutional framework governing this transition, revealing a significant gap between normative and institutional developments on the one hand and their practical implementation on the other. The transition faces legal obstacles, including reliance on non-binding voluntary commitments and conflicts between environmental obligations and global trade and investment rules. It also reveals a significant financing gap, as financial flows to developing countries continue to lag behind commitments, in addition…