Information systems now constitute a crucial part of organizations; when security is compromised, these organizations also lose significant competitive advantages. The core of information security (InfoSec) is risk management. A great deal of research and many standards address information security risk management (ISRM), including NIST SP 800-30 and ISO/IEC 27005. However, only a few studies focus on InfoSec risk reduction, while the standards describe general principles and guidelines without providing implementation details for ISRM; as a result, reducing InfoSec risks in uncertain environments is difficult. This paper therefore applies a genetic algorithm (GA) to InfoSec risk reduction under uncertainty. Finally, the effectiveness of the applied method is verified through an example.
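The abstract gives no implementation details, so the following is only a minimal sketch of how a GA could search for a risk-reducing selection of security controls: a binary chromosome picks controls so as to minimise residual risk under a budget. The control names, weights, costs, and fitness function are hypothetical and not taken from the paper.

```python
import random

# Hypothetical data: residual-risk reduction and cost of each candidate control.
RISK_REDUCTION = [0.30, 0.22, 0.15, 0.40, 0.18, 0.25]   # assumed values
COST           = [12, 8, 5, 20, 7, 10]                   # assumed values
BUDGET         = 35
BASE_RISK      = 1.0

def fitness(chromosome):
    """Lower residual risk is better; over-budget selections are penalised."""
    cost = sum(c for c, bit in zip(COST, chromosome) if bit)
    risk = BASE_RISK
    for r, bit in zip(RISK_REDUCTION, chromosome):
        if bit:
            risk *= (1.0 - r)            # each selected control reduces risk multiplicatively
    penalty = max(0, cost - BUDGET) * 0.05
    return risk + penalty

def evolve(pop_size=30, generations=100, p_mut=0.1):
    n = len(COST)
    pop = [[random.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)                      # elitist ranking, best first
        parents = pop[:pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n)           # one-point crossover
            child = a[:cut] + b[cut:]
            child = [bit ^ (random.random() < p_mut) for bit in child]  # bit-flip mutation
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

if __name__ == "__main__":
    best = evolve()
    print("selected controls:", best, "residual risk + penalty:", round(fitness(best), 3))
```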
This paper evaluates the reliability of a steel beam, expressed through the probability of failure and the reliability index. The Monte Carlo Simulation Method (MCSM) and the First Order Reliability Method (FORM) are used for this purpose. These methods require two samples for each behavior under study: the first for resistance (carrying capacity, R) and the second for load effect (Q), which are the parameters of the limit state function. The Monte Carlo method is adopted to generate these samples based on the randomness and uncertainty in the variables. The variables considered are the beam cross-section dimensions, material properties, beam length, yield stress, and applied loads. Matlab software has been used to carry out the computations.
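As a hedged numerical illustration of the two quantities named above (not the paper's actual data), the sketch below draws Monte Carlo samples of a resistance R and a load effect Q for the limit state g = R − Q, estimates the probability of failure as the fraction of samples with g < 0, and computes the corresponding reliability index; the assumed means and standard deviations are placeholders.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Hypothetical statistics (placeholders, not the paper's values).
mu_R, sigma_R = 250.0, 25.0    # resistance (e.g. kN·m)
mu_Q, sigma_Q = 150.0, 30.0    # load effect (e.g. kN·m)
n = 1_000_000

# Monte Carlo sampling of the limit state g = R - Q.
R = rng.normal(mu_R, sigma_R, n)
Q = rng.normal(mu_Q, sigma_Q, n)
g = R - Q

pf_mc = np.mean(g < 0)                 # probability of failure
beta_mc = -norm.ppf(pf_mc)             # reliability index implied by Pf

# Closed-form check for independent normal R and Q (FORM is exact in this case).
beta_form = (mu_R - mu_Q) / np.sqrt(sigma_R**2 + sigma_Q**2)

print(f"Pf (Monte Carlo)   = {pf_mc:.4e}")
print(f"beta (Monte Carlo) = {beta_mc:.3f}, beta (FORM) = {beta_form:.3f}")
```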
Information systems and data exchange between government institutions are growing rapidly around the world, and with them, the threats to information within government departments are growing as well. In recent years, research into the development and construction of secure information systems in government institutions has proven very effective. Based on information system principles, this study proposes a model for providing and evaluating security across all departments of government institutions. The requirements of any information system begin with the organization's environment and objectives. Most prior techniques did not take into account the organizational component on which the information system runs, despite its relevance.
... Show MoreWireless lietworking is· constantly improving, changing and
though ba ic principle is the same. ['nstead of using standard cables to transmit information fmm one point to another (qr more), it .uses radio signals. This paper presents .a case study considedng real-time remote
cqntroJ using Wireless UDP/JP-based networks,. The aim of-this werk is to
reduce real-time· remote control system based upon a simulatio.n model,
which can operate via general communication l"]etworks, whieh on bodies. modern wireles tcchnolqgy.
The first part includes· a brief
Catalytic reduction is considered an effective approach for removing toxic organic pollutants from the environment, but finding an active catalyst remains a significant challenge. Herein, an Ag-decorated CeO2 catalyst was synthesized through the polyol reduction method and applied for the catalytic reduction (conversion) of 4-nitrophenol (4-NP) to 4-aminophenol (4-AP). The Ag-decorated CeO2 catalyst displayed outstanding reduction activity, with 99% conversion of 4-NP in 5 min and a reaction rate constant (k) of 0.61 min−1. A number of structural characterization techniques were carried out to investigate the influence of Ag on CeO2 and its effect on the catalytic conversion of 4-NP. The outstanding catalytic performance of the Ag-CeO2 catalyst can be assigned to the influence of Ag decoration on CeO2.
Background: Asthma is one of the most common chronic respiratory diseases in the world and one of the most frequent causes of hospitalization and emergency visits. Respiratory viruses are its most common trigger. Aim: To assess the role of viral infections, especially COVID-19, in the pathogenesis of asthma initiation and exacerbation. Method: An electronic search was performed for manuscripts focusing on asthma as a risk factor for complications after COVID-19 infection. The recorded outcomes were titles, materials, and methods, and studies were classified as related or not related to the review. Three hundred publications were identified, and only ten studies were selected for analysis. Seven studies were reviews, one was retrospective, and one was longitudinal.
An Optimal Algorithm for HTML Page Building Process
Regression testing being expensive, requires optimization notion. Typically, the optimization of test cases results in selecting a reduced set or subset of test cases or prioritizing the test cases to detect potential faults at an earlier phase. Many former studies revealed the heuristic-dependent mechanism to attain optimality while reducing or prioritizing test cases. Nevertheless, those studies were deprived of systematic procedures to manage tied test cases issue. Moreover, evolutionary algorithms such as the genetic process often help in depleting test cases, together with a concurrent decrease in computational runtime. However, when examining the fault detection capacity along with other parameters, is required, the method falls sh
This paper proposes a novel meta-heuristic optimization algorithm called the fine-tuning meta-heuristic algorithm (FTMA) for solving global optimization problems. In this algorithm, the solutions are fine-tuned using the fundamental steps of meta-heuristic optimization, namely exploration, exploitation, and randomization, in such a way that if one step improves the solution, it is unnecessary to execute the remaining steps. The performance of the proposed FTMA has been compared with that of five other optimization algorithms over ten benchmark test functions. Nine of them are well known and already exist in the literature, while the tenth is proposed by the authors and introduced in this article. One test trial was shown to
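The abstract specifies only the ordering logic (exploration, then exploitation, then randomization, skipping the remaining steps once one of them improves the solution), so the following is a minimal sketch of that control flow on a toy sphere function; the step operators and parameters are assumptions, not the authors' definitions.

```python
import random

def sphere(x):
    """Toy objective: sum of squares, minimum 0 at the origin."""
    return sum(v * v for v in x)

def perturb(x, scale):
    return [v + random.gauss(0.0, scale) for v in x]

def fine_tune(x, f, lower=-5.0, upper=5.0, iters=200):
    """Short-circuit loop: exploration, exploitation, randomization; as soon as
    one step improves the incumbent, the remaining steps of that iteration are
    skipped (the ordering described in the abstract; operators are placeholders)."""
    best, best_val = x[:], f(x)
    for _ in range(iters):
        steps = (
            lambda s: perturb(s, 1.0),                              # exploration: large move
            lambda s: perturb(s, 0.05),                             # exploitation: small local move
            lambda s: [random.uniform(lower, upper) for _ in s],    # randomization: fresh point
        )
        for step in steps:
            candidate = step(best)
            val = f(candidate)
            if val < best_val:
                best, best_val = candidate, val
                break          # improvement found: skip the remaining steps
    return best, best_val

if __name__ == "__main__":
    start = [random.uniform(-5, 5) for _ in range(3)]
    sol, val = fine_tune(start, sphere)
    print("best value:", round(val, 6))
```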
In many video and image processing applications, the frames are partitioned into blocks, which are extracted and processed sequentially. In this paper, we propose a fast algorithm for the calculation of features of overlapping image blocks. We assume the features are projections of the block onto separable 2D basis functions (usually orthogonal polynomials), where we benefit from the symmetry with respect to the spatial variables. The main idea is based on the construction of auxiliary matrices that virtually extend the original image and make it possible to avoid time-consuming computation in loops. These matrices can be pre-calculated, stored, and used repeatedly, since they are independent of the image itself. We validated the proposed algorithm experimentally.
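As a hedged sketch of the kind of block feature the abstract describes (projections onto separable 2D basis functions), the code below projects every overlapping block of an image onto a few discretized Legendre polynomials applied separably along rows and columns. It uses the straightforward per-block loop as a baseline, not the paper's auxiliary-matrix acceleration, whose construction the abstract does not spell out.

```python
import numpy as np

def legendre_basis(block_size, n_funcs):
    """Rows are discretized Legendre polynomials, orthonormalized over the block."""
    t = np.linspace(-1.0, 1.0, block_size)
    basis = np.stack([np.polynomial.legendre.Legendre.basis(k)(t)
                      for k in range(n_funcs)])
    q, _ = np.linalg.qr(basis.T)                 # orthonormalize the sampled polynomials
    return q.T                                   # shape (n_funcs, block_size)

def block_features(image, block_size=8, n_funcs=3):
    """Project each overlapping block B onto the separable basis: F = P @ B @ P.T."""
    P = legendre_basis(block_size, n_funcs)
    h, w = image.shape
    out = np.empty((h - block_size + 1, w - block_size + 1, n_funcs, n_funcs))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            block = image[i:i + block_size, j:j + block_size]
            out[i, j] = P @ block @ P.T          # separable 2D projection
    return out

if __name__ == "__main__":
    img = np.random.rand(32, 32)
    feats = block_features(img)
    print(feats.shape)                           # (25, 25, 3, 3)
```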