Nowadays, information systems constitute a crucial part of organizations; a loss of security therefore costs an organization many of its competitive advantages as well. The core of information security (InfoSec) is risk management. A great deal of research and several standards address information security risk management (ISRM), including NIST SP 800-30 and ISO/IEC 27005. However, only a few studies focus on InfoSec risk reduction, while the standards describe general principles and guidelines without providing implementation details; as a result, reducing InfoSec risks in uncertain environments is painstaking. This paper therefore applies a genetic algorithm (GA) to InfoSec risk reduction under uncertainty. Finally, the effectiveness of the applied method is verified through an example.
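The abstract does not give the GA's encoding or fitness function, so the following is only a minimal sketch of how a GA could search for a budget-constrained set of security controls that minimizes residual risk. The control costs, risk-reduction factors, and budget are hypothetical illustration values, not data from the paper.

```python
import random

# Hypothetical data: each control has a cost and a risk-reduction factor.
COSTS = [4, 3, 6, 2, 5, 1]                       # cost of applying each control
REDUCE = [0.30, 0.20, 0.35, 0.10, 0.25, 0.05]    # fractional risk reduction
BUDGET = 10
BASE_RISK = 1.0

def fitness(chrom):
    """Residual risk of a control subset; over-budget chromosomes are penalized."""
    cost = sum(c for c, g in zip(COSTS, chrom) if g)
    risk = BASE_RISK
    for r, g in zip(REDUCE, chrom):
        if g:
            risk *= (1 - r)                      # multiplicative risk reduction
    return risk + (10.0 if cost > BUDGET else 0.0)

def crossover(a, b):
    p = random.randint(1, len(a) - 1)            # one-point crossover
    return a[:p] + b[p:]

def mutate(chrom, rate=0.1):
    return [1 - g if random.random() < rate else g for g in chrom]

def ga(pop_size=20, generations=50):
    random.seed(0)
    pop = [[random.randint(0, 1) for _ in COSTS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        elite = pop[: pop_size // 2]             # truncation selection
        children = [mutate(crossover(random.choice(elite), random.choice(elite)))
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    return min(pop, key=fitness)

best = ga()
print(best, round(fitness(best), 3))
```

The penalty term keeps infeasible (over-budget) control sets out of the final population without discarding them outright during the search.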
Instruments for measuring radon, thoron, and their decay products in air are based mostly on the detection of alpha particles. The health hazards of radon to the general public are well known. To understand the level and distribution of indoor 222Rn concentrations in Al-Fallujah City, a new technique was used: three radon–thoron mixed-field dosimeters, each made up of a twin-chamber cylindrical system and three LR-115 type II detectors. The aim of this work was to measure indoor radon gas in Al-Fallujah City using the SSNTD technique and to estimate the excess cancer risk due to the increment in radon gas. Results for samples which were collected from January to
The main function of a power system is to supply the customer load demand as economically as possible. The risk criterion is the probability of not meeting the load. This paper presents a methodology for assessing the probabilistic risk criteria of the Al-Qudus plant before and after expansion; the plant presently consists of ten generating units, and the Ministry of Electricity (MOE) intends to add four units to it in order to improve the performance of the Iraqi power system, especially in the Baghdad region. The assessment is calculated by a program written in Matlab (version 7.6). Results show that the planned risk (0.003095) is about 35 times lower than the present plant risk (0.1091), which represents a respectable improvement.
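The risk criterion described here, the probability of available capacity falling short of the load, is conventionally computed from a capacity-outage probability table built by convolving the two-state outage models of the units. The sketch below shows that calculation on a hypothetical fleet of four identical 50 MW units with a forced outage rate of 0.05; the unit sizes and outage rates are illustrative, not the Al-Qudus data.

```python
def add_unit(table, capacity, forced_outage_rate):
    """Convolve one two-state unit into the capacity-outage probability table.
    `table` maps outage capacity (MW) -> probability."""
    new = {}
    for out, p in table.items():
        new[out] = new.get(out, 0.0) + p * (1 - forced_outage_rate)   # unit available
        new[out + capacity] = (new.get(out + capacity, 0.0)
                               + p * forced_outage_rate)              # unit on outage
    return new

def risk(units, load, installed):
    """Loss-of-load probability: P(installed capacity - outage < load)."""
    table = {0: 1.0}
    for cap, q in units:
        table = add_unit(table, cap, q)
    return sum(p for out, p in table.items() if installed - out < load)

# Four identical 50 MW units, forced outage rate 0.05 each, 160 MW load:
# any single-unit outage already causes a shortfall, so the risk is
# 1 - 0.95**4.
print(round(risk([(50, 0.05)] * 4, load=160, installed=200), 6))  # → 0.185494
```

Adding units to the fleet simply means convolving more entries into the table before summing the shortfall probabilities, which is how a before/after-expansion comparison like the one in the abstract is made.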
Ten isolates were collected from different clinical sources from a Medical City laboratory. These isolates belonged to the genus Salmonella according to morphological and biochemical tests. Susceptibility against 10 antibiotics was examined, and 60% of the isolates showed multiple antibiotic resistance: 70% of isolates were resistant to ampicillin, 50% to augmentin, 40% to ceftriaxone, 20% to cefotaxime, and 10% to ciprofloxacin and tetracycline, while all isolates were sensitive to piperacillin, imipenem, amikacin, and erythromycin. The ability of the Salmonella isolates to produce β-lactamase enzymes was tested using
The aim of this paper is to design a PID controller based on an on-line tuning bat optimization algorithm for the step-down DC/DC buck converter system used in the battery operation of mobile applications. The bat optimization algorithm is utilized to obtain the optimal parameters of the PID controller as a simple and fast on-line tuning technique that yields the best control action for the system. Simulation results (using the Matlab package) show the robustness and effectiveness of the proposed control system in obtaining a suitable voltage control action, with a smooth and unsaturated buck converter input voltage of ( ) volt that stabilizes the buck converter system.
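The paper's converter model and cost function are not given in the abstract, so the following is only a sketch of the bat-algorithm/PID coupling it describes: each bat encodes a gain triple (Kp, Ki, Kd), and its fitness is the integral of squared error of the closed loop on a toy first-order plant that stands in for the converter. The plant, gain bounds, and simplified loudness/pulse-rate handling are all assumptions.

```python
import random

LO = [0.0, 0.0, 0.0]      # lower bounds for (Kp, Ki, Kd), hypothetical
HI = [20.0, 10.0, 0.5]    # upper bounds, hypothetical

def ise(gains, steps=200, dt=0.01):
    """Integral of squared error of a PID loop on the toy plant y' = -y + u
    (a stand-in for the converter, not the paper's model)."""
    kp, ki, kd = gains
    y, integ, prev_e, cost = 0.0, 0.0, 1.0, 0.0
    for _ in range(steps):
        e = 1.0 - y                       # unit-step reference
        integ += e * dt
        u = kp * e + ki * integ + kd * (e - prev_e) / dt
        prev_e = e
        y += dt * (-y + u)                # Euler step of the plant
        cost += e * e * dt
    return cost

def clamp(v, d):
    return min(HI[d], max(LO[d], v))

def bat_search(n=15, iters=60):
    random.seed(1)
    bats = [[random.uniform(LO[d], HI[d]) for d in range(3)] for _ in range(n)]
    vel = [[0.0] * 3 for _ in range(n)]
    best = min(bats, key=ise)
    for _ in range(iters):
        for i in range(n):
            f = random.random()           # pulse frequency in [0, 1]
            for d in range(3):
                # standard bat velocity/position update, with damping added
                vel[i][d] = 0.9 * vel[i][d] + (bats[i][d] - best[d]) * f
                bats[i][d] = clamp(bats[i][d] + vel[i][d], d)
            if random.random() > 0.5:     # simplified local walk near the best bat
                cand = [clamp(best[d] + 0.1 * random.gauss(0, 1), d)
                        for d in range(3)]
                if ise(cand) < ise(bats[i]):
                    bats[i] = cand
            if ise(bats[i]) < ise(best):
                best = list(bats[i])
    return best

kp, ki, kd = bat_search()
```

For on-line tuning as described in the abstract, `ise` would be replaced by a cost measured on the running converter rather than on a simulated plant.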
This paper proposes a new lifting scheme (the HYBRID algorithm) capable of protecting images and documents from fraud by decomposing them into their real color value arrays (red, green, and blue) to create retrieval keys for their properties, which are stored in a database. Document originality is then checked by decomposing the query image or document in the same way and comparing its predicted color values (retrieval keys) with those stored in the database. The proposed algorithm was developed by merging the two known lifting schemes Haar and D4 into the HYBRID lifting scheme. The validity and accuracy of the proposed algorithm have been evaluated
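The HYBRID scheme itself is not specified in the abstract, but one of its two ingredients, the Haar lifting scheme, is standard and can be sketched as a split/predict/update pass over one color channel. The sample values below are illustrative; only the Haar half is shown, without the D4 steps the paper merges in.

```python
def haar_lifting_forward(x):
    """One level of the Haar lifting scheme: split, predict, update.
    Assumes len(x) is even."""
    even, odd = x[0::2], x[1::2]                        # split
    detail = [o - e for o, e in zip(odd, even)]         # predict: odd from even
    approx = [e + d / 2 for e, d in zip(even, detail)]  # update: preserve mean
    return approx, detail

def haar_lifting_inverse(approx, detail):
    """Exact inverse: undo the update, then the predict, then interleave."""
    even = [a - d / 2 for a, d in zip(approx, detail)]
    odd = [d + e for d, e in zip(detail, even)]
    out = []
    for e, o in zip(even, odd):
        out.extend([e, o])
    return out

# A row of (hypothetical) red-channel values round-trips exactly.
row = [120, 124, 130, 110, 90, 88, 200, 204]
a, d = haar_lifting_forward(row)
assert haar_lifting_inverse(a, d) == row
```

In a scheme like the one described, the approximation coefficients of each channel would serve as the compact retrieval key stored in the database.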
Identification of complex communities in biological networks is a critical and ongoing challenge, since many network-related problems correspond to the subgraph isomorphism problem, known in the literature to be NP-hard. Several optimization algorithms have been dedicated and applied to this problem. The main challenge in applying optimization algorithms, specifically to large-scale complex networks, is their relatively long execution time. Thus, this paper proposes a parallel extension of the PSO algorithm to detect communities in complex biological networks. The contribution of this study is threefold. First, a modified PSO algorithm with a local search operator is proposed.
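The community-detection fitness (e.g. modularity) and the local search operator are specific to the paper, but the PSO core they extend is standard. The sketch below shows that core loop minimizing a simple benchmark function; for community detection, the fitness and particle encoding would be replaced, and the outer per-particle loop is the part that the paper parallelizes.

```python
import random

def sphere(x):
    """Stand-in fitness; a community-detection variant would use, e.g.,
    negative modularity of the partition a particle encodes."""
    return sum(v * v for v in x)

def pso(fitness, dim=5, n=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    random.seed(2)
    X = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    V = [[0.0] * dim for _ in range(n)]
    pbest = [list(p) for p in X]                 # personal bests
    gbest = list(min(pbest, key=fitness))        # global best
    for _ in range(iters):
        for i in range(n):                       # this loop is what gets parallelized
            for d in range(dim):
                V[i][d] = (w * V[i][d]
                           + c1 * random.random() * (pbest[i][d] - X[i][d])
                           + c2 * random.random() * (gbest[d] - X[i][d]))
                X[i][d] += V[i][d]
            if fitness(X[i]) < fitness(pbest[i]):
                pbest[i] = list(X[i])
                if fitness(pbest[i]) < fitness(gbest):
                    gbest = list(pbest[i])
    return gbest

best = pso(sphere)
print(round(sphere(best), 6))
```

Parallelizing the per-particle loop is straightforward because each particle's update reads only its own state plus the shared global best, which is the property the paper's parallel extension exploits.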
The rapid and enormous growth of the Internet of Things, as well as its widespread adoption, has resulted in the production of massive quantities of data that must be processed and sent to the cloud. The delay in processing these data and the time needed to send them to the cloud have led to the emergence of fog computing, a new generation of the cloud in which the fog serves as an extension of cloud services at the edge of the network, reducing latency and traffic. Distributing computational resources so as to minimize makespan and running costs is one of the challenges of fog computing. This paper provides a new approach for improving the task scheduling problem in a Cloud-Fog environment.
The performance quality and searching speed of a Block Matching (BM) algorithm are affected by the shapes and sizes of the search patterns it uses. In this paper, Kite Cross Hexagonal Search (KCHS) is proposed. This algorithm uses different search patterns (kite, cross, and hexagonal) to search for the best Motion Vector (MV). In the first step, KCHS uses a cross search pattern. In the second step, it uses one of the kite search patterns (up, down, left, or right, depending on the first step). In subsequent steps, it uses large/small Hexagonal Search (HS) patterns. The new algorithm is compared with several known fast block matching algorithms, with comparisons based on search points and Peak Signal to Noise Ratio (PSNR). According to the results,
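The KCHS pattern sequence is specific to the paper, but its first step, evaluating a cross pattern of candidate displacements by sum of absolute differences (SAD), can be sketched directly. The frames, block position, and search radius below are hypothetical test values.

```python
def sad(cur, ref, bx, by, dx, dy, bs=8):
    """Sum of absolute differences between the current block at (bx, by)
    and the reference block displaced by the candidate vector (dx, dy)."""
    h, w = len(ref), len(ref[0])
    total = 0
    for y in range(bs):
        for x in range(bs):
            ry, rx = by + y + dy, bx + x + dx
            if not (0 <= ry < h and 0 <= rx < w):
                return float("inf")      # candidate falls outside the frame
            total += abs(cur[by + y][bx + x] - ref[ry][rx])
    return total

def cross_step(cur, ref, bx, by, cx, cy, radius=2):
    """One cross-pattern step: evaluate the centre and the four axial
    points around (cx, cy) and return the best motion vector found."""
    candidates = [(cx, cy), (cx + radius, cy), (cx - radius, cy),
                  (cx, cy + radius), (cx, cy - radius)]
    return min(candidates, key=lambda mv: sad(cur, ref, bx, by, mv[0], mv[1]))

# Synthetic 24x24 frames: the current frame is the reference shifted left
# by 2 px, so the true motion vector of any interior block is (2, 0).
ref = [[(x * 7 + y * 13) % 256 for x in range(24)] for y in range(24)]
cur = [[ref[y][x + 2] if x + 2 < 24 else 0 for x in range(24)] for y in range(24)]
print(cross_step(cur, ref, 4, 4, 0, 0))  # → (2, 0)
```

In the full algorithm, the winner of this step selects which kite pattern to apply next, and the search then refines the vector with the hexagonal patterns.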
Spraying pesticides is one of the most common procedures conducted to control pests. However, excessive use of these chemicals adversely affects the surrounding environment, including the soil, plants, animals, and the operator. Therefore, researchers have been encouraged to...
String matching is seen as one of the essential problems in computer science, and a variety of computer applications provide a string matching service to their end users. The remarkable growth in the amount of data created and kept by modern computational devices drives researchers to seek ever more powerful methods for coping with this problem. In this research, the Quick Search string matching algorithm is adopted and implemented in a multi-core environment using OpenMP directives, which can be employed to reduce the overall execution time of the program. English text, protein, and DNA data types are utilized to examine the effect of parallelization and implementation of the Quick Search string matching algorithm on multi-core systems.
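For reference, the sequential Quick Search kernel (Sunday's algorithm) that each OpenMP thread would run on its chunk of the text can be sketched as follows; the shift table is indexed by the text character just past the current window, which is what distinguishes Quick Search from Boyer-Moore-style matchers.

```python
def quick_search(pattern, text):
    """Sunday's Quick Search: on every attempt, shift by the bad-character
    distance of the text character just past the current window."""
    m, n = len(pattern), len(text)
    if m == 0 or m > n:
        return []
    # Shift table: distance from the last occurrence of each pattern
    # character to one position past the end of the pattern.
    shift = {c: m - i for i, c in enumerate(pattern)}
    matches, pos = [], 0
    while pos <= n - m:
        if text[pos:pos + m] == pattern:
            matches.append(pos)
        if pos + m >= n:
            break
        pos += shift.get(text[pos + m], m + 1)
    return matches

print(quick_search("GATA", "CAGATAAGATA"))  # → [2, 7]
```

A simple parallel decomposition, in the spirit of the paper, splits the text into chunks that overlap by `m - 1` characters so that matches straddling a chunk boundary are not lost, then runs this kernel on each chunk independently.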